Scale AI, valued at $13.8 billion last year, has been hit with its third lawsuit over alleged labor practices in just over a month. This time, workers claim they suffered emotional trauma from reviewing disturbing content without adequate safeguards.
Scale relies on workers it classifies as contractors to do tasks like rating AI model responses. Just this month, a former worker sued the company, alleging she was effectively paid below the minimum wage and misclassified as a contractor.
The latest complaint, a class action filed in the Northern District of California, centers on the psychological harm allegedly suffered by six people who worked on Scale's Outlier platform. The plaintiffs say they were required to write disturbing prompts about violence and abuse without adequate psychological support. They claim they were misled about the job's nature during recruiting and developed mental health issues, including PTSD, as a result of their work.
Steve McKinney, one of the plaintiffs, is also the lead plaintiff in a separate December 2024 complaint against Scale; California's Clarkson Law Firm represents the plaintiffs in both cases. Clarkson Law Firm has previously brought suits against tech giants including OpenAI and Microsoft that fell short, drawing criticism for lengthy filings and irrelevant allegations.
Scale, for its part, says it will vigorously defend its operations and dismissed Clarkson's suits as part of a pattern of legal claims against innovative tech companies. Joe Osborne, a Scale AI spokesperson, said the company complies with all laws and has numerous safeguards in place to protect contributors, including the ability to opt out of tasks at any time, advance notice of sensitive content, and access to wellness programs.
In response, Glenn Danas, a partner at Clarkson Law Firm, said Scale AI should be held accountable, alleging the company failed to ensure a safe working environment and forced workers to view violent content to train AI models.
Original source: Read the full article on TechCrunch