It’s easy to fear that the machines are taking over: companies like IBM and the British telecom BT have cited AI as a reason to cut jobs, and new tools like ChatGPT and DALL-E are making it possible for anyone to see the extraordinary capabilities of AI for themselves. One recent study by researchers at OpenAI (the startup behind ChatGPT) and the University of Pennsylvania concluded that for about 80 percent of jobs, at least 10 percent of tasks could be automated using the technology behind these tools.
Companies’ first instinct is often to use AI simply to automate existing work, said Erik Brynjolfsson, a professor at the Stanford Institute for Human-Centered Artificial Intelligence.
But he said that is not the only option. “The other thing I wish people would do more is think about new things that can be done now that haven’t been done before. Obviously, that’s a much harder question,” he said. “It’s where most of the value is.”
How technology makers design AI tools, how business leaders use them, and how policymakers regulate them will determine how generative AI affects jobs, Brynjolfsson and other economists say. And not all of the options are bleak for workers.
AI can complement human work rather than replace it. Many companies are using AI to automate call centers, for example. But one Fortune 500 business-software provider instead used a ChatGPT-like tool to give its workers live suggestions on how to respond to customers. In a study, Brynjolfsson and his colleagues compared call-center employees who used the tool with those who did not. They found that the tool boosted productivity by 14 percent on average, with the biggest gains going to the least-skilled workers. Customer sentiment was also higher, and employee turnover lower, in the group that used the tool.
David Autor, a professor of economics at the Massachusetts Institute of Technology, said AI could be used to deliver expertise on demand in jobs such as health care, software development, law, and skilled repair. “That provides an opportunity to enable more workers to do valuable work that relies on some of that expertise,” he said.
Workers can focus on different tasks. As ATMs automated the tasks of dispensing cash and taking deposits, the number of bank tellers actually increased, according to an analysis by James Bessen, a researcher at Boston University School of Law. That was partly because bank branches, needing fewer workers, became cheaper to open, and banks opened more of them. But banks also changed the job itself. After ATMs, tellers focused less on counting cash and more on building relationships with customers, to whom they sold products like credit cards. Few jobs can be fully automated by generative AI, but using an AI tool for some tasks may free workers up to expand their work on tasks that cannot be automated.
New technology can lead to new jobs. Agriculture employed about 42 percent of the American workforce in 1900, but because of automation and advances in technology, it accounted for only 2 percent by 2000. The huge decline in farm jobs did not result in widespread unemployment. Instead, technology created many new kinds of work. No farmer in the early 20th century could have imagined computer coding, genetic engineering, or trucking. In an analysis using census data, Autor and his colleagues found that about 60 percent of current occupations did not exist 80 years ago.
Of course, there is no guarantee that workers will be qualified for the new jobs, or that the new jobs will be good ones. None of this happens on its own, said Daron Acemoglu, a professor of economics at MIT and a co-author of “Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity.”
“If we make the right choices, we create new types of jobs, which is critical for wage growth and also for really reaping the productivity benefits,” Acemoglu said. “But if we don’t make the right choices, much less of this will happen.”
Sarah Kessler
In case you missed it
Martha’s model behavior. The entrepreneur Martha Stewart became the oldest person to appear on the cover of Sports Illustrated’s swimsuit issue this week. Stewart, 81, told The Times it was “a huge challenge” to have the confidence to pull it off, but two months of Pilates helped. She’s not the first cover model over 60: Maye Musk, Elon Musk’s mother, graced the cover last year at age 74.
TikTok block. Montana has become the first state to ban the Chinese-owned short-form video app, barring app stores from offering TikTok within its borders as of January 1. The ban is expected to be difficult to enforce, and TikTok users in the state have sued the government, saying the measure violates their First Amendment rights. The fight offers a glimpse of the potential backlash if the federal government tries to ban TikTok nationwide.
Bank blame game. Greg Becker, the former CEO of Silicon Valley Bank, blamed “rumors and misconceptions” for the run on deposits in his first public comments since the bank’s collapse in March. Becker and former top executives of the failed Signature Bank also told a Senate committee investigating their roles in the banks’ collapses that they would not return millions of dollars in pay.
A Brief History of Tech CEOs Asking for Rules
When OpenAI CEO Sam Altman testified in Congress this week and called for regulation of generative AI, some lawmakers hailed it as a “historic” move. In fact, asking lawmakers for new rules is a move straight out of the tech industry’s playbook. Silicon Valley’s most powerful CEOs have long gone to Washington to demonstrate their commitment to rules in an effort to shape them, all while releasing some of the world’s most powerful and transformative technologies without pause.
One reason: a single federal rule is much easier to manage than a patchwork of different regulations across states, Bruce Mehlman, a political consultant and former technology policy official in the Bush administration, told DealBook. Clear regulations also give investors more confidence in a sector, he added.
The strategy sounds logical, but if history is any guide, the reality may be messier than the rhetoric:
In December 2021, Sam Bankman-Fried, the founder of the failed cryptocurrency exchange FTX, was one of six executives who testified about digital assets before the House of Representatives and called for regulatory clarity. He told lawmakers that his company had just submitted a proposal for a “unified joint system.” A year later, Bankman-Fried’s business had gone bankrupt, and he was facing charges of criminal fraud and illegal campaign contributions.
In 2019, the Facebook founder Mark Zuckerberg wrote an op-ed for The Washington Post titled “The Internet Needs New Rules,” citing failures in the company’s handling of content moderation, election integrity, privacy, and data management. Two years later, independent researchers found that misinformation was more prevalent on the platform than it had been in 2016, even though the company had spent billions trying to crack down on it.
In 2018, Apple’s chief executive, Tim Cook, said he generally hated regulation but supported stricter data privacy rules, saying, “It’s time for a group of people to think about what can be done.” But to maintain its business in China, one of its largest markets, Apple has largely ceded control of customer data to the government as part of the requirements for operating there.
Buzzword of the week: “Algospeak”
Platforms like TikTok, Facebook, Instagram, and Twitter use algorithms to identify and moderate problematic content. To evade these digital gatekeepers and speak freely about taboo topics, users have developed a coded vocabulary. It’s called “algospeak.”
“There’s a linguistic arms race raging on the Internet, and it’s not clear who’s winning,” wrote Roger J. Kreuz, a professor of psychology at the University of Memphis. Posts about sensitive topics such as politics, sex, or suicide can be flagged and removed by algorithms, prompting creative misspellings and substitutions, such as “seggs” and “mascara” for sex and “unalive” for death. There is a long history of responding to taboos with coded language, Kreuz notes, such as Cockney rhyming slang in 19th-century England or Aesopian, an allegorical language used to circumvent censorship in Tsarist Russia.
Algorithms aren’t the only ones that fail to catch the code. Euphemisms and misspellings are especially widespread among marginalized communities. But the hidden language sometimes eludes humans, too, leading to misunderstandings online. In February, the star Julia Fox found herself in an awkward exchange with a sexual assault victim after misreading a post about “mascara,” and she had to issue a public apology for responding inappropriately to what she thought was a discussion about makeup.
Thanks for reading!
We’d like your feedback! Please email ideas and suggestions to [email protected].