
How to maintain moral authority over AI technology

By prosperplanetpulse.com | July 12, 2024 | 7 min read


Madhumita Murgia is an award-winning Indian-British journalist and commentator who writes about the impact of technology and science on society. Based in London, she leads the Financial Times’ global coverage of artificial intelligence and other emerging technologies. For the past decade she has travelled the globe writing about the people, startups and companies shaping cutting-edge technology for publications including Wired, The Washington Post, Newsweek and The Telegraph. Murgia co-hosts the podcast series Tech Tonic.

Below, Madhumita shares five key insights from her new book, Code Dependent: Living in the Shadow of AI. You can listen to the audio version, read by Madhumita herself, on the Next Big Idea app.

Code Dependent by Madhumita Murgia (Next Big Idea Club)

https://cdn.nextbigideaclub.com/wp-content/uploads/2024/07/12120341/BB_Madhumita-Murgia_MIX.mp3?_=1

1. Human stories matter.

In a decade of reporting on AI, I found very few stories about real people affected by the technology or the consequences of their interactions with it, and those I did read came almost entirely from outside the Western world.

I’ve spent much of my career as a journalist chronicling the fortunes of giants like Google and Meta, companies that refine the vast amounts of data flowing into their platforms, data generated by billions of people around the world, and turn it into products. Story after story, I’ve come to understand how a handful of Silicon Valley engineers have had an outsized impact on billions of people everywhere, shaping everything from personal relationships to the idea of democracy. Code Dependent is not about the creators of these technologies but about ordinary people whose daily lives have been transformed by AI software: gig workers, doctors, poets, parents, activists and refugees, with stories from nine countries, from the US and UK to China, India, Kenya and Argentina.

Code Dependent is my attempt to give readers a tangible glimpse into the invisible world of human-AI encounters, and I hope it will deepen their understanding of the reality of AI and open up important conversations about how we will coexist with this powerful technology.

2. Put the spotlight on people in the shadows.

When I began researching this book, I was unsure whose stories to choose. Broadly, I was looking for nuanced encounters between people and AI around the world. But as I continued my research, it became clear that many of the stories I uncovered belonged to people on the margins, in places like Kenya and Bulgaria: people of color, the poor, women, immigrants, refugees, and religious minorities. I spent time with the community of data labelers, a hidden labor force in the Global South that has helped train and shape the AI systems we interact with through Meta, OpenAI, Walmart, and Tesla.

“These stories show what happens when societal biases are woven into AI software.”

In Amsterdam, Diana Geo, an immigrant single mother, found her sons on a list drawn up by an algorithm predicting that they would commit serious crimes in the future. Those who contribute to AI behind the scenes and those who are negatively affected by it are often marginalized, while the benefits of the technology are enjoyed largely by the privileged majority. These stories show what happens when societal biases are woven into AI software. Technology that devalues human agency and is deployed without input from the communities it affects has major consequences.

3. Technosolutionism (the idea that technology can solve everything) is a myth.

Technology entrepreneurs see AI as a solution to society’s biggest challenges, including disease, climate change and poverty, but this thinking often hits a dead end.

For example, in Salta, in northern Argentina, city officials partnered with Microsoft to create a predictive algorithm that estimated (with 86% accuracy) which teenage girls in a low-income community on the city’s outskirts would become pregnant. What the system ultimately showed was that one in three households was at risk of teen pregnancy, pointing to the need for community-wide interventions such as greater investment in social support and employment opportunities. Establishing that did not require intrusively and repeatedly collecting data from the girls who live there.

4. The importance of human agency.

One of the most heartwarming stories I found came from speaking with Dr. Ashita Singh, a physician who works with the indigenous Bhil people in rural India, providing medical care to those who need it most. Dr. Singh was also helping to test and develop an AI system that can diagnose tuberculosis from an x-ray image alone.

Dr. Singh taught us about the importance of preserving human agency and expertise as we automate the most human parts of society, such as criminal justice, education, and healthcare. She showed us that the work of the best physicians should be enhanced, not replaced. While AI technologies can be used to fill in the gaps, the human aspect of care — being able to listen, empathize, and speak to patients — cannot be replaced by technology.

“She showed us that the work of our best physicians should be enhanced, not replaced.”

Her work also highlights the importance of accountability: the question of who is ultimately responsible. When an AI system malfunctions and makes the kinds of errors any statistical system will eventually make, it is important to identify who is answerable for the harm those errors cause.

AI can also weaken human agency. As I researched stories about deepfakes, and deepfake pornography in particular, I heard from Helen Mort, a poet in the north of England, and Noelle Martin, a student in Sydney, Australia, about the profound impact deepfake porn has had on their lives. The lack of any means to right those harms left them feeling not just vulnerable but powerless. Yet these stories also show how human agency can be strengthened, especially collectively. In gig work, for example, unions and activists are banding together to fight faceless AI systems.

5. Data colonialism and global participation.

In a new era of data colonialism, a handful of powerful players are able to extract and exploit data, drawn not only from people in the Global South and other poorer parts of the world but also from people in Western countries.

For example, generative AI systems have been built by scraping the text, music, images, and art that creators have made. The concept of “data colonialism,” proposed by the scholar Nick Couldry, shows how, in the age of AI, ever more data will be extracted from people all over the world, concentrating power, and it serves as a warning about how we should confront these issues.

The biggest lesson I learned is that the more diverse voices we include (including people from non-Western countries, religious minorities, immigrants, rural and urban residents), the more inclusive and positive the outcomes of AI systems can be. My quest to explore how AI has transformed people’s lives has also transformed me. The most inspiring parts of the stories I discovered were not the sophisticated algorithms and their outputs, but the humans using and adapting to the technology: the doctors, scientists, gig workers, and creators who represent the best of us. I am optimistic about the potential value of AI, but I believe that no matter how great a tool it is, it will only be useful if human dignity is preserved. My ultimate hope for AI is not that it will create a new, upgraded species, but that it will help ordinary, flawed humans live their best and happiest lives.

To listen to the audio version by author Madhumita Murgia, download the Next Big Idea app now.
