Saturday, July 29, 2023

Current LLMs can be tricked in ways that would never trick a real human

Monday, July 24, 2023

Open Source LLMs

Friday, July 21, 2023

What kind of smart device passes AP exams but can't play a decent game of tic-tac-toe?

Thursday, July 20, 2023

Strangler Fig Tree

Monday, July 17, 2023

A speculative look at the current AI business ecology

Who owns the gates to culture and knowledge?

Julia Angwin, The Gatekeepers of Knowledge Don’t Want Us to See What They Know, NYTimes, July 14, 2023. From the article:

We are living through an information revolution. The traditional gatekeepers of knowledge — librarians, journalists and government officials — have largely been replaced by technological gatekeepers — search engines, artificial intelligence chatbots and social media feeds.

Whatever their flaws, the old gatekeepers were, at least on paper, beholden to the public. The new gatekeepers are fundamentally beholden only to profit and to their shareholders.

That is about to change, thanks to a bold experiment by the European Union.

With key provisions going into effect on Aug. 25, an ambitious package of E.U. rules, the Digital Services Act and Digital Markets Act, is the most extensive effort toward checking the power of Big Tech (beyond the outright bans in places like China and India). For the first time, tech platforms will have to be responsive to the public in myriad ways, including giving users the right to appeal when their content is removed, providing a choice of algorithms and banning the microtargeting of children and of adults based upon sensitive data such as religion, ethnicity and sexual orientation. The reforms also require large tech platforms to audit their algorithms to determine how they affect democracy, human rights and the physical and mental health of minors and other users.

This will be the first time that companies will be required to identify and address the harms that their platforms enable. To hold them accountable, the law also requires large tech platforms like Facebook and Twitter to provide researchers with access to real-time data from their platforms. But there is a crucial element that has yet to be decided by the European Union: whether journalists will get access to any of that data.

Do AI companies have the right to harvest anything anyone uploads to the web and use it to train their engines? [And what about invention?]

Sheera Frenkel and Stuart A. Thompson, ‘Not for Machines to Harvest’: Data Revolts Break Out Against A.I., NYTimes, July 15, 2023. From the article:

Ms. Loffstadt also helped organize an act of rebellion last month against A.I. systems. Along with dozens of other fan fiction writers, she published a flood of irreverent stories online to overwhelm and confuse the data-collection services that feed writers’ work into A.I. technology.

“We each have to do whatever we can to show them the output of our creativity is not for machines to harvest as they like,” said Ms. Loffstadt, a 42-year-old voice actor from South Yorkshire in Britain.

Fan fiction writers are just one group now staging revolts against A.I. systems as a fever over the technology has gripped Silicon Valley and the world. In recent months, social media companies such as Reddit and Twitter, news organizations including The New York Times and NBC News, authors such as Paul Tremblay and the actress Sarah Silverman have all taken a position against A.I. sucking up their data without permission.

Their protests have taken different forms. Writers and artists are locking their files to protect their work or are boycotting certain websites that publish A.I.-generated content, while companies like Reddit want to charge for access to their data. At least 10 lawsuits have been filed this year against A.I. companies, accusing them of training their systems on artists’ creative work without consent. This past week, Ms. Silverman and the authors Christopher Golden and Richard Kadrey sued OpenAI, the maker of ChatGPT, and others over A.I.’s use of their work. [...]

“The data rebellion that we’re seeing across the country is society’s way of pushing back against this idea that Big Tech is simply entitled to take any and all information from any source whatsoever, and make it their own,” said Ryan Clarkson, the founder of Clarkson Law Firm.

Eric Goldman, a professor at Santa Clara University School of Law, said the lawsuit’s arguments were expansive and unlikely to be accepted by the court. But the wave of litigation is just beginning, he said, with a “second and third wave” coming that would define A.I.’s future.

What about A.I. invention? See Steve Lohr, Can A.I. Invent? NYTimes, July 15, 2023.

Breathing in waves: Understanding respiratory-brain coupling as a gradient of predictive oscillations

Note that breathing involves both top-down (voluntary) and bottom-up (involuntary) control.

From the linked article:

Highlights

  • Respiration fundamentally influences neural oscillations in animals and humans.
  • Neuropsychiatric disorders are characterised by specific oscillatory profiles.
  • Here, respiratory and neural aberrations are integrated to explain psychopathology.
  • We propose a gradient model of respiratory-modulated prediction errors.

Abstract

Breathing plays a crucial role in shaping perceptual and cognitive processes by regulating the strength and synchronisation of neural oscillations. Numerous studies have demonstrated that respiratory rhythms govern a wide range of behavioural effects across cognitive, affective, and perceptual domains. Additionally, respiratory-modulated brain oscillations have been observed in various mammalian models and across diverse frequency spectra. However, a comprehensive framework to elucidate these disparate phenomena remains elusive. In this review, we synthesise existing findings to propose a neural gradient of respiratory-modulated brain oscillations and examine recent computational models of neural oscillations to map this gradient onto a hierarchical cascade of precision-weighted prediction errors. By deciphering the computational mechanisms underlying respiratory control of these processes, we can potentially uncover new pathways for understanding the link between respiratory-brain coupling and psychiatric disorders.
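
The review's framing of a "hierarchical cascade of precision-weighted prediction errors" is easier to parse with a toy update rule. The sketch below is not from the paper; it assumes a simple delta-rule belief update in which a sinusoidal "respiratory" term modulates the precision that weights each prediction error, and every name and number in it is illustrative.

```python
import numpy as np

def precision_weighted_update(mu, observation, precision, learning_rate=0.1):
    """One precision-weighted prediction-error step: the belief mu moves
    toward the observation in proportion to the precision (inverse variance)
    assigned to the error."""
    prediction_error = observation - mu
    return mu + learning_rate * precision * prediction_error

# Toy simulation (illustrative only): a slow sinusoid stands in for the
# respiratory cycle and modulates precision; a hidden cause switches halfway.
rng = np.random.default_rng(0)
t = np.arange(200)
respiratory_phase = np.sin(2 * np.pi * t / 50)    # hypothetical slow rhythm
precision = 1.0 + 0.5 * respiratory_phase         # stronger weighting at one phase
hidden_cause = np.where(t < 100, 0.0, 1.0)
observations = hidden_cause + rng.normal(0.0, 0.3, size=t.size)

mu, estimates = 0.0, []
for obs, pi in zip(observations, precision):
    mu = precision_weighted_update(mu, obs, pi)
    estimates.append(mu)
# The belief tracks the switch faster when precision is high, which is the
# sense in which a respiratory rhythm could gate how strongly errors update
# beliefs at each level of the hierarchy.
```

In the paper's gradient model this weighting would vary across hierarchical levels and frequency bands; the single scalar here is only meant to make the "precision-weighted" part concrete.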

The neural basis of color perception

Friday, July 7, 2023

Rethinking Covid: A COVID Dissenter Speaks Out

Glenn Loury & Jay Bhattacharya | The Glenn Show

0:00 The problem with scientific consensus
6:36 Why Jay and his colleagues were branded “fringe epidemiologists”
15:52 Jay: We need to engage with everyone—even those with mistaken beliefs
25:55 Persuading science skeptics
36:04 How do we stop COVID overreach from happening again?
46:38 Jay: Gain-of-function research is impossible to do safely
55:03 Are some ideas too dangerous to test?
59:30 Jay: Fauci’s blunder was so catastrophic that only history can judge him

Glenn Loury and Jay Bhattacharya (Stanford, The Illusion of Consensus). Recorded June 23, 2023.

Monday, July 3, 2023

Sunday, July 2, 2023

Insect consciousness

Fruit-fly connectome