26 Apr 2026

feedHacker News

At least 10 people tied to sensitive US research have died or disappeared

Comments

26 Apr 2026 12:55pm GMT

feedSlashdot

Privacy Advocate Accuses US Government of Investing in AI-Powered Mass Surveillance

The Conversation published this warning from privacy/tech law/electronic surveillance attorney Anne Toomey McKenna (also an affiliated faculty member at Penn State's Institute for Computational and Data Sciences). The U.S. government "is able to purchase Americans' sensitive data because the information it buys is not subject to the same restrictions as information it collects directly. The federal government is also ramping up its abilities to directly collect data through partnerships with private tech companies. These surveillance tech partnerships are becoming entrenched, domestically and abroad, as advances in AI take surveillance to unprecedented levels... "

Congressional funding is supercharging huge government investments in surveillance tech and AI-driven data analytics, which automate the analysis of very large amounts of data. The massive 2025 tax-and-spending law netted the Department of Homeland Security an unprecedented US$165 billion in yearly funding. Immigration and Customs Enforcement, part of DHS, got about $86 billion. Disclosure of documents allegedly hacked from Homeland Security reveals a massive surveillance web that has all Americans in its scope. DHS is expanding its AI surveillance capabilities with a surge in contracts to private companies. It is reportedly funding companies that provide more AI-automated surveillance in airports; adapters to convert agents' phones into biometric scanners; and an AI platform that acquires all 911 call center data to build geospatial heat maps to predict incident trends. Predicting incident trends can be a form of predictive policing, which uses data to anticipate where, when and how crime may occur... Meanwhile, the Trump administration's national policy framework for artificial intelligence, released on March 20, 2026, urges Congress to use grants and tax incentives to fund "wider deployment of AI tools across American industry" and to allow industry and academia to use federal datasets to train AI. 
Using federal datasets this way raises privacy law concerns because they contain a lifetime of sensitive details about you, including biographical, employment and tax information.... The author argues that it's now critical for Americans to know "why the laws you might think are protecting your data do not apply or are ignored." On March 18, 2026, FBI Director Kash Patel confirmed to Congress that the FBI is buying Americans' data from data brokers, including location histories, to track American citizens.... But in buying your data in bulk on the commercial market, the government is circumventing the Constitution, Supreme Court decisions and federal laws designed to protect your privacy from unwarranted government overreach... Supreme Court cases require police to get a warrant to search a phone or use cellular or GPS location information to track someone. The Electronic Communications Privacy Act's Wiretap Act prohibits unauthorized interception of wire, oral and electronic communications. Despite some efforts, Congress has failed to enact legislation to protect data privacy, to regulate the use of sensitive data by AI systems, or to restore the intent of the Electronic Communications Privacy Act. Courts have allowed the broad electronic privacy protections in the federal Wiretap Act to be eviscerated by companies claiming consent. In my opinion, the way to begin to address these problems is to restore the Wiretap Act and related laws to their intended purposes of protecting Americans' privacy in communications, and for Congress to follow through on its promises and efforts by passing legislation that secures Americans' data privacy and protects them from AI harms. Thanks to long-time Slashdot reader sinij for sharing the article.

Read more of this story at Slashdot.

26 Apr 2026 11:34am GMT

feedHacker News

Asahi Linux Progress: Linux 7.0

Comments

26 Apr 2026 10:50am GMT

Statecharts: hierarchical state machines

Comments

26 Apr 2026 9:32am GMT

feedSlashdot

40 Years After the Chernobyl Disaster, More Countries Are Turning To Nuclear Power

An anonymous reader shared this report from the Associated Press: The 1986 Chernobyl disaster fueled global fears about nuclear power and slowed its development in Europe and elsewhere. Four decades later, however, there's a revival around the world, a trend that has been given a big boost by war in the Middle East. Over 400 nuclear reactors are operational in 31 countries, while about 70 more are under construction. Nuclear power produces about 10% of the world's electricity, equivalent to about a quarter of all low-carbon power. Nuclear reactors have seen steady improvements, adding more safety features and becoming cheaper to build and operate. While Chernobyl and the 2011 Fukushima nuclear disaster in Japan diminished the appetite for such power sources, it was clear years ago that there probably would be a revival, said Fatih Birol, executive director of the International Energy Agency. With the war in the Middle East, "I am 100% sure nuclear is coming back," he added... The United States is the world's largest producer of nuclear power, with 94 operational reactors accounting for about 30% of global generation of nuclear electricity. And it is increasing efforts to develop nuclear energy capacity with a goal to quadruple it by 2050... China operates 61 nuclear reactors and is leading the world in building new units, with nearly 40 under construction and a goal to surpass the U.S. and become the global leader in nuclear capacity. European Commission chief Ursula von der Leyen has acknowledged that it was Europe's "strategic mistake" to cut nuclear energy and outlined new initiatives to encourage building power plants. [In 1990, nuclear energy accounted for roughly a third of Europe's electricity, the article points out, but it's now only about 15%.] Russia, meanwhile, has taken a strong lead in exporting its nuclear know-how, building 20 reactors worldwide... 
Japan has restarted 15 reactors after reviewing the lessons of the earthquake and tsunami that damaged the Fukushima plant, and 10 more are in the process of getting approval to restart. South Africa has the only nuclear power plant on the African continent, although Russia is building one in Egypt, and several other African nations are exploring the technology... With 57 reactors at 19 plants, France relies on nuclear power for nearly 70% of its electricity. The article includes an interactive graphic that shows the growth in the world's nuclear capacity slowing down soon after the 1986 Chernobyl meltdown, with that capacity broken down by country. Even so, capacity has still increased by roughly 50% since then. Even Ukraine - the site of the accident - now "still relies heavily on nuclear plants to generate about half of its electricity," the article points out. But Germany "switched off its last three nuclear reactors in 2023."

Read more of this story at Slashdot.

26 Apr 2026 7:34am GMT

Is AI Cannibalizing Human Intelligence? A Neuroscientist's Way to Stop It

The AI industry is largely failing to ask a key design question, argues theoretical neuroscientist/cognitive scientist Vivienne Ming. Are their AI products building human capacity or consuming it? In the Wall Street Journal, Ming shares her experiment testing which group performed best at predicting real-world events (compared to forecasters on the prediction market Polymarket) - AI, human, or human-AI hybrid teams. The human groups performed poorly, relying on instinct or whatever information had come across their feeds that morning. The large AI models - ChatGPT and Gemini, in this case - performed considerably better, though still short of the market itself. But when we combined AI with humans, things got more interesting. Most hybrid teams used AI for the answer and submitted it as their own, performing no better than the AI alone. Others fed their own predictions into AI and asked it to come up with supporting evidence. These "validators" had stumbled into a classic confirmation-bias loop: the sycophancy that leads chatbots to tell you what you want to hear, even if it isn't true. They ended up performing worse than an AI working solo. But in roughly 5% to 10% of teams, something different emerged. The AI became a sparring partner. The teams pushed back, demanding evidence and interrogating assumptions. When the AI expressed high confidence, the humans questioned it. When the humans felt strongly about an intuition, they asked the AI to come up with a counterargument... These teams reached insightful conclusions that neither a human nor a machine could have produced on its own. They were the only group to consistently rival the prediction market's accuracy. On certain questions, they even outperformed it... We are building AI systems specifically designed to give us the answer before we feel the discomfort of not having it. What my experiment suggests is that the human qualities most likely to matter are not the feel-good ones. 
They're the uncomfortable ones: the capacity to be wrong in public and stay curious; to sit with a question your phone could answer in three seconds and resist the urge to reach for it. To read a confident, fluent response from an AI and ask yourself, "What's missing?" rather than default to "Great, that's done." To disagree with something that sounds authoritative and to trust your instinct enough to follow it. We don't build these capacities by avoiding discomfort. We build them by choosing it, repeatedly, in small ways: the student who struggles through a problem before checking the answer; the person who asks a follow-up question in a conversation; the reader who sits with a difficult idea long enough for it to actually change their mind. Most AI chatbots today default to easy answers, which is hurting our ability to think critically. I call this the Information-Exploration Paradox. As the cost of information approaches zero, human exploration collapses. We see it in students who perform better on AI-assisted tasks and worse on everything afterward. We see it in developers shipping more code and understanding it less. We are, in ways that feel like progress, slowly optimizing ourselves out of the loop. The author just published a book called "Robot-Proof: When Machines Have All The Answers, Build Better People." They suggest using AI to "explore uncertainty.... before you accept an AI's answer, ask it for the strongest argument against itself." And they're also urging new performance benchmarks for AI-human hybrid teams.

Read more of this story at Slashdot.

26 Apr 2026 4:34am GMT

25 Apr 2026

feedLinuxiac

Void Linux Switches Main NVIDIA Package to Open Kernel Modules

Void Linux now uses NVIDIA's open DKMS kernel modules in its main NVIDIA package, starting with the 595.xx driver series.

25 Apr 2026 9:22pm GMT

Colorado Adds Open-Source Exemption to Age-Attestation Bill

System76's Carl Richell says Colorado SB51 has gained a strong open-source exemption after passing a House committee.

25 Apr 2026 6:37pm GMT

Niri 26.04 Brings Long-Awaited Blur Support to the Wayland Compositor

Niri 26.04 scrollable-tiling Wayland compositor adds long-awaited blur support, improved screencasting, faster rendering, and more.

25 Apr 2026 2:21pm GMT

feedArs Technica

Artemis II broke Fred Haise's distance record, but he is happy to pass it on

"It wasn't a big deal. It just coincided with the fact that the Moon was farther away from the Earth."

25 Apr 2026 11:40am GMT

Palantir employees are talking about company's "descent into fascism"

Slack messages and interviews with current and former workers paint a picture of a company in turmoil.

25 Apr 2026 10:49am GMT

This is who's developing Golden Dome's orbital interceptors—if they're ever built

"If boost-phase intercept from space is not affordable and scalable, we will not produce it."

25 Apr 2026 2:52am GMT