09 May 2026

Slashdot

Fiber Optic Cables Can Eavesdrop On Nearby Conversations

sciencehabit shares a report from Science Magazine: Cold War spies planted bugs in walls, lamps, and telephones. Now, scientists warn, the cables themselves could listen in. A fiber optic technique used to detect earthquakes can also pick up the faint vibrations of nearby speech, researchers reported this week here at the general assembly of the European Geosciences Union. Freely available artificial intelligence (AI) software turned the fiber optic data into intelligible, real-time transcripts. "Not many people realize that [fiber optic cables] can detect acoustic waves," says Jack Lee Smith, a geophysicist at the University of Edinburgh who presented the result. "We show that in almost every case where you use these fibers, this could be a privacy concern." Fiber optics can pick up on sound through a technique called distributed acoustic sensing (DAS). Using a machine called an interrogator, researchers fire laser pulses down a cable and record the pattern of reflections coming back from tiny glass defects along the length of the fiber. When an earthquake's seismic wave crosses a section of the fiber, it stretches and squeezes the defects, leading to shifts in the reflected light that researchers can use to build a picture of an earthquake. DAS essentially turns a fiber cable into a long chain of seismometers that can detect not only earthquakes, but also the rumblings of volcanoes, cars, and college marching bands. And although scientists set up dedicated fiber lines specifically for research, DAS can also be performed on "dark fiber" -- unused strands in the web of fiber optics that runs through cities and across oceans, carrying the world's internet traffic. DAS can also be used to eavesdrop, the work of Smith and his colleagues shows. They conducted a field test using an existing DAS setup deployed to study coastal erosion. They set a speaker next to the cable and played pure tones, music, and speech.
Human speech contains frequencies ranging from a few hundred to several thousand hertz. The low end of the range could be pulled out of the data "even without any preprocessing," Smith says. "You can easily see acoustic waves." Getting higher frequency speech took a bit of postprocessing, but it was possible. Dumping the data directly into Whisper, a free AI transcription tool, provided accurate real-time transcription. However, this technique worked only for coiled cables, exposed at the surface, at distances of up to 5 meters from the speaker. Burying the cable under just 20 centimeters of dirt was enough to muddy the speech. And straight cables -- even exposed ones right next to the speaker -- did not record speech well.
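The band-limiting step the researchers describe, isolating the speech frequencies before transcription, can be sketched with a crude FFT bandpass filter. Everything here is illustrative: the sampling rate, the synthetic "DAS channel", and the tone frequencies are made up, and the hand-off to Whisper is only indicated in a comment.

```python
import numpy as np

def bandpass_fft(signal, fs, lo, hi):
    """Crude bandpass: zero every FFT bin outside [lo, hi] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Hypothetical DAS channel sampled at 4 kHz: a 300 Hz "speech" tone
# buried under much stronger low-frequency ground noise.
fs = 4000
t = np.arange(fs) / fs  # one second of samples
das = np.sin(2 * np.pi * 300 * t) + 10.0 * np.sin(2 * np.pi * 5 * t)

speech = bandpass_fft(das, fs, lo=100, hi=3000)
# The cleaned waveform could then be written out as WAV audio and fed
# to a transcription model such as Whisper.
```

A real pipeline would filter per cable section and resample to what the transcription model expects, but the shape of the postprocessing is the same: suppress the seismic-band energy and keep the speech band.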

Read more of this story at Slashdot.

09 May 2026 7:00am GMT

NASA Keeps Track As Mexico City Sinks Into the Ground

An anonymous reader quotes a report from the Guardian: Walking into Mexico City's sprawling central Zocalo is a dizzying experience. At one end of the plaza, the capital's cathedral, with its soaring spires, slumps in one direction. An attached church, known as the Metropolitan Sanctuary, tilts in the other. The nearby National Palace also seems off-kilter. The teetering of many of the capital's historic buildings is the most visible sign of a phenomenon that has been ongoing for more than a century: Mexico City is sinking at an alarming rate. Now, the metropolis's descent is being tracked in real time thanks to one of the most powerful radar systems ever launched into space. Known as Nisar, the satellite can detect minute changes in Earth's surface, even through thick vegetation or cloud cover. "Nisar takes radar imaging observations of Earth to the next level," said Marin Govorcin, a scientist at Nasa's jet propulsion laboratory. "Nisar will see any change big or small that happens on Earth from week to week. No other imaging mission can claim this." Though not the first time that Mexico City's sinking has been observed from space, the Nisar mission has provided a greater sense of how far the sinking spreads and how it changes across different types of land than any other space-based sensor. It has also been able to penetrate areas on the outskirts of the city that were previously challenging to study because of the complex terrain. The implications of the imagery extend far beyond the Mexican capital. "This study of Mexico City speaks to the realm of possibilities that will open up thanks to the Nisar system," said Dario Solano-Rojas, an engineer at the National Autonomous University of Mexico (Unam). "And not just for sinking cities but also for studying volcanoes, for studying the deformation associated with earthquakes, for studying landslides." 
According to Nasa, the technology is also capable of monitoring the climate crisis, glacier sliding, agricultural productivity, soil moisture, forestry, coastal flooding and more. The Nisar system found that some parts of the city are dropping by more than 2cm a month. "First documented in 1925, the city's sinking is a result of centuries of exploitation of the groundwater," the report says. "Because Mexico City and its surrounds were built on an ancient lake bed, the soil beneath the city is extremely soft. When water is pumped out of the aquifer below, this clay-like earth compacts, resulting in a city that is quietly sinking." The crisis is also self-reinforcing: as the city sinks, aging pipes crack and leak, causing Mexico City to lose an estimated 40% of its water, even as drought and climate change make supplies more fragile.
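The "minute changes" Nisar detects come from radar interferometry, where a phase change between two passes maps to line-of-sight displacement. A minimal sketch of that standard InSAR relation, using an approximate value for Nisar's L-band wavelength (roughly 24 cm):

```python
import math

# Nisar carries an L-band radar; ~24 cm is an approximate wavelength
# used here purely for illustration.
L_BAND_WAVELENGTH = 0.24  # metres

def phase_to_los_displacement(delta_phi, wavelength=L_BAND_WAVELENGTH):
    """Standard InSAR relation: line-of-sight displacement in metres
    for an interferometric phase change of delta_phi radians."""
    return -wavelength * delta_phi / (4 * math.pi)

# One full 2*pi fringe corresponds to half a wavelength of motion:
print(phase_to_los_displacement(2 * math.pi))  # about -0.12 (metres)
```

At the reported rate of more than 2 cm a month, a full fringe's worth of line-of-sight motion accumulates over Mexico City in roughly half a year.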


09 May 2026 3:30am GMT

08 May 2026

OSnews

Google is tying reCAPTCHA to Google Play Services, screwing over de-Googled Android users

The ways in which Google can lock you into their ecosystem are often obvious, but sometimes, they're incredibly sneaky and easily missed. CAPTCHA tests are annoying, but at the same time, they can help protect websites from bots. While these tests are already the bane of our internet existence, they are going to get worse for some Android users. A requirement for Google's next-generation reCAPTCHA system will make it a lot harder for de-Googled phones to browse the web. A Reddit user has highlighted a seemingly innocuous support page for Google's reCAPTCHA system. The page in question relates to troubleshooting reCAPTCHA verification on mobile. In the document, it says that you'll need to use a compatible mobile device to complete verification. If you have an Android phone, then that means you'll need to be running Google Play Services version 25.41.30 or higher. ↫ Ryan McNeal at Android Authority When was the last time you actively thought about reCAPTCHA being a Google property? Even then, when was the last time you imagined something as annoying but ultimately basic as a captcha prompt could be used to tie people to Google Play Services, and thus to "blessed" Android? Every time we manage to work around one of these asinine ties to Google Play Services, another one pops up to ruin our day. We're so stupidly tied down to and entirely dependent on two very mid - at best - mobile operating systems, and it's such a stupid own goal for especially everyone outside of the US to just sit there and do nothing about it. Worse yet, it seems we're only tying ourselves down further, while paying for the privilege. At the very least we should be categorising certain services - government ID services, payment services, popular messaging platforms, and a few more - as vital infrastructure, and legally mandate these services have clearly defined and well-documented APIs so anyone is free to make alternative clients. 
The fact that many people are tied to either iOS or "blessed" Android because of something as stupid as what bank they use or the level of incompetency of their government ID service should be a major crisis in any country that isn't the US. I don't want to use iOS or Android, but nobody is leaving me any choice. It's infuriating.
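The documented requirement is a minimum Play Services version, "25.41.30 or higher", which is a numeric comparison of dotted version components. A hypothetical stand-in for that gate (the function name and the client-side check are invented; the real enforcement happens on Google's side):

```python
def meets_minimum(installed: str, required: str = "25.41.30") -> bool:
    """Compare dotted version strings component by component, numerically.
    A hypothetical stand-in for the Play Services version gate."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)

print(meets_minimum("25.41.30"))  # True
print(meets_minimum("25.9.99"))   # False: 9 < 41 numerically
```

Note that a plain string comparison would get this wrong, ranking "25.9.99" above "25.41.30"; versions have to be compared as integer tuples.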

08 May 2026 11:36pm GMT

Ars Technica

Manufacturing qubits that can move

It's hard to mix electronic manufacturing and flexible geometry.

08 May 2026 11:13pm GMT

Slashdot

Does Fidelity's Reorganization Signal the Beginning of the End for 'Small-Team Agile'?

Longtime Slashdot reader cellocgw writes: Hiding inside another layoff report, Fidelity is reorganizing: "The changes are aimed at moving the teams away from an 'agile' makeup -- comprising smaller, siloed squads -- and toward larger teams built to move faster on projects." OMG, as they say: "Sudden outbreak of common sense." According to the Boston Globe, Fidelity is cutting about 1,000 jobs even as it plans to hire roughly 5,300 new workers, many of them early-career engineers. Half of the 3,300 new workers hired this year "will be in tech or product-related roles," the report says, noting that "about 2,000 of those jobs are currently open, and 400 of them are in tech/product-delivery." "The company also plans to add almost 2,000 new early-career workers, with the goal of making the tech and product-delivery teams more hands-on. In all, that means roughly 5,300 new jobs in the pipeline for Fidelity." The company says AI isn't driving the shift; as cellocgw noted, it's about moving toward larger teams that Fidelity says can move faster on priority projects. The financial services firm also reported a strong 2025 under CEO Abigail Johnson, with managed assets rising 19% from 2024 to $7.1 trillion and revenue climbing 15% to $37.7 billion. "Throughout the company's history, our investments in technology have fueled our growth and customer service capabilities," Johnson wrote in a letter (PDF) included in the company's annual report. "We will continue to prioritize technology initiatives that help us advance digital capabilities, simplify our technology ecosystem, and protect the firm and our customers."


08 May 2026 11:00pm GMT

Ars Technica

Trump reportedly plans to fire FDA Commissioner Marty Makary

The plan isn't final and could change, but his ouster would be no surprise.

08 May 2026 10:10pm GMT

ABC refuses to capitulate to Trump admin, fights FCC probe into The View

FCC chair hasn't been able to bully ABC and owner Disney into submission.

08 May 2026 9:08pm GMT

OSnews

Why don’t lowercase letters come right after uppercase letters in ASCII?

With that context, I always found it strange that the designers of ASCII included 6 characters after uppercase Z before starting the lowercase letters. Then it hit me: we have 26 letters in the English alphabet, plus 6 additional characters before lowercase starts: 26 + 6 = 32. If you know anything about computers, powers of 2 tend to stick out. Let's take a look at the binary representations of some characters compared to their lowercase counterparts. ↫ Tyler Hillery I only have a middling understanding of the rest of the article and thus the ultimate reason why ASCII includes those six characters between Z and a, but I think it comes down to making certain operations on uppercase and lowercase letters specifically more elegant. In some deep crevices of my brain all of this makes sense, but I find it very difficult to truly understand and explain as someone who knows little about programming.
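The payoff Hillery is pointing at can be shown directly: with 'A' at 0x41 and 'a' at 0x61, the two cases differ only in bit 5 (0x20), which is exactly what the six-character gap after 'Z' buys you. A small Python illustration:

```python
# 'a' - 'A' is 32 (0x20): upper and lower case differ in a single bit,
# so case conversion and case toggling become one bitwise operation.
assert ord('a') - ord('A') == 0x20

def ascii_upper(c: str) -> str:
    """Uppercase one ASCII letter by clearing bit 5."""
    return chr(ord(c) & ~0x20) if 'a' <= c <= 'z' else c

def ascii_toggle(c: str) -> str:
    """Swap the case of one ASCII letter by flipping bit 5."""
    return chr(ord(c) ^ 0x20) if c.isascii() and c.isalpha() else c

print(ascii_upper('q'))   # Q
print(ascii_toggle('Z'))  # z
```

Hardware and early software could exploit this: a single AND, OR, or XOR against 0x20 handles case folding, which mattered a great deal on the machines ASCII was designed for.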

08 May 2026 8:52pm GMT

Detecting (or not) the use of -l and -c together in Bourne shells

Many Bourne shells go slightly beyond the POSIX sh specification to also support a '-l' option that makes the shell act as a 'login shell'. POSIX's omission of -l isn't only because it doesn't really talk about login shells at all, it's also because Unix has a special way of marking login shells that goes back very far in its history. The -l option isn't necessarily what login and sshd and so on use, it's something that you can use if you specifically want to get a login shell in an unusual circumstance. Bourne shells also have a '-c <command string>' option that causes the shell to execute the command string rather than be interactive (this is a long standing option that is in POSIX). It may surprise you to hear that most or all Bourne shells that support -l also allow you to use -l and -c together. Basically all Bourne shells interpret this as first executing your .profile and so on, then executing the command string instead of going interactive. One use for this is to non-interactively run a command line in the context of your fully set up shell, with $PATH and other environment variables ready for use. ↫ Chris Siebenmann Now, what if you want to detect the use of these two options combined, for instance to make it so certain parts of your .profile are ignored? It turns out very few Bourne shells actually support this, and that's what Siebenmann's latest post is about.
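The "run a command in the context of your fully set up shell" use combines cleanly into a single invocation. A minimal demonstration, driven from Python and assuming a POSIX sh on PATH (login files such as /etc/profile or ~/.profile may add their own output before the echo):

```python
import subprocess

# -c makes the shell run the given command string instead of going
# interactive; adding -l makes it read the login files first, so $PATH
# and other environment variables below reflect the login setup.
result = subprocess.run(
    ["sh", "-lc", "echo marker:$PATH"],
    capture_output=True, text=True,
)
print(result.stdout)
```

Because the login files run before the command string, the $PATH printed here is the one your .profile established, which is precisely the non-interactive-but-fully-configured behavior the post describes.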

08 May 2026 8:42pm GMT

18 Apr 2026

Planet Arch Linux

Break the loop, move to Berlin

Break the pattern today or the loop will repeat tomorrow.

18 Apr 2026 12:00am GMT

11 Apr 2026

Planet Arch Linux

Write less code, be more responsible

My thoughts on AI-assisted programming.

11 Apr 2026 12:00am GMT

03 Apr 2026

Planet Arch Linux

800 Rust terminal projects in 3 years

I have discovered and shared ~800 open source Rust CLI projects over the past 3 years.

03 Apr 2026 12:00am GMT