Scientist invents shoe with tiny sensors to help mentor, 89, avoid falling

When a kind-hearted engineer noticed his 89-year-old mentor was unsteady on his feet, he sprang into action and created a futuristic shoe which could one day help him – and scores of other older people – keep their balance.

Peter Langlois’s special shoe features an ingenious insole embedded with hundreds of tiny sensors offering lab-quality, real-time data about his gait which can be displayed on a tablet or mobile phone.

The concept is so clever that University of Bristol inventor Dr Jiayang Li’s prototype will be demonstrated this week to industry experts.

Dr Li, a lecturer in electrical engineering, said: “Peter has been a huge champion of my work since I started my PhD and it’s amazing that he still meticulously edits the research papers of my research group even at the age of 89.

“His mind remains extremely sharp and his dedication is so inspiring.

“One day I noticed he was unsteady on his feet and almost lost his balance.

“It got me thinking this is very risky and could have terrible consequences if it resulted in a fall, especially for people who live alone.

“Then I wondered if the semiconductor technology we’re working on might actually be able to help.”

Dr Li’s previous work developed advanced sensors to more accurately measure people’s lung function and pinpoint how their breathing is restricted.

“I realised we could apply similar techniques to monitor how well people are walking,” he said.

“Mapping their leg gestures in detail could detect risk of falls, helping people like Peter stay safe while also keeping their independence at home.

“Although this highly detailed analysis could be obtained in hospital, the challenge was to make the technology more mobile and accessible in everyday life.

“That’s what makes our shoe so special and such a huge leap forward.”

The science involved creating an advanced microchip – a semiconductor device – capable of reading all 253 of the tiny sensors on the shoe sole simultaneously.

The data gathered is used to generate images of the person’s foot, highlighting pressure points and assessing whether they are walking in a balanced way or in danger of falling.
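The article does not describe the data format, but the step from raw readings to a pressure image can be sketched. A minimal illustration, assuming each of the 253 sensors sits at a known (x, y) position on the insole and readings are binned into a coarse grid – the layout, grid size, and values below are all hypothetical:

```python
# Illustrative sketch only: sensor positions, readings, and the 8x16 grid
# size are assumptions, not details from the article.
import random

GRID_W, GRID_H = 8, 16  # assumed resolution of the pressure image

# Fake layout and readings for the demo: 253 sensors scattered over a
# unit-square insole footprint, each reporting a pressure value in [0, 1).
random.seed(0)
sensors = [(random.random(), random.random()) for _ in range(253)]
readings = [random.random() for _ in sensors]

# Rasterise: average the readings that fall into each grid cell.
sums = [[0.0] * GRID_W for _ in range(GRID_H)]
counts = [[0] * GRID_W for _ in range(GRID_H)]
for (x, y), p in zip(sensors, readings):
    col = min(int(x * GRID_W), GRID_W - 1)
    row = min(int(y * GRID_H), GRID_H - 1)
    sums[row][col] += p
    counts[row][col] += 1

pressure_image = [
    [sums[r][c] / counts[r][c] if counts[r][c] else 0.0 for c in range(GRID_W)]
    for r in range(GRID_H)
]
```

A heatmap of `pressure_image` over successive steps is the kind of balance picture the article describes being shown on a tablet or phone.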

To make the device user-friendly, it runs on a low-voltage battery so it can in principle be powered by small screen devices, including a mobile phone or even a smart watch.

“The power of the microchip is just 100 microwatts so the device could run for around three months before it needs recharging,” Dr Li said.
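The quoted figures can be sanity-checked with back-of-envelope arithmetic. A sketch assuming a common CR2032 coin cell (~225 mAh at 3 V) and a total device draw of ~300 microwatts – the 100-microwatt chip plus assumed sensor and radio overhead; neither the battery type nor the total draw is specified in the article:

```python
# Back-of-envelope battery-life check. The battery capacity and total
# draw are assumptions (not from the article); only the 100 uW chip
# figure and the ~3 month claim come from the source.
CAPACITY_MAH = 225       # assumed CR2032 coin cell
VOLTAGE_V = 3.0
TOTAL_DRAW_W = 300e-6    # assumed: chip (100 uW) + sensors + radio

energy_j = CAPACITY_MAH / 1000 * VOLTAGE_V * 3600  # mAh -> joules
runtime_days = energy_j / TOTAL_DRAW_W / 86400

print(f"Stored energy: {energy_j:.0f} J")
print(f"Estimated runtime: {runtime_days:.0f} days (~{runtime_days / 30:.1f} months)")
```

Under these assumptions the estimate lands at roughly 94 days, consistent with the "around three months" claim.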

“Fall prevention is a huge challenge for ageing populations, so the potential to anticipate and avoid that happening with our invention is really exciting.

“When I explained the concept to Peter, he was really touched and is pleased it might one day be manufactured and used to help so many people.”

The science behind the device will be showcased at the Institute of Electrical and Electronics Engineers conference on Wednesday.

“The concept could easily be mass produced, creating a low-cost shoe sole which could transform older people’s lives,” Dr Li said.

“Next, we’ll run a formal clinical evaluation with a larger and more diverse group to validate how well it predicts fall risk, refine the analysis provided by the device it’s connected to, and work with clinical and industry partners to translate it into a scalable product.”

AI camera that ‘restores’ and much more sails to Samsung’s Galaxy Unpacked

If your excitement is about to boil over, Samsung’s latest Galaxy camera announcement might just push it to its tipping point.

Today (Feb 17), Samsung announced that it’s going to unveil a Galaxy camera experience designed to “unify photo & video capturing, editing, and sharing into one intuitive system.” The company plans to lean on its AI software heavily for this, bringing capabilities that can reportedly completely transform your photos and videos.

However, these transformations can also be steered by what the user wants.

Samsung teases that its next generation of Galaxy cameras can not only “restore” missing parts of an image, but also “merge” multiple photos into a piece that looks like it was always whole. Elsewhere, Samsung says users can turn daylight photos into night shots and “capture detailed photos in low light.”

The teasers don’t end there, as Samsung’s camera software lets you turn real images into stickers, alongside the option of drawing in your desired addition, and watching its AI make it real. If you want to really stress that you saw a UFO capturing a cow, well, you can do that. Samsung highlights its software’s ability to get this all done “within minutes,” pushing other apps out of the user’s mind.

These AI-fueled camera advancements prepare to sail the Galaxy on February 25 at 1 pm ET/10 am PT.

Unpacked is on the way

A week ago, Samsung formally announced that its Galaxy Unpacked event for the Galaxy S26 and more will take place on February 25. The action’s going down in San Francisco, California. Samsung teased that its event will showcase all the ways users can “connect, create, and immerse” themselves in its new technology.

More importantly, the company’s reservation program is already underway. The announcement said consumers can reserve “the latest Galaxy devices to receive a $30 credit to use during pre-order and be entered for a chance to win a $5,000 gift card to use on Samsung.com. Plus, customers can receive up to $900 in additional savings with a trade-in or receive a $150 credit with no trade-in when you reserve and pre-order on Samsung.com.”

Android Central’s Take

I’d be lying if I said I was surprised by Samsung’s announcement. There’s already a healthy selection of AI-powered tools that users can lean on with the Galaxy S25 series. Now, the company’s just looking to pile it on. I’ve never really been one to use these on a daily basis. It’s nice to see what they “can do” for photography, but I’ve never found myself gravitating toward them again and again. It looks like Samsung is trying to change that by putting these features’ existence and cleverness in the minds of its users so that they return to them. I’ll have to see them in-hand to see if it sticks.

Samsung just teased a huge camera system upgrade that could be an AI game-changer for the Galaxy S26

  • Samsung just announced a brand new camera system
  • It’s sure to be a part of the anticipated Galaxy S26 lineup
  • The camera system update should more seamlessly integrate all those Galaxy AI imaging tools

What was once separate could become one, and what was once confusion could finally offer clarity: that’s the potential promise of Samsung’s next big camera system for its upcoming, anticipated lineup of Galaxy S26 smartphones.

After announcing last week that it will hold its big Winter Unpacked event in San Francisco on February 25, where it is expected to reveal new Galaxy S26 smartphones (along, possibly, with Galaxy Buds and a Galaxy Watch update), Samsung is now teasing out some details about the big launch.

In a brief release backed up by a handful of revealing video demos, Samsung said it plans to “unveil a new Galaxy camera experience designed to unify photo & video capturing, editing, and sharing into one intuitive system.”

Among the promised features are:

  • Turn a photo from day to night
  • Restore missing parts of objects
  • Capture detailed photos in low light
  • Merge multiple photos into a single image

This sounds like a mix of old and new features. We could already sketch on images to create new elements, like the spaceship over the cow shown in this GIF.

However, the quick replacement of a bite out of a cupcake to make it whole again is a leveling up of AI capabilities.

What’s more interesting here is the potential to have Galaxy AI’s image editing and enhancement tools more deeply integrated with the base camera system. Right now, for instance, the AI editing features live under a Galaxy AI button, and even there, the sketch to image and Generative Edit are separate elements in the tool.

Could all these disparate pieces be fused into one cohesive camera system? Could they appear as tools during image capture?

Here’s how Samsung describes it: “The latest Galaxy AI experiences will bring advanced creative tools to one place, eliminating the need to switch between apps and navigate complex editing software.”

Like its partner Google, Samsung has leaned into generative image manipulation in a way that, say, Apple and its iPhone have not. There’s the “Clean Up” tool in iOS 26’s Photos app, but that element-removal tool is about as far as Apple is currently willing to go in the AI space. For years, we’ve been able to sketch a rudimentary dog on any Galaxy image and let Galaxy AI generate a lifelike dog that looks like it was always part of the photo.

The integration of these powerful AI tools is not unexpected, but it does signal that Samsung is willing to let the AI experience become more seamless until they’re no longer seen as these separate and maybe esoteric things: it’s all just part of the Galaxy S26 phone’s capabilities.

I’m curious if this new camera system also means the tighter integration of all those tools currently hidden under the Galaxy Camera app’s “More” menu. This includes the “Pro” tools, “Pro Video,” “Single Take,” “Panorama,” and more.

In a similar vein, I wonder how deeply Samsung plans to integrate video shooting and editing. While it mentions video, Samsung offered no details on what changes we can expect in the handling of moving images.

Whatever these changes entail, one thing is clear: the Samsung Galaxy S26 line will combine its hardware camera updates (no one is expecting major lens changes) with a brand new camera platform, one that could be equal parts optical and AI.

Samsung Galaxy Unpacked is sure to be a fascinating unveil. Stick with Sohh.com, which will be on the ground in San Francisco on February 25 at 10 AM PT / 1 PM ET / 6 PM GMT (5 AM AEDT on the 26th).

Aussies outraged over device that blasts irritating noise on a bridge

  • Shrill, noisy devices installed in Perth underpass
  • The technology deters people who are homeless 
  • But it has been condemned by Australians

Aussies have erupted over a device that blasts loud, piercing noises to dissuade homeless people from camping under a bridge.

Perth Council had the technology set up at the Lord Street Bridge underpass near a train line in East Perth over the weekend.

The device emits a shrill, buzzing sound at a pitch that is designed to be uncomfortable for people’s ears.

It is understood the device was then turned off by Tuesday morning.

Jesse Noakes, who runs the independent media publication The Last Place on Earth, shared a recording of the irritating noise.

‘It’s really piercing. It’s almost painful, and that’s the whole idea,’ he said, fingers in his ears. 

‘[Council] want it to be painful for people who are thinking of coming and rolling out a sleeping bag, or chucking a tent on the side of this bike path.

‘That is an audible anti-homeless device. Hostile architecture, built in. 

‘What they’ve done is install some kind of device that is emitting a high-pitched buzzing noise, a shriek.’

Hostile architecture – also referred to as anti-homeless design – is the adaptation of public spaces to deter people from loitering or sleeping.

This can include spikes in doorways, dividers on benches or sloped seats to stop someone from lying down.

Social media users were quick to condemn the devices, highlighting it would be distressing for animals as well as people.

‘That is a disgusting thing for the council to do,’ one said. 

Another said: ‘How about they spend money on actually helping the homeless.’

A third added: ‘In their effort to prevent homeless people from sleeping rough in certain places, they’ve made it hostile to all people. 

‘What about native animals in the area? Are they damaged too? It’s cruel.’

Matthew Swain, who has been sleeping rough for two years, said he has heard the noise in several areas of Perth.

‘It’s definitely really loud under the underpass,’ he told 7News.

‘I don’t go and stay there because I don’t like encroaching on other people’s spots, and that’s like a known spot for, you know, some crew.’ 

‘I couldn’t stay there with that noise and like where I went to stay last night and was setting up in one of the car parks, one of the Wilson’s car parks, I noticed the noise, not quite at that level, but yeah, had to leave pretty much straight away.’

Daily Mail contacted Perth Council for comment.

The Public Transport Authority (PTA), which manages the sound devices, has confirmed the sounds are no longer being emitted.

‘The City of Perth requested the installation of a noise device at the Lord Street underpass,’ a spokesperson told the Daily Mail.

‘The noise device has been turned off. We will be asking the City of Perth to determine a more suitable way forward.’

The council has previously said the device was part of a ‘broad safety approach’ after ‘ongoing reports of antisocial and criminal behaviour and community concerns about safety in the area’.

‘The City’s focus remains on improving safety and amenity for all users of the area,’ the spokesman told news.com.au.

‘This includes increased security patrols, the installation of CCTV and the deployment of mobile CCTV trailers to support safer access and use of the underpass.’

This is not the first time WA has seen a controversial use of loud sounds to deter homeless people.

In 2023, the City of Bunbury played The Wiggles’ Hot Potato on loop at the Graham Bricknell Music Shell outdoor stage in the town centre, south of Perth.

But that was switched off after the Wiggles intervened, saying the music was created to bring joy to children.

The band said it was disappointed to hear it was being used for another purpose.
