
Cognixion’s brain-monitoring headset enables fluid communication for people with severe disabilities


Of the many frustrations of having a severe motor impairment, the difficulty of communicating must surely be among the worst. The tech world has not offered much succor to those affected by things like locked-in syndrome, ALS, and severe strokes, but startup Cognixion aims to change that with a novel form of brain monitoring that, combined with a modern interface, could make speaking and interaction far simpler and faster.

The company’s One headset tracks brain activity closely in such a way that the wearer can direct a cursor — reflected on a visor like a heads-up display — in multiple directions or select from various menus and options. No physical movement is needed, and with the help of modern voice interfaces like Alexa, the user can not only communicate efficiently but freely access all kinds of information and content most people take for granted.

But it’s not a miracle machine or a silver bullet. Here’s how it got started.

Overhauling decades-old brain tech

Everyone with a motor impairment has different needs and capabilities, and there are a variety of assistive technologies that cater to many of these needs. But many of these techs and interfaces are years or decades old — medical equipment that hasn’t been updated for an era of smartphones and high-speed mobile connections.

Some of the most dated interfaces, unfortunately, are those used by people with the most serious limitations: those whose movements are limited to their heads, faces, eyes — or even a single eyelid, like Jean-Dominique Bauby, the famous author of “The Diving Bell and the Butterfly.”

One of the tools in the toolbox is the electroencephalogram, or EEG, which involves detecting activity in the brain via patches on the scalp that record electrical signals. But while they’re useful in medicine and research in many ways, EEGs are noisy and imprecise — more for finding which areas of the brain are active than, say, which sub-region of the sensory cortex or the like. And of course you have to wear a shower cap wired with electrodes (often greasy with conductive gel) — it’s not the kind of thing anyone wants to do for more than an hour, let alone all day every day.

Yet even among those with the most profound physical disabilities, cognition is often unimpaired — as indeed EEG studies have helped demonstrate. It made Andreas Forsland, co-founder and CEO of Cognixion, curious about further possibilities for the venerable technology: “Could a brain-computer interface using EEG be a viable communication system?”

He first used EEG for assistive purposes in a research study some five years ago. He and his colleagues were looking into alternative methods of letting a person control an on-screen cursor, among them an accelerometer for detecting head movements, and tried integrating EEG readings as another signal. But it was far from a breakthrough.

A modern lab with an EEG cap wired to a receiver and laptop – this is an example of how EEG is commonly used.

He ran down the difficulties: “With a read-only system, the way EEG is used today is no good; other headsets have slow sample rates and they’re not accurate enough for a real-time interface. The best BCIs are in a lab, connected to wet electrodes — it’s messy, it’s really a non-starter. So how do we replicate that with dry, passive electrodes? We’re trying to solve some very hard engineering problems here.”

The limitations, Forsland and his colleagues found, were not so much with the EEG itself as with the way it was carried out. This type of brain monitoring is meant for diagnosis and study, not real-time feedback. It would be like taking a tractor to a drag race. Not only do EEGs often rely on a slow, thorough check of multiple regions of the brain that may last several seconds, but the signals they produce are analyzed with dated statistical methods. So Cognixion started by questioning both practices.

Improving the speed of the scan isn’t as simple as overclocking the sensors. Activity in the brain must be inferred by collecting a certain amount of data. But that data is collected passively, so Forsland tried bringing an active element into it: a rhythmic electric stimulation that is in a way reflected by the brain region, but changed slightly depending on its state — almost like echolocation.
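The article doesn’t name Cognixion’s exact method, but the description resembles frequency tagging as used in steady-state evoked potential interfaces: present a stimulus oscillating at a known rate and look for that rate echoed in the recording. A minimal numpy sketch of the detection step, on synthetic data (all signals, rates and thresholds here are illustrative, not Cognixion’s):

```python
import numpy as np

def detect_tagged_frequency(signal, fs, candidate_freqs):
    """Return the candidate stimulation frequency with the most
    spectral power in the recorded signal (a toy evoked-response decoder)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = []
    for f in candidate_freqs:
        # Sum power in a narrow band around each candidate frequency.
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        powers.append(spectrum[band].sum())
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic 2-second recording at 256 Hz: the "brain" echoes a 12 Hz
# stimulus buried in noise.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.8 * rng.standard_normal(t.size)

print(detect_tagged_frequency(eeg, fs, [8, 10, 12, 15]))  # → 12
```

The key property the sketch illustrates is that the tagged frequency is known in advance, so the decoder only has to compare a handful of narrow bands rather than interpret the whole spectrum — one reason an active paradigm can be faster than passive monitoring.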

The Cognixion One headset with its dry EEG terminals visible.

They detect these signals with a custom set of six EEG channels in the visual cortex area (up and around the back of your head), and use a machine learning model to interpret the incoming data. Running a convolutional neural network locally on an iPhone — something that wasn’t really possible a couple years ago — the system can not only tease out a signal in short order but make accurate predictions, making for faster and smoother interactions.
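Cognixion hasn’t published its model, so the following is only a shape-level sketch of the pipeline described: a short window of multi-channel EEG passed through a small convolutional layer and a softmax readout over selectable UI targets. The weights are random and every dimension (6 channels, 128 samples, 8 filters, 4 targets) is an illustrative assumption, not a detail from the company:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, kernels, bias):
    """Valid-mode 1-D convolution across time for multi-channel EEG.
    x: (channels, time); kernels: (filters, channels, width)."""
    f, c, w = kernels.shape
    t_out = x.shape[1] - w + 1
    out = np.empty((f, t_out))
    for i in range(t_out):
        window = x[:, i:i + w]                    # (channels, width)
        out[:, i] = np.tensordot(kernels, window, axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)                   # ReLU

def classify(window, kernels, bias, readout):
    """Score each UI target for one short EEG window (random weights here)."""
    features = conv1d(window, kernels, bias).mean(axis=1)   # global average pool
    logits = readout @ features
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()                        # softmax over targets

# 6 EEG channels, 128 samples (~0.5 s at 256 Hz), 4 selectable UI targets.
window = rng.standard_normal((6, 128))
kernels = 0.1 * rng.standard_normal((8, 6, 9))    # 8 filters, kernel width 9
bias = np.zeros(8)
readout = 0.1 * rng.standard_normal((4, 8))

probs = classify(window, kernels, bias, readout)
print(probs.shape)  # → (4,)
```

In a trained system the highest-probability target would drive the cursor or select a menu item; running this forward pass on-device every fraction of a second is what makes the sub-second latency plausible.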

The result is sub-second latency with 95-100 percent accuracy in a wireless headset powered by a mobile phone. “The speed, accuracy and reliability are getting to commercial levels — we can match the best in class of the current paradigm of EEGs,” said Forsland.

Dr. William Goldie, a clinical neurologist who has used and studied EEGs and other brain monitoring techniques for decades (and who has been voluntarily helping Cognixion develop and test the headset), offered a positive evaluation of the technology.

“There’s absolutely evidence that brainwave activity responds to thinking patterns in predictable ways,” he noted. This type of stimulation and response was studied years ago. “It was fascinating, but back then it was sort of in the mystery magic world. Now it’s resurfacing with these special techniques and the computerization we have these days. To me it’s an area that’s opening up in a manner that I think clinically could be dramatically effective.”

BCI, meet UI

The first thing Forsland told me was “We’re a UI company.” And indeed even such a step forward in neural interfaces as he later described means little if it can’t be applied to the problem at hand: helping people with severe motor impairment to express themselves quickly and easily.

Sad to say, it’s not hard to imagine improving on the “competition,” things like puff-and-blow tubes and switches that let users laboriously move a cursor right, right a little more, up, up a little more, then click: a letter! Gaze detection is of course a big improvement over this, but it’s not always an option (eyes don’t always work as well as one would like) and the best eye-tracking solutions (like a Tobii Dynavox tablet) aren’t portable.

Why shouldn’t these interfaces be as modern and fluid as any other? The team set about making a UI with this and the capabilities of their next-generation EEG in mind.

Image of the target Cognixion interface as it might appear to a user, with buttons for yes, no, phrases and tools.

Image Credits: Cognixion

Their solution takes bits from the old paradigm and combines them with modern virtual assistants and a radial design that prioritizes quick responses and common needs. It all runs in an app on an iPhone, the display of which is reflected in a visor, acting as a HUD and outward-facing display.

Within easy reach, requiring not so much a single thought as a moment’s concentration or a tilt of the head, are everyday questions and responses — yes, no, thank you, etc. Then there are slots to put prepared speech into — names, menu orders, and so on. And then there’s a keyboard with word- and sentence-level prediction that allows common words to be popped in without spelling them out.

“We’ve tested the system with people who rely on switches, who might take 30 minutes to make 2 selections. We put the headset on a person with cerebral palsy, and she typed out her name and hit play in 2 minutes,” Forsland said. “It was ridiculous, everyone was crying.”

Goldie noted that there’s something of a learning curve. “When I put it on, I found that it would recognize patterns and follow through on them, but it also sort of taught patterns to me. You’re training the system, and it’s training you — it’s a feedback loop.”

“I can be the loudest person in the room”

One person who has found it extremely useful is Chris Benedict, a DJ, public speaker, and disability advocate who himself has Dyskinetic Cerebral Palsy. It limits his movements and ability to speak, but doesn’t stop him from spinning (digital) records at various engagements, or from explaining his experience with Cognixion’s One headset over email. (And you can see him demonstrating it in person in the video above.)

DJ Chris Benedict wears the Cognixion Headset in a bright room.

Image Credits: Cognixion

“Even though it’s not a tool that I’d need all the time it’s definitely helpful in aiding my communication,” he told me. “Especially when I need to respond quickly or am somewhere that is noisy, which happens often when you are a DJ. If I wear it with a Bluetooth speaker I can be the loudest person in the room.” (He always has a speaker on hand, since “you never know when you might need some music.”)

The benefits offered by the headset give some idea of what is lacking from existing assistive technology (and what many people take for granted).

“I can use it to communicate, but at the same time I can make eye contact with the person I’m talking to, because of the visor. I don’t have to stare at a screen between me and someone else. This really helps me connect with people,” Benedict explained.

“Because it’s a headset I don’t have to worry about getting in and out of places, there is no extra bulk added to my chair that I have to worry about getting damaged in a doorway. The headset is balanced too, so it doesn’t make my head lean back or forward or weigh my neck down,” he continued. “When I set it up to use the first time it had me calibrate, and it measured my personal range of motion so the keyboard and choices fit on the screen specifically for me. It can also be recalibrated at any time, which is important because not every day is my range of motion the same.”

Alexa, which has been extremely helpful to people with a variety of disabilities due to its low cost and wide range of compatible devices, is also part of the Cognixion interface, something Benedict appreciates, having himself adopted the system for smart home and other purposes. “With other systems this isn’t something you can do, or if it is an option, it’s really complicated,” he said.

Next steps

As Benedict demonstrates, there are people for whom a device like Cognixion’s makes a lot of sense, and the hope is it will be embraced as part of the necessarily diverse ecosystem of assistive technology.

Forsland said that the company is working closely with the community, from users to clinical advisors like Goldie and other specialists, like speech therapists, to make the One headset as good as it can be. But the hurdle, as with so many devices in this class, is how to actually put it on people’s heads — financially and logistically speaking.

Cognixion is applying for FDA clearance to get the cost of the headset — which, being powered by a phone, is not as high as it would be with an integrated screen and processor — covered by insurance. But in the meantime the company is working with clinical and corporate labs that are doing neurological and psychological research. Places where you might find an ordinary, cumbersome EEG setup, in other words.

The company has raised funding and is looking for more (hardware development and medical pursuits don’t come cheap), and has also collected a number of grants.

The One headset may still be some years away from wider use (the FDA is never in a hurry), but that allows the company time to refine the device and include new advances. Unlike many other assistive devices, for example a switch or joystick, this one is largely software-limited, meaning better algorithms and UI work will significantly improve it. While many wait for companies like Neuralink to create a brain-computer interface for the modern era, Cognixion has already done so for a group of people who have much more to gain from it.

You can learn more about the Cognixion One headset and sign up to receive the latest at its site here.


If you don’t want robotic dogs patrolling the streets, consider CCOPS legislation


Boston Dynamics’ robot “dogs,” or similar versions thereof, are already being employed by police departments in Hawaii, Massachusetts and New York. Operating partly under the veil of experimentation, these police forces have given few answers about the benefits and costs of using these powerful surveillance devices.

The American Civil Liberties Union, in a position paper on CCOPS (community control over police surveillance), proposes an act to promote transparency and protect civil rights and liberties with respect to surveillance technology. To date, 19 U.S. cities have passed CCOPS laws, which means, in practical terms, that virtually all other communities don’t require police to be transparent about their use of surveillance technologies.

For many, this ability to use new, unproven technologies in a broad range of ways presents a real danger. Stuart Watt, a world-renowned expert in artificial intelligence and the CTO of Turalt, is not amused.

Even seemingly fun and harmless “toys” have all the necessary functions and features to be weaponized.

“I am appalled both by the principle of the dogbots and by them in practice. It’s a big waste of money and a distraction from actual police work,” he said. “Definitely communities need to be engaged with. I am honestly not even sure what the police forces think the whole point is. Is it to discourage through a physical surveillance system, or is it to actually prepare people for some kind of enforcement down the line?

“Chunks of law enforcement have forgotten the whole ‘protect and serve’ thing, and do neither,” Watt added. “If they could use artificial intelligence to actually protect and actually serve vulnerable people, the homeless, folks addicted to drugs, sex workers, those in poverty and maligned minorities, it’d be tons better. If they have to spend the money on AI, spend it to help people.”

The ACLU is advocating exactly what Watt suggests. In proposed language to city councils across the nation, the ACLU makes it clear that:

The City Council shall only approve a request to fund, acquire, or use a surveillance technology if it determines the benefits of the surveillance technology outweigh its costs, that the proposal will safeguard civil liberties and civil rights, and that the uses and deployment of the surveillance technology will not be based upon discriminatory or viewpoint-based factors or have a disparate impact on any community or group.

From a legal perspective, Anthony Gualano, a lawyer and special counsel at Team Law, believes that CCOPS legislation makes sense on many levels.

“As police increase their use of surveillance technologies in communities around the nation, and the technologies they use become more powerful and effective to protect people, legislation requiring transparency becomes necessary to check what technologies are being used and how they are being used.”

For those not only worried about this Boston Dynamics dog, but all future incarnations of this supertech canine, the current legal climate is problematic because it essentially allows our communities to be testing grounds for Big Tech and Big Government to find new ways to engage.

Just last month, public pressure forced the New York Police Department to suspend use of a robotic dog, quite unassumingly named Digidog. The NYPD had deployed the tech hound at a public housing building in March; this went over about as well as you could expect, leading to public pushback, the dog’s temporary leave and discussions as to the immediate fate of this technology in New York.

The New York Times phrased it perfectly, observing that “the NYPD will return the device earlier than planned after critics seized on it as a dystopian example of overly aggressive policing.”

While these bionic dogs are powerful enough to take a bite out of crime, the police forces seeking to use them have a lot of public relations work to do first. A great place to begin would be for the police to actively and positively participate in CCOPS discussions, explaining what the technology involves, and how it (and these robots) will be used tomorrow, next month and potentially years from now.


Bird Rides to go public via SPAC, at an implied value of $2.3B


Bird Rides, the shared electric scooter startup that operates in more than 100 cities across three continents, said Wednesday it is going public by merging with special purpose acquisition company Switchback II at an implied valuation of $2.3 billion. The announcement confirms earlier reports, including one this week from dot.la, that Bird intended to go public via a SPAC.

Bird said it was able to raise $106 million in private investment in public equity, or PIPE, from institutional investor Fidelity Management & Research Company LLC and others. Apollo Investment Corp. and MidCap Financial Trust provided an additional $40 million in asset financing.

The transaction will enable the combined entity to retain net proceeds of up to $428 million of cash, according to Switchback, which brings $316 million cash-in-trust to the table. The announcement also provided new information about a previously undisclosed $208 million, which Bird raised privately as part of an April 2021 Senior Preferred Convertible equity offering led by Bracket Capital, Sequoia Capital and Valor Equity Partners.

When and how Bird would go public has been an item of speculation after Bloomberg reported last November that the company received “inbound interest” from SPACs.

Bird’s ride has been bumpy at times. In 2020, revenue dropped 37% from the previous year, to $95 million. That year the company also laid off around 30% of its workforce – 406 people – to cut costs. The company may use this new access to cash to expand its European operations and pay off debt.

Most importantly, the new injection of cash may help the company finally achieve profitability, a rarity amongst scooter startups, which face notoriously high overhead.

Special purpose acquisition companies, or SPACs, have become a popular route for going public amongst transportation startups. Already this year, scooter company Helbiz, which is based in Europe and the U.S., went public via SPAC in a merger with GreenVision Acquisition Corp. SPAC shell corporations allow companies to list on the NASDAQ without doing a traditional initial public offering.


Only three days left to buy $99 passes to TC Disrupt 2021


The countdown clock keeps on ticking, and you have just three days to secure your $99 pass to TechCrunch Disrupt 2021. You read that right — $99 is all that you’ll pay, $99 is all (everybody sing)!

Silly Minions aside, you’ll snag serious savings if you buy your Disrupt 2021 pass before the deadline expires on May 14 at 11:59 pm (PT).

TechCrunch Disrupt is a massive gathering of the tech startup world’s top leaders, innovators, makers, investors, founders and groundbreakers. The all-virtual platform means more global participation and exposure. It’s all designed to help early-stage founders — and the people who invest in them — build a thriving business.

The Disrupt stage features in-depth interviews and panel discussions with a who’s-who of tech talent. The Extra Crunch stage is where you’ll find a deep bench of subject-matter experts sharing practical how-to content. You’ll take away actionable insights you can put into practice now — when you need it most. Check out our roster of speakers — we’re adding more every week.

Granted, we might be a tad biased about Disrupt — of course we think it’s awesome. But your contemporaries recognize its value, too. Here’s what a few of them told us about their experience at Disrupt 2020.

There was always something interesting going on in one of the breakout rooms, and I was impressed by the quality of the people participating. Partners in well-known VC firms spoke, they were accessible, and they shared smart, insightful nuggets. You will not find this level of people accessible and in one place anywhere else. — Michael McCarthy, CEO, Repositax.

I loved the variety of topics and learning about recent technology trends as they’re happening. Disrupt gave me a whole new perspective on the ways innovation happens in big companies. — Anirudh Murali, co-founder and CEO, Economize.

Watching the Startup Battlefield was fantastic. You could see the ingenuity and innovation happening in different technology spaces. Just looking at the sheer number of other pitch decks and hearing the judges tear them down and give feedback was very helpful. — Jessica McLean, Director of Marketing and Communications, Infinite-Compute.

If watching Startup Battlefield is thrilling (and it is), imagine what it would feel like to compete — or to win. We’re still accepting applications but not for long. Want to take a shot at winning $100,000? Apply to compete in Startup Battlefield before May 13 at 11:59 pm (PT).

There’s so much more opportunity waiting for you at Disrupt 2021. Explore Startup Alley, our expo area. Better yet, exhibit there yourself and, in addition to a bunch of other perks, you might be one of only 50 exhibiting startups chosen to participate in the Startup Alley+ VIP experience. Read more about Startup Alley+ here. TechCrunch will notify selected startups at the end of June.

Time is running out, and $99 is all that you’ll pay — if you buy your Disrupt 2021 pass before Friday, May 14 at 11:59 pm (PT).

Is your company interested in sponsoring or exhibiting at Disrupt 2021? Contact our sponsorship sales team by filling out this form.

