Envisioning better health outcomes for all

The covid-19 pandemic has shone a spotlight on longstanding health inequities for people of color. According to the Centers for Disease Control and Prevention, compared with the general United States population, African Americans are 1.4 times as likely to contract the coronavirus and 2.8 times as likely to die from covid-19. Similarly, Native Americans and Hispanics/Latinos are nearly twice as likely to be infected and 2.5 to 2.8 times as likely to die from it.

Underlying these statistics are significant structural, social, and spatial issues. But why is this? And how do we begin to quantify and address the nested problems of public health inequality?

Understanding the geography of health inequity

One tool that can help us understand the higher coronavirus infection and death rates among people of color is mapping produced by a geographic information system (GIS). GIS correlates geography with key issues by layering relevant, sometimes seemingly disparate, data to bring clarity to complex situations.

For instance, one of the first things GIS users and epidemiologists mapped in the pandemic was the locations of vulnerable populations. Each layer of data took into account a different contributing factor to that vulnerability. These include potential exposure through essential jobs; disease susceptibility for seniors and people with certain health conditions; the risk of transmission for public transit commuters and those in group living situations; and socioeconomic disadvantages through poverty, inadequate education, and lack of health insurance. The dynamic analyses that GIS enabled immediately guided actions by first responders and gave epidemiologists an evidence-based way to assess vulnerability against hospital accessibility and capacity.
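To make the layering idea concrete, here is a minimal sketch of how disparate tabular indicators keyed to the same geography can be combined into a composite vulnerability score using pandas. The column names, weights, and values are illustrative only; this is not the CDC's vulnerability methodology or any particular Esri workflow.

```python
# A minimal, hypothetical sketch of "layering" indicators by census tract and
# combining them into a composite vulnerability score. All data are made up.
import pandas as pd

# Each "layer" is a table keyed by the same geography (tract ID).
exposure = pd.DataFrame({"tract": ["A", "B", "C"],
                         "pct_essential_workers": [0.35, 0.18, 0.27]})
susceptibility = pd.DataFrame({"tract": ["A", "B", "C"],
                               "pct_over_65": [0.12, 0.22, 0.09],
                               "pct_chronic_conditions": [0.28, 0.31, 0.19]})
socioeconomic = pd.DataFrame({"tract": ["A", "B", "C"],
                              "pct_below_poverty": [0.24, 0.11, 0.33],
                              "pct_uninsured": [0.15, 0.07, 0.21]})

# Stack the layers by joining them on the shared geography key.
layers = exposure.merge(susceptibility, on="tract").merge(socioeconomic, on="tract")

# Normalize each indicator to 0-1 and average into a simple composite index.
indicators = layers.drop(columns="tract")
normalized = (indicators - indicators.min()) / (indicators.max() - indicators.min())
layers["vulnerability_index"] = normalized.mean(axis=1)

print(layers.sort_values("vulnerability_index", ascending=False))
```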

As awareness of the disproportionate number of deaths in communities of color grew, the same tool was applied to understand the causes behind this inequity, which, in turn, can aid in defining and developing potential solutions.

Mapping covid-19 cases across Europe

It has long been understood that people living in inner cities face conditions with clear correlations to overall health. These include income and education disparities, a low percentage of home ownership, increased exposure to neighborhood pollution, and reduced access to wellness care and reasonably priced fresh food. Another important dataset relevant to the covid crisis is the disproportionate percentage of people of color in service jobs that put them in daily close contact with the virus.

“GIS can help identify where outcome disparities exist, perform analysis to understand root causes, and focus mitigation efforts on places where systemic racism concentrates causal factors,” says Este Geraghty, chief medical officer and health solutions director at GIS vendor Esri. By analyzing all relevant data on a GIS-based smart map, Geraghty says leaders are poised to uncover localized insights that drive potential solutions. This means, “we can provide stopgaps until we have fully equitable systems, ensuring that one day everyone will have the same opportunity to reach their full health potential.”

Geraghty adds, “If you can’t understand all of the contributing factors in context, you might not anticipate potential problems or solutions.”

GIS for effective covid-19 vaccine distribution

Another pandemic-related problem tied closely to geography is how to get covid vaccines to the public in an equitable, safe, and effective manner. GIS provides the tools to analyze prioritized needs, plan distribution networks, guide deliveries, see the real-time status of inoculation missions, and monitor overall progress.

Geraghty developed a covid vaccine distribution approach using GIS. She explains that the first step is to map those facilities currently suitable for distributing the vaccine to the public. Since some vaccines need ultra-cold storage, facilities will have to be differentiated according to that and other storage capabilities. As part of the facility dataset, Geraghty says, GIS can also be used to calculate how many vaccines each facility’s staff can potentially administer in a day. In addition to hospitals, other facility types will need to be considered based on their ability to deliver the vaccine to underserved and remote populations. Facilities might include university health clinics, independent and retail pharmacies, and potentially even work sites willing and able to inoculate employees, among others.

The next step involves mapping the population—not only their locations and numbers, but also according to the categories recommended by the CDC guidance and state-based plans for the phased rollout of the vaccine.

By correlating these two layers of data on the map (facilities and population), it becomes clear which communities aren’t within a reasonable travel time to a vaccination location, based on multiple modes of travel (for example, driving, walking, public transit).
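As a rough illustration of that gap-finding step, the sketch below overlays a hypothetical facility layer and population layer and flags communities with no vaccination site within reach. Straight-line distance stands in for the multi-modal travel-time analysis a real GIS would perform; every location, name, and threshold here is made up.

```python
# Hypothetical sketch of the coverage-gap analysis described above.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

facilities = [  # (name, lat, lon) -- illustrative only
    ("County Hospital", 34.05, -118.25),
    ("Retail Pharmacy #12", 34.20, -118.45),
]
communities = [  # (name, lat, lon) -- illustrative only
    ("Downtown", 34.04, -118.24),
    ("Foothill Village", 34.60, -118.10),  # a more remote community
]

MAX_DISTANCE_KM = 15  # crude proxy for "reasonable travel time"

for name, lat, lon in communities:
    nearest = min(haversine_km(lat, lon, f_lat, f_lon) for _, f_lat, f_lon in facilities)
    status = "covered" if nearest <= MAX_DISTANCE_KM else "not covered (gap)"
    print(f"{name}: nearest site {nearest:.1f} km away -- {status}")
```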

Geraghty explains, “That geographic perspective will help find any gaps. Who is left out? Where are the populations that aren’t within the range of identified facilities?” This is where GIS can improve decision-making by finding options to fill gaps and make sure that everyone has access to the vaccine.

In areas where GIS analysis identifies “gaps” on the map, such as communities or rural areas that aren’t being reached, Geraghty envisions pop-up clinics in places like school gyms, or drive-throughs in large parking lots, or, in some circumstances, personal outreach. For example, Geraghty explains, “People experiencing homelessness may be less likely to show up at a clinic to get a vaccine, so you may have to reach out to them.”

Public communication about vaccination progress offers another opportunity for mapping and spatial thinking. For example, an updated map could give a clear picture of how many people have been vaccinated in different parts of a state or county. The same map could help people figure out when it’s their turn to be vaccinated and where they can go to receive their vaccine. Maps could even help community residents compare wait times among different facilities to guide their choices and offer the best possible experiences.

Geraghty says that organizing covid vaccine distribution in this way can represent hope for people. “If we take this logical and strategic perspective, we can be more efficient in vaccine delivery and enjoy our normal activities much sooner.”

Vulnerable populations, geographic insights

Long before the world was forced to grapple with covid, the value of a geographic perspective in solving public health and social issues was clear. Using GIS to address homelessness is one example.

In Los Angeles County, GIS has been used to map the homeless population by location and to document and analyze the risk factors that create homelessness in each community. GIS analysis revealed that the predominant risk factor in the northern, and especially northwestern, part of the county was post-traumatic stress disorder (PTSD) among veterans. Conversely, in the northeast, the predominant factor driving new homelessness was domestic violence, with women and children fleeing abusive homes.

In Snohomish County, Washington, health-care workers hit the streets to gather the data needed for such risk-factor mapping. They used GIS to perform the biannual survey and census of homeless people, gathering details on the conditions and needs of 400 people in short order. They collected standard information, such as the ages of people in camps and whether any were veterans, and also recorded conditions they observed, such as needles used for drugs.

Once location-specific differences like these are identified, appropriate resources can be deployed on a community-by-community basis, such as targeted social and health services to help specifically with domestic violence, PTSD, addiction, joblessness, or other identified root causes. “Using a geographic perspective, you can allocate resources, which are always limited, in ways that do the most good,” Geraghty says.

Lessons from the pandemic

Disparities related to living conditions, location, and genetics have always been factors in disease spread and mortality, but they have never been tracked, measured, and analyzed on such a scale. Even so, confronting the covid crisis has been an ongoing game of catch-up, trying to find and correlate critical data to save lives, and Geraghty doesn’t want to see that level of frenetic activity repeated.

“Building strong public health preparedness systems means having foundational data ready,” she explains. “For instance, where, relative to the population, are the hospitals, the shelters, blood banks, and key infrastructure? Who are the community players and partners, and what services can they provide, and where?” In March, at the start of the pandemic, there was no comprehensive map showing how many beds each hospital had, what percentage were intensive care beds, how many ventilators were available, and how much personal protective equipment was easily obtainable, and from where. “For anything that is health-related infrastructure,” explains Geraghty, “you should have a baseline map and data that you keep updated, as well as population demographic data.”
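As a sketch of what keeping such foundational data current might look like in code, the example below defines a simple per-facility record of the kind a baseline map could draw on. The field names and figures are hypothetical, not a standard public-health schema.

```python
# Hypothetical record for the "foundational data" described above.
from dataclasses import dataclass

@dataclass
class HealthFacility:
    name: str
    lat: float
    lon: float
    total_beds: int
    icu_beds: int
    ventilators: int
    ppe_days_on_hand: float  # days of supply at the current burn rate

    @property
    def icu_share(self) -> float:
        """Fraction of beds that are intensive care beds."""
        return self.icu_beds / self.total_beds if self.total_beds else 0.0

inventory = [
    HealthFacility("County Hospital", 34.05, -118.25, 400, 60, 45, 12.5),
    HealthFacility("Regional Medical Center", 34.20, -118.45, 250, 30, 20, 6.0),
]

for f in inventory:
    print(f"{f.name}: {f.icu_share:.0%} ICU beds, {f.ppe_days_on_hand} days of PPE")
```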

The crisis has also brought other issues to light. For example, better and more data sharing is needed, along with clearer governance over which data are acceptable to share, so that nothing delays essential communications among institutions in the next crisis. Improved system interoperability, ensuring key systems can work together to keep data fresh and reaction times quick, should also be a priority. The covid-19 pandemic has been a tragedy in terms of the human toll. But if we can learn from it, perhaps we can make corrections so that all communities and future generations can look forward to better, longer, and healthier lives.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Personal skin problems lead founder to launch skincare startup Nøie, raises $12M Series A

Inspired by his own problems with skin ailments, tech founder Daniel Jensen decided there had to be a better way. So, using an in-house tech platform, his Copenhagen-based startup Nøie developed its own database of skin profiles, to better care for sensitive skin.

Nøie has now raised $12M in a Series A funding round led by Talis Capital, with participation from Inventure, as well as existing investors including Thomas Ryge Mikkelsen, former CMO of Pandora, and Kristian Schrøder Hart-Hansen, former CEO of LEO Pharma’s Innovation Lab.

Nøie’s customized skincare products target sensitive skin conditions including acne, psoriasis and eczema. Using its own R&D, Nøie says it screens thousands of skincare products on the market, selects what it thinks are the best, and uses an algorithm to assign customers to their ‘skin family’. Customers then get recommendations for customized products to suit their skin.
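Nøie has not published how its matching algorithm works, so the following is only a generic sketch of the kind of "skin family" assignment the article describes: score a customer's questionnaire responses and match them to the nearest stored profile. Every feature, profile, and value here is hypothetical.

```python
# Generic nearest-profile assignment; not Nøie's actual algorithm or data.
import numpy as np

# Hypothetical "skin family" centroids: (oiliness, sensitivity, breakout frequency), each 0-1.
skin_families = {
    "dry_sensitive":   np.array([0.2, 0.8, 0.3]),
    "oily_acne_prone": np.array([0.8, 0.4, 0.9]),
    "balanced":        np.array([0.5, 0.3, 0.2]),
}

def assign_skin_family(answers: np.ndarray) -> str:
    """Return the family whose centroid is closest (Euclidean distance) to the answers."""
    return min(skin_families, key=lambda fam: np.linalg.norm(answers - skin_families[fam]))

customer = np.array([0.75, 0.5, 0.85])  # scores from a hypothetical questionnaire
print(assign_skin_family(customer))     # -> "oily_acne_prone"
```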

Skin+Me is probably the best-known perceived competitor, but that company is a prescription provider; Nøie is non-prescription.

Jensen said: “We firmly believe that the biggest competition is the broader skincare industry and the consumer behavior that comes with it. I truly believe that in 2030 we’ll be surprised that we ever went into a store and picked up a one-size-fits-all product to combat our skincare issues, based on what has the nicest packaging or the best marketing. In a sense, any new company that emerges in this space are peers to us: we’re all working together to intrinsically change how people choose skincare products. We’re all demonstrating to people that they can now receive highly-personalized products based on their own skin’s specific needs.”

Of his own problems to find the right skincare provider, he said: “It’s just extremely difficult to find something that works. When you look at technology, online, and all our apps and everything, we got so smart in so many areas, but not when it comes to consumer skin products. I believe that in five or 10 years down the line, you’ll be laughing that we really used to just go in and pick up products just off the shelf, without knowing what we’re supposed to be using. I think everything we will be using in the bathroom will be customized.”

Beatrice Aliprandi, principal at Talis Capital, said: “For too long have both the dermatology sector and the skincare industry relied on the outdated ‘one-size-fits-all’ approach to addressing chronic skin conditions. By instead taking a data-driven and community feedback approach, Nøie is building the next generation of skincare by providing complete personalization for its customers at a massive scale, pioneering the next revolution in skincare.”

Comms expert and VC Caryn Marooney will detail how to get attention at TC Early Stage

We’re thrilled to announce Caryn Marooney is speaking at our upcoming TechCrunch Early Stage virtual event in July. She spoke with us last year and we had to have her back.

Just look at her resume. She was the co-founder and CEO of The Outcast Agency, one of Silicon Valley’s best-regarded public relations firms. She left her company to serve as VP of Global Communications at Facebook, a role she held for eight years, overseeing communications for Facebook, Instagram, WhatsApp and Oculus. In 2019 she joined Coatue Management as a general partner, where she went on to invest in Starburst, Supabase, Defined Networks and others.

Needless to say, Marooney is one of the Valley’s experts on getting people’s attention — a skill that’s critical when running a startup, nonprofit or school bake sale.

She said it best last year: “People just fundamentally aren’t walking around caring about this new startup — actually, nobody does.” So how do you get people to care? That’s the trick and why we’re having her back to speak on this evergreen topic.

Watch her presentation from 2020 here. It’s fantastic.

One of the great things about TC Early Stage is that the show is designed around breakout sessions, with each speaker leading a chat around a specific startup core competency (like fundraising, designing a brand, mastering the art of PR and more). Moreover, there is plenty of time for audience Q&A in each session.

Pick up your ticket for the event, which goes down July 8 and 9, right here. And if you do it today, you’ll save a cool $100 off of your registration.


 

Lightmatter’s photonic AI ambitions light up an $80M B round

AI is fundamental to many products and services today, but its hunger for data and computing cycles is bottomless. Lightmatter plans to leapfrog Moore’s law with its ultra-fast photonic chips specialized for AI work, and with a new $80M round the company is poised to take its light-powered computing to market.

We first covered Lightmatter in 2018, when the founders were fresh out of MIT and had raised $11M to prove that their idea of photonic computing was as valuable as they claimed. They spent the next three years and change building and refining the tech — and running into all the hurdles that hardware startups and technical founders tend to find.

For a full breakdown of what the company’s tech does, read that feature — the essentials haven’t changed.

In a nutshell, Lightmatter’s chips perform certain complex calculations fundamental to machine learning in a flash — literally. Instead of using charge, logic gates, and transistors to record and manipulate data, the chips use photonic circuits that perform the calculations by manipulating the path of light. The approach has been possible for years, but only recently has it become feasible at scale, and for a practical, indeed highly valuable, purpose.

Prototype to product

It wasn’t entirely clear in 2018 when Lightmatter was getting off the ground whether this tech would be something they could sell to replace more traditional compute clusters like the thousands of custom units companies like Google and Amazon use to train their AIs.

“We knew in principle the tech should be great, but there were a lot of details we needed to figure out,” CEO and co-founder Nick Harris told TechCrunch in an interview. “Lots of hard theoretical computer science and chip design challenges we needed to overcome… and COVID was a beast.”

With suppliers out of commission and much of the industry pausing partnerships and delaying projects, the pandemic put Lightmatter months behind schedule, but the company came out the other side stronger. Harris said that the challenges of building a chip company from the ground up were substantial, if not unexpected.

A rack of Lightmatter servers.

Image Credits: Lightmatter

“In general what we’re doing is pretty crazy,” he admitted. “We’re building computers from nothing. We design the chip, the chip package, the card the chip package sits on, the system the cards go in, and the software that runs on it…. we’ve had to build a company that straddles all this expertise.”

That company has grown from its handful of founders to more than 70 employees in Mountain View and Boston, and the growth will continue as it brings its new product to market.

Where a few years ago Lightmatter’s product was more of a well-informed twinkle in the eye, it has now taken a more solid form in the Envise, which the company calls a "general-purpose photonic AI accelerator." It’s a server unit designed to fit into normal datacenter racks but equipped with multiple photonic computing units, which can perform neural network inference processes at mind-boggling speeds. (It’s limited to certain types of calculations, namely linear algebra for now, and not complex logic, but this type of math happens to be a major component of machine learning processes.)
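Lightmatter hasn't published Envise's programming model, but a plain numpy example can show why "just linear algebra" covers so much of inference: each dense layer of a neural network is essentially a matrix-vector product plus a simple nonlinearity, and the matrix-vector product is the part a photonic accelerator would take over. The tiny network below is arbitrary and purely illustrative.

```python
# Illustrative only: why inference workloads are dominated by linear algebra.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(128, 64)), rng.normal(size=128)   # layer 1 weights/bias
W2, b2 = rng.normal(size=(10, 128)), rng.normal(size=10)    # layer 2 weights/bias

def infer(x: np.ndarray) -> np.ndarray:
    """Two-layer network: the W @ x products are the linear-algebra workload."""
    h = np.maximum(W1 @ x + b1, 0.0)  # matrix-vector product + ReLU
    return W2 @ h + b2                # matrix-vector product

x = rng.normal(size=64)
print(infer(x).shape)  # (10,)
```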

Harris was reluctant to provide exact numbers on performance improvements, but more because those improvements keep increasing than because they aren’t impressive enough. The website suggests it’s 5x faster than an NVIDIA A100 unit on a large transformer model like BERT, while using about 15 percent of the energy. That makes the platform doubly attractive to deep-pocketed AI giants like Google and Amazon, which constantly require more computing power and pay through the nose for the energy needed to use it. Either better performance or lower energy cost would be great; both together is irresistible.
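The back-of-the-envelope arithmetic behind "doubly attractive" is simple: the figures cited above (5x the speed at roughly 15 percent of the energy) work out to about 33 times as much work per unit of energy. These are the figures quoted above, not independently verified benchmarks.

```python
# Rough performance-per-energy calculation from the figures cited in the article.
speedup = 5.0           # relative throughput vs. the comparison GPU
energy_fraction = 0.15  # energy used relative to the comparison GPU

perf_per_energy = speedup / energy_fraction
print(f"~{perf_per_energy:.0f}x more work per unit of energy")  # ~33x
```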

Lightmatter’s initial plan is to test these units with its most likely customers by the end of 2021, refining them and bringing them up to production levels so they can be sold widely. But Harris emphasized that this is essentially the Model T of the company’s new approach.

“If we’re right, we just invented the next transistor,” he said, and for the purposes of large-scale computing, the claim is not without merit. You’re not going to have a miniature photonic computer in your hand any time soon, but in datacenters, where as much as 10 percent of the world’s power is predicted to go by 2030, “they really have unlimited appetite.”

The color of math

A Lightmatter chip with its logo on the side.

Image Credits: Lightmatter

There are two main ways by which Lightmatter plans to improve the capabilities of its photonic computers. The first, and most insane-sounding, is processing in different colors.

It’s not so wild when you think about how these computers actually work. Transistors, which have been at the heart of computing for decades, use electricity to perform logic operations, opening and closing gates and so on. At a macro scale you can have different frequencies of electricity that can be manipulated like waveforms, but at this smaller scale it doesn’t work like that. You just have one form of currency, electrons, and gates are either open or closed.

In Lightmatter’s devices, however, light passes through waveguides that perform the calculations as it goes, simplifying (in some ways) and speeding up the process. And light, as we all learned in science class, comes in a variety of wavelengths — all of which can be used independently and simultaneously on the same hardware.

The same optical magic that lets a signal sent from a blue laser be processed at the speed of light works for a red or a green laser with minimal modification. And if the light waves don’t interfere with one another, they can travel through the same optical components at the same time without losing any coherence.

That means that if a Lightmatter chip can do, say, a million calculations a second using a red laser source, adding another color doubles that to two million, adding another makes three — with very little in the way of modification needed. The chief obstacle is getting lasers that are up to the task, Harris said. Being able to take roughly the same hardware and near-instantly double, triple, or 20x the performance makes for a nice roadmap.
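Written out, the scaling argument is simple: if one laser color sustains some base rate, N non-interfering colors running through the same hardware multiply throughput by roughly N. The base rate below is the article's own hypothetical million calculations per second, not a measured Lightmatter figure.

```python
# Throughput scaling with the number of wavelengths, per the argument above.
base_rate = 1_000_000  # calculations per second with a single color (illustrative)

for n_colors in (1, 2, 3, 20):
    print(f"{n_colors:>2} color(s): ~{n_colors * base_rate:,.0f} calculations/second")
```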

It also leads to the second challenge the company is working on clearing away, namely interconnect. Any supercomputer is composed of many small individual computers, thousands and thousands of them, working in perfect synchrony. In order for them to do so, they need to communicate constantly to make sure each core knows what other cores are doing, and otherwise coordinate the immensely complex computing problems supercomputing is designed to take on. (Intel talks about this “concurrency” problem building an exa-scale supercomputer here.)

“One of the things we’ve learned along the way is, how do you get these chips to talk to each other when they get to the point where they’re so fast that they’re just sitting there waiting most of the time?” said Harris. The Lightmatter chips are doing work so quickly that they can’t rely on traditional computing cores to coordinate between them.

A photonic problem, it seems, requires a photonic solution: a wafer-scale interconnect board that uses waveguides instead of fiber optics to transfer data between the different cores. Fiber connections aren’t exactly slow, of course, but they aren’t infinitely fast, and the fibers themselves are fairly bulky at the scales at which chips are designed, limiting the number of channels you can have between cores.

“We built the optics, the waveguides, into the chip itself; we can fit 40 waveguides into the space of a single optical fiber,” said Harris. “That means you have way more lanes operating in parallel — it gets you to absurdly high interconnect speeds.” (Chip and server fiends can find the specs here.)
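The lane-count arithmetic behind that claim is straightforward: if 40 on-chip waveguides fit in the footprint of one optical fiber, the same cross-section carries 40 times as many parallel lanes. Per-lane speeds aren't public, so the sketch below only counts lanes, not bandwidth.

```python
# Counting parallel lanes per unit of cross-section, using the figure quoted above.
waveguides_per_fiber_footprint = 40  # Harris's figure

for fiber_equivalents in (1, 4, 16):
    lanes = fiber_equivalents * waveguides_per_fiber_footprint
    print(f"Footprint of {fiber_equivalents:>2} fiber(s): {lanes} parallel waveguide lanes")
```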

The optical interconnect board is called Passage and will be part of a future generation of Envise products. Like the multi-color processing, it is further down the roadmap; for now, 5-10x performance at a fraction of the power will have to satisfy potential customers.

Putting that $80M to work

Those customers, initially the “hyper-scale” data handlers that already own datacenters and supercomputers that they’re maxing out, will be getting the first test chips later this year. That’s where the B round is primarily going, Harris said: “We’re funding our early access program.”

That means both building hardware to ship (very expensive per unit before economies of scale kick in, not to mention the present difficulties with suppliers) and building the go-to-market team. Servicing, support, and the immense amount of software that goes along with something like this — there’s a lot of hiring going on.

The round itself was led by Viking Global Investors, with participation from HP Enterprise, Lockheed Martin, SIP Global Partners, and previous investors GV, Matrix Partners and Spark Capital. It brings the company’s total raised to about $113 million: there was the initial $11M A round, then GV hopping on with a $22M A-1, and now this $80M.

Although there are other companies pursuing photonic computing and its potential applications in neural networks especially, Harris didn’t seem to feel that they were nipping at Lightmatter’s heels. Few if any seem close to shipping a product, and at any rate this is a market in the middle of its hockey-stick moment. He pointed to an OpenAI study indicating that demand for AI-related computing is increasing far faster than existing technology can provide it, short of building ever-larger datacenters.

The next decade will bring economic and political pressure to rein in that power consumption, just as we’ve seen with the cryptocurrency world, and Lightmatter is poised and ready to provide an efficient, powerful alternative to the usual GPU-based fare.

As Harris suggested earlier, what his company has made is potentially transformative for the industry, and if so there’s no hurry: if there’s a gold rush, they’ve already staked their claim.
