

Data was the new oil, until the oil caught fire


We’ve been hearing how “data is the new oil” for more than a decade now, and in certain sectors, it’s a maxim that has more than panned out. From marketing and logistics to finance and product, decision-making is now dominated by data at all levels of most big private orgs (and if it isn’t, I’d be getting a résumé put together, stat).

So it might be something of a surprise to learn that data, which could transform how we respond to the increasingly deadly disasters that regularly plague us, has been all but absent from much of emergency response this past decade. Far from being a geyser of digital oil, disaster response agencies and private organizations alike have for years tried to expand the scope and scale of the data feeding into disaster response, with relatively meager results.

That’s starting to change though, mostly thanks to the internet of things (IoT), and frontline crisis managers today increasingly have the data they need to make better decisions across the resilience, response, and recovery cycle. The best is yet to come — with drones taking to the skies, richer simulated visualizations, and artificial intelligence entering the mix, what we’re seeing today on the frontlines is only the beginning of what could be a revolution in disaster response in the 2020s.

The long-awaited disaster data deluge has finally arrived

Emergency response is a fight against the fog of war and the dreadful ticking of the clock. In the midst of a wildfire or hurricane, everything can change in a matter of seconds — even milliseconds if you aren’t paying attention. Safe roads ferrying evacuees can suddenly become impassable infernos, evacuation teams can reposition and find themselves spread far too thin, and unforeseen conditions can rapidly metastasize to cover the entire operating environment. An operations center that once had perfect information can quickly find it has no ground truth at all.

Unfortunately, even getting raw data on what’s happening before and during a disaster can be extraordinarily difficult. When we look at the data revolution in business, part of the early success stems from the fact that companies were always heavily reliant on data to handle their activities. Digitalization was and is the key word: moving from paper to computers in order to transform latent raw data into a form that was machine-readable and therefore analyzable. In business, the last ten years were basically an upgrade from version one to version two.

In emergency management, however, many agencies are stuck without a version at all. Take a flood — where is the water and where is it going? Until recently, there was no comprehensive data on where waters rose from and where they sloshed to. When it came to wildfires, there were no administrative datasets on where every tree in the world was located and how prone each was to fire. Even human infrastructure like power lines and cell towers often had little interface with the digital world. They stood there, and if you couldn’t see them, they couldn’t see you.

Flood modeling is on the cutting edge of disaster planning and response. Image Credits: CHANDAN KHANNA/AFP via Getty Images

Models, simulations, predictions, analysis: all of these are useless without raw data, and in the disaster response realm, there was no detailed data to be found.

After years of promises of an IoT revolution, things are finally internet-izing, with sensors increasingly larding up the American and global landscape. Temperature, atmospheric pressure, water level, humidity, pollution, power, and other sensors have been widely deployed, emitting constant streams of data back into data warehouses ready for analysis.

Take wildfires in the American West. It wasn’t all that long ago that the U.S. federal government and state firefighting agencies had no knowledge of where a blaze was taking place. Firefighting has been “100 years of tradition unimpeded by progress,” as Tom Harbour, head of fire response for a decade at the U.S. Forest Service and now chief fire officer at Cornea, put it.

And he’s right. After all, firefighting is a visceral activity — responders can see the fires, even feel the burning heat echoing off of their flesh. Data wasn’t useful, particularly in the West where there are millions of acres of land and large swaths are sparsely populated. Massive conflagrations could be detected by satellites, but smoldering fires in the brush would be entirely invisible to the geospatial authorities. There’s smoke over California — exactly what is a firefighter on the ground supposed to do with such valuable information?

Today after a decade of speculative promise, IoT sensors are starting to clear a huge part of this fog. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who researches community resilience, said that air quality sensors have become ubiquitous since they are “very cheap [and] pretty easy to use” and can offer very fine-grained understandings of pollution — a key signal, for instance, of wildfires. He pointed to the company Purple Air, which in addition to making sensors, also produces a popular consumer map of air quality, as indicative of the potential these days for technology.

Maps are the critical intersection for data in disasters. Geospatial information systems (GIS) form the basis for most planning and response teams, and no company has a larger footprint in the sector than privately-held Esri. Ryan Lanclos, who leads public safety solutions at the company, pointed to the huge expansion of water sensors as radically changing responses to certain disasters. “Flood sensors are always pulsing,” he said, and with a “national water model coming out of the federal government,” researchers can now predict through GIS analysis how a flood will affect different communities with a precision unheard of previously.

Digital maps and GIS systems are increasingly vital for disaster planning and response, but paper still remains quite ubiquitous. Image Credits: Paul Kitagaki Jr.-Pool/Getty Images

Cory Davis, the director of public safety strategy and crisis response at Verizon (which, through our parent company Verizon Media, is TechCrunch’s ultimate owner), said that all of these sensors have transformed how crews work to maintain infrastructure as well. “Think like a utility that is able to put a sensor on a power line — now they have sensors and get out there quicker, resolve it, and get the power back up.”

He noted one major development that has transformed sensors in this space the last few years: battery life. Thanks to continuous improvements in ultra-low-power wireless chips as well as better batteries and energy management systems, sensors can last a really long time in the wilderness without the need for maintenance. “Now we have devices that have ten-year battery lives,” he said. That’s critical, because it can be impossible to connect these sensors to the power grid in frontier areas.

The same line of thinking holds true at T-Mobile as well. When it comes to preventative planning, Jay Naillon, senior director of national technology service operations strategy at the telco, said that “the type of data that is becoming more and more valuable for us is the storm surge data — it can make it easier to know we have the right assets in place.” That data comes from flood sensors that can offer real-time warning signals to planners across the country.
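To make the idea of a sensor-driven warning signal concrete, here is a minimal sketch of how such logic might look. Everything here — the function name, the thresholds, and the units — is a hypothetical illustration, not any vendor's actual alerting logic:

```python
# Hypothetical sketch: raise a flood warning from a stream of water-level
# readings, either on an absolute threshold or on a fast rate of rise.
def flood_warning(levels_cm, threshold_cm=300.0, rise_cm_per_reading=15.0):
    """Return True if the latest reading breaches the absolute threshold,
    or if the level rose too quickly between consecutive readings."""
    if not levels_cm:
        return False
    if levels_cm[-1] >= threshold_cm:
        return True
    if len(levels_cm) >= 2 and levels_cm[-1] - levels_cm[-2] >= rise_cm_per_reading:
        return True
    return False
```

The rate-of-rise check matters as much as the absolute level: a river climbing quickly from a low baseline can be more actionable for planners than a river sitting steadily near its banks.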

Telecom interest — and commercial interest in general — has been critical to accelerating the adoption of sensors and other data streams around disasters. While governments may be the logical end user of flood or wildfire data, they aren’t the only ones interested in this visibility. “A lot of consumers of that information are in the private sector,” said Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University. “These new types of risks, like climate change, are going to affect their bottom lines,” and he pointed to bond ratings, insurance underwriting and other areas where commercial interest in sensor data has been profound.

Sensors may not literally be ubiquitous, but they have opened a window onto conditions that emergency managers never had visibility into before.

Finally, there are the extensive datasets around mobile usage that have become available throughout much of the world. Facebook’s Data for Good project, for instance, provides data layers around connectivity — are users connecting from one place and then later connecting from a different location, indicating displacement? That sort of data from the company and from telcos themselves can help emergency planners scout out how populations are shifting in real time.
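As a toy sketch of how displacement might be inferred from that kind of connectivity data — with the distance threshold, data shapes, and function names all illustrative assumptions, not Facebook's or any telco's actual methodology:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def count_displaced(pre_event, post_event, min_km=20.0):
    """pre_event / post_event map an anonymized user id to the (lat, lon)
    of that user's typical connection point before / after the event.
    Counts users seen in both windows who moved at least min_km."""
    return sum(
        1
        for uid, home in pre_event.items()
        if uid in post_event and haversine_km(home, post_event[uid]) >= min_km
    )
```

In practice these aggregates would be computed over anonymized, binned data rather than individual users, but the core signal is the same: connections that move and stay moved suggest displacement.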

Data, data, on the wall — how many AIs can they call?

Rivulets of data have now turned into floods of information, but just like floodwaters rising in cities across the world, the data deluge now needs a response all its own. In business, the surfeit of big data has been wrangled with an IT stack from data warehouses all the way to business intelligence tools.

If only data for disasters could be processed so easily. Data relevant for disasters is held by dozens of different organizations spanning the private, public, and non-profit sectors, leading to huge interoperability problems. Even when the data can be harmonized, there are large challenges in summarizing the findings down to an actual decision a frontline responder can use in their work — making AI a tough sale still today, particularly outside of planning. As Davis of Verizon put it, “now that they have this plethora of data, a lot of cities and federal agencies are struggling with how to use it.”

Unfortunately, standardization is a challenge at all scales. Globally, countries mostly lack interoperability, although standards are improving over time. Amir Elichai, the founder and CEO of 911 call-handling platform Carbyne, said that “from a technology standpoint and a standards standpoint, there is a big difference between countries,” noting that protocols from one country often have to be completely rewritten to serve a different market.

Tom Cotter, director of emergency response and preparedness at health care disaster response organization Project HOPE, said that even setting up communications between responders can be challenging in an international environment. “Some countries allow certain platforms but not others, and it is constantly changing,” he said. “I basically have every single technology communication platform you can possibly have in one place.”

One senior federal emergency management official acknowledged that data portability has become increasingly key in procurement contracts for technology, with the government recognizing the need to buy commercially-available software rather than custom-designed software. That message has been picked up by companies like Esri, with Lanclos stating that “part of our core mission is to be open and … create data and to share that openly to the public or securely through open standards.”

For all its downsides though, the lack of interoperability can be ironically helpful for innovation. Elichai said that the “lack of standards is an advantage — you are not buying into a legacy standard,” and in some contexts where standards are lacking, quality protocols can be built with the assumption of a modern data workflow.

Even with interoperability though, the next challenge becomes data sanitation — and disaster data is dirty as … well, something. While sensor streams can be verified and cross-checked with other datasets, in recent years there has been a heavy increase in the quantity of citizen-submitted information that has to be carefully vetted before it is disseminated to first responders or the public.

With citizens having more access to smartphones than ever, emergency planners have to sanitize uploaded data in order to verify it and make it useful. Image Credits: TONY KARUMBA/AFP via Getty Images

Bailey Farren, CEO and co-founder of disaster communications platform Perimeter, said that “sometimes citizens have the most accurate and real-time information, before first responders show up — we want citizens to share that with …government officials.” The challenge is how to filter the quality goods from the unhelpful or malicious. Raj Kamachee, the CIO of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that verification is critical, and it’s a key element of the infrastructure he has built at the organization since joining in 2017. “We’ve gotten more people using it so more feedback [and] more data [is] coming through the pipes,” he said. “So creating a self-service, a very collaborative approach.”
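One simple way to think about that filtering step is corroboration: a citizen report only reaches responders once enough independent nearby reports agree with it. The sketch below is a hypothetical illustration — the thresholds, field names, and planar coordinates are all assumptions, not Perimeter's or Team Rubicon's actual pipeline:

```python
from math import hypot

def corroborated(report, others, radius_km=2.0, min_confirms=2):
    """Treat a citizen report as corroborated once at least min_confirms
    other reports of the same hazard type fall within radius_km.
    Reports are dicts with a 'type' plus planar 'x'/'y' coordinates in km."""
    confirms = sum(
        1
        for o in others
        if o["type"] == report["type"]
        and hypot(o["x"] - report["x"], o["y"] - report["y"]) <= radius_km
    )
    return confirms >= min_confirms
```

A real system would also weigh reporter history and cross-check against sensor streams, but even this crude rule filters a lone malicious submission from a cluster of genuine ones.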

With quality and quantity, the AI models should come, right? Well, yes and no.

Sury of Columbia wants to cool down at least some of the hype around AI. “The big caveat with all of these machine learning and big data applications is that they are not a panacea — they are able to process a lot of disparate information, [but] they’re certainly not going to tell us exactly what to do,” he said. “First responders are already processing a lot of information,” and they don’t necessarily need more guidance.

Instead, AI in disasters is increasingly focused on planning and resilience. Sury pointed to OneConcern, a resiliency planning platform, as one example of how data and AI can be combined in the disaster planning process. He also pointed to the CDC’s Social Vulnerability Index and risk tools from FEMA that integrate different data signals into scalar values used by emergency planners to optimize their contingency plans.
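To make "integrating signals into scalar values" concrete, here is a minimal sketch of a weighted composite index. The signal names, weights, and clamped min-max normalization are illustrative assumptions, not the SVI's or FEMA's actual methodology:

```python
def risk_score(signals, weights, bounds):
    """Fold raw risk signals into a single 0..1 scalar (higher = more
    vulnerable) via weighted, clamped min-max normalization.
    signals: name -> raw value; bounds: name -> (lo, hi); weights: name -> weight."""
    total_w = sum(weights.values())
    score = 0.0
    for name, w in weights.items():
        lo, hi = bounds[name]
        norm = (signals[name] - lo) / (hi - lo)
        score += w * min(max(norm, 0.0), 1.0)  # clamp outliers into [0, 1]
    return score / total_w
```

The appeal for planners is that a single comparable number per census tract or neighborhood makes it easy to rank where contingency resources should go first.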

Yet, almost everyone I talked to was much more hesitant about the power of AI. As I discussed a bit in part one of this series regarding the disaster sales cycle, data tools have to be real-time and perfect every time given the lives that are on the line. Kamachee of Team Rubicon noted that when choosing tools, he avoids whiz-bang features and instead looks at the pure utility of individual vendors. “We go high tech, but we prepare for low tech,” he said, emphasizing that in disaster response, everything must be agile and adaptable to changing circumstances.

Elichai of Carbyne saw this pattern in his sales. There’s a “sensitivity in our market and the reluctance from time to time to adopt” new technologies he said, but acknowledged that “there is no doubt that AI at a certain point will provide benefits.”

Naillon of T-Mobile had similar views from the operator perspective, saying that “I can’t say that we really leverage AI very much” in the company’s disaster planning. Instead of AI as brain, the telecom company simply uses data and forecast modeling to optimally position equipment — no fancy GANs required.

Outside of planning, AI has helped in post-disaster recovery, specifically around damage assessments. After a crisis transpires, assessments of infrastructure and private property have to be made in order for insurance claims to be filed and for a community to move forward. Art delaCruz, COO and president of Team Rubicon, noted that technology and a flurry of AI tools have helped significantly with damage assessments. Since his organization often helps rebuild communities in the course of its work, triaging damage is a critical element of its effective response strategy.

There’s a brighter future, other than that brightness from the sun that is going to burn us to a crisp, right?

So AI today is helping a bit with resilience planning and disaster recovery and not so much during emergency response itself, but there is certainly more to come across the entire cycle. Indeed, there is a lot of excitement about the future of drones, which are increasingly being used in the field, but there are concerns long term about whether AI and data will ultimately cause more problems than they solve.

Drones would seem to have an obvious value for disaster response, and indeed, they have been used by teams to get additional aerial footage and context where direct access by responders is limited. Kamachee of Team Rubicon noted that in the Bahamas on a mission, response teams used drones to detect survivors, since major roads were blocked. The drones snapped images that were processed using AI, and helped the team to identify those survivors for evacuation. He described drones and their potential as “sexy; very, very cool.”

Aerial views from drones can give disaster response teams much better real-time information, particularly in areas where on-the-ground access is limited. Image Credits: Mario Tama/Getty Images

Cotter of Project HOPE similarly noted that faster data processing translates to better responses. “Ultimately speed is what saves lives in these disasters,” he said. We’re “also able to manage more responses remotely [and] don’t have to send as many people downrange,” giving response teams more leverage in resource-constrained environments.

“I see more emergency management agencies using drone technology — search and rescue, aerial photography,” Davis of Verizon said, arguing that operators often have a mentality of “send a machine into a situation first.” He continued: “artificial intelligence is going to continue to get better and better and better [and] enable our first responders to respond more effectively, but also more efficiently and safer.”

With data flooding in from sensors and drones, and processed and verified better than ever, disaster response can improve, perhaps even fast enough to keep pace with Mother Nature’s increasingly deadly whims. Yet there is one caveat: will the AI algorithms themselves cause new problems in the future?

Clark-Ginsberg of RAND, perhaps supplying that typical RANDian alternatives analysis, said that these solutions can also create problems themselves: “technological risks leading to disaster and the world of technology facilitating disaster.” These systems can break, they can make mistakes, and more ominously, they can be sabotaged to increase chaos and damage.

Bob Kerrey, a co-chair of the 9/11 Commission, former senator and governor of Nebraska, and currently the board chairman of Risk & Return, a disaster response VC fund and philanthropy I profiled recently, pointed to cybersecurity as increasingly a wild card in many responses. “There wasn’t a concept called zero days — let alone a market for zero days — in 2004 [when the 9/11 Commission was doing its work], and now there is.” With the 9/11 terrorist attacks, “they had to come here, they had to hijack planes … now you don’t need to hijack planes to damage the United States,” noting that hackers “can be sitting with a bunch of other guys in Moscow, in Tehran, in China, or even your mother’s basement.”

Data is a revolution in the making for disaster response, but it may well cause a whole second-order set of problems that didn’t exist before. What data giveth, data taketh away. The oil gushes, but then the well suddenly runs dry – or simply catches fire.




Lyron Foster is a Hawaii based African American Musician, Author, Actor, Blogger, Filmmaker, Philanthropist and Multinational Serial Tech Entrepreneur.


Brazil’s Divibank raises millions to become the Clearbanc of LatAm


Divibank, a financing platform offering LatAm businesses access to growth capital, has closed on a $3.6 million round of seed funding led by San Francisco-based Better Tomorrow Ventures (BTV).

São Paulo-based Divibank was founded in March 2020, right as the COVID-19 pandemic was starting. The company has built a data-driven financing platform aimed at giving businesses access to non-dilutive capital to finance their growth via revenue-share financing.

“We are changing the way entrepreneurs scale their online businesses by providing quick and affordable capital to startups and SMEs in Latin America,” said co-founder and CEO Jaime Taboada. In particular, Divibank is targeting e-commerce and SaaS companies although it also counts edtechs, fintechs and marketplaces among its clients.

The company is now also offering marketing analytics software for its clients so they can “get more value out of the capital they receive.”

A slew of other investors participated in the round, including existing backer MAYA Capital and new investors such as Village Global, Clocktower Ventures, Magma Partners, Gilgamesh Ventures, Rally Cap Ventures and Alumni Ventures Group. A group of high-profile angel investors also put money in the round, including Rappi founder and president Sebastian Mejia, Tayo Oviosu (founder/CEO of Paga, who participated via Kairos Angels), Ramp founder and CTO Karim Atiyeh and Bread founders Josh Abramowitz and Daniel Simon.

In just over a year’s time, Divibank has seen some impressive growth (albeit from a small base). In the past six months alone, the company said it has signed on over 50 new clients; seen its total loan issuance volume increase by 7x; revenues climb by 5x; customer base increase by 11x and employee base by 4x. Customers include Dr. Jones, CapaCard and Foodz, among others.

“Traditional banks and financial institutions do not know how to evaluate internet businesses, so they generally do not offer loans to these companies. If they do, it is generally a long and tedious process at a very high cost,” Taboada said. “With our revenue-share offering, the entrepreneur does not have to pledge his home, drown in credit card debts or even give up his equity to invest in marketing and growth.”

For now, Divibank is focused on Brazil, considering the country is huge and has more than 11 million SMEs “with many growth opportunities to explore,” according to Taboada. It’s looking to expand to the rest of LatAm and other emerging markets in the future, but no timeline has yet been set.

As in many other sectors, the COVID-19 pandemic served as a tailwind to Divibank’s business, considering it accelerated the digitalization of everything globally.

“We founded Divibank the same week as the lockdown started in Brazil, and we saw many industries that didn’t traditionally advertise online migrate to Google and Facebook Ads rapidly,” Taboada told TechCrunch. “This obviously helped our thesis a lot, as many of our clients had actually recently gone from only selling offline to selling mostly online. And there’s no better way to attract new clients online than with digital ads.”

Divibank will use its new capital to accelerate its product roadmap, scale its go-to-market strategy and ramp up hiring. Specifically, it will invest more aggressively in engineering/tech, sales, marketing, credit risk and operations. Today the team consists of eight employees in Brazil, and that number will likely grow to more than 25 or 30 in the coming 12 months, according to Taboada.

The startup is also developing what it describes as “value additive” software, aimed at helping clients better manage their digital ads campaigns and “optimize their investment returns.”

Looking ahead, Divibank is working on a few additional financial products for its clients, targeting the more than $205 billion e-commerce and SaaS markets in Latin America with offerings such as inventory financing and recurring revenue securitizations. Specifically, it plans to continue developing its banking tech platform by “automating the whole credit process,” developing its analytics platform and building its data science/ML capabilities to improve its credit model.

Jake Gibson, general partner at Better Tomorrow Ventures, noted that his firm is also an investor in Clearbanc, which also provided non-dilutive financing for founders. The company’s “20-minute term sheet” product, perhaps its most well-known in tech, allowed e-commerce companies to raise non-dilutive marketing growth capital of between $10,000 and $10 million based on their revenue and ad spend.

“We are very bullish on the idea that not every company should be funded with venture dollars, and that lack of funding options can keep too many would-be entrepreneurs out of the market,” he said. “Combine that with the growth of e-commerce in Brazil and LatAm, and expected acceleration fueled by COVID, and the opportunity to build something meaningful seemed obvious.”

Also, since there aren’t a lot of similar offerings in the region, Better Tomorrow views the space that Divibank is addressing as a “massive untapped market.”

Besides Clearbanc, Divibank is also similar to another U.S.-based fintech, Pipe, in that both companies aim to help clients with SaaS, subscription and other recurring revenue models with new types of financings that can help them grow without dilution.

“Like the e-commerce market, we see the SaaS, and the recurring revenues markets in general, growing rapidly,” Taboada said.


Orbite offers a five-star ‘space camp’ for would-be space travelers


As private companies like Axiom Space, Blue Origin, Virgin Galactic and SpaceX prepare to ferry private customers to the stars, a whole new market is opening up to train affluent would-be travelers for their future missions. Case in point: space training company Orbite, whose goal is to combine aeronautics and five-star hospitality in its inaugural astronaut training program.

“We’re going to have hundreds, if not thousands of people this decade of the 2020s, who will go to space, but you just don’t get off the couch and strap into a rocket […] you actually have to get mentally prepared, physically prepared, and also spiritually prepared for this out-of-this-world journey,” co-founder Jason Andrews told TechCrunch. “And that’s really our role.”

Orbite (the French word for ‘orbit,’ pronounced or-beet) was founded by space and hospitality industry veterans Andrews and Nicolas Gaume. Andrews is an aerospace entrepreneur who founded Spaceflight and BlackSky, while Gaume, a software and game development entrepreneur, sits on the board of his family’s resort and hotel business Groupe Gaume. Last year, Gaume’s business Space Cargo Unlimited shipped a dozen bottles of wine to the International Space Station. They were later retrieved. (When asked how the wine tasted, Gaume told TechCrunch, “It’s a unique product.”)

The program will be led by Brienna Rommes, who previously worked as the director of space training and research at the National Aerospace Training and Research Center. Rommes has trained over 600 people to prepare for spaceflight, including Sir Richard Branson, Orbite said.

Led by Rommes, the program aims to prepare travelers who are determined to reach space, but Andrews also said Orbite can help customers “try before they buy” — giving those unsure whether they’d actually want to board a launch vehicle a taste of spaceflight. This seems to be Orbite’s main value proposition: a general overview of space travel across different companies. It will also be competing to a degree with the native (and mandatory) training programs of individual private launch companies, which are purpose-built to prepare customers for their flights.

Costs remain prohibitively high for the average spacefarer: it’s been reported that a ticket on Axiom’s inaugural commercial launch to the International Space Station costs upwards of $55 million. Orbite’s premium training program comes in at $29,500 per person for the three-day, four-night stay.

In keeping with the premium price tag, the four training program sessions scheduled through the remainder of 2021 will be held at luxury resorts: the Four Seasons Resort in Orlando, Florida, and Hôtel La Co(o)rniche in Pyla-sur-Mer, France. The latter hotel is owned by Groupe Gaume.

Would-be space travelers will be able to experience up to 5 Gs by taking a ride on a high-performance aircraft, as well as simulated zero-gravity. To prepare customers mentally and even spiritually, the training program itinerary includes meditation training, a workshop on stress and anxiety management, and individual coaching with staff “to explore personal goals for space, thoughts and assess possible flight options,” the company said. The itinerary also includes virtual reality mission experiences and a ‘Michelin star’ space food tasting.

“We really want to make sure we bridge the gap with more of a sensorial, psychological, even spiritual preparation for the trip,” Gaume said.

The company’s long-term vision is building and operating many training facilities around the world. The first facility will open in 2023 or 2024, though Andrews and Gaume are not yet sharing where it will be located. They did say that the dedicated training facility will offer a range of packages, with some as short as single-day experiences. They will also offer accommodation and hospitality, potentially for the long term — weeks or even months — depending on whether we reach a stage in human space travel where we’re sending private citizens to the Moon or even Mars.


Treasury Prime raises $20M to scale its banking-as-a-service biz


This morning Treasury Prime, a banking-as-a-service startup that delivers its product via APIs, announced that it has closed a $20 million Series B. The capital comes around a year since the startup announced its Series A, and around 1.5 years since it raised its preceding round.

For Treasury Prime, the new capital was an internal affair, with prior investors stepping up to lead its new round of funding. Deciens Capital and QED Investors co-led the round, with Susa Ventures and SaaStr Fund also putting cash into the transaction.

As is increasingly common among insider-led fundraises in recent years, the startup in question was not in dire need of new funding before the new investment came together. In fact, Treasury Prime CEO Chris Dean told TechCrunch in an interview that his firm is “super capital efficient,” adding that it had not tucked into its Series A capital until January of this year.

So, why raise more funds now? To invest aggressively in its business. That plan is cliché for a startup raising new funding, but in the case of Treasury Prime the move isn’t in anticipation of future demand. Dean told TechCrunch that his startup had run into a bottleneck in which it could only take on so much new customer volume. That’s no good for a startup in a competitive sector, so picking up its spend in early 2021 and raising new capital in mid-2021 makes sense, as it could help it hire, and absorb more demand, more quickly.

And for Treasury Prime’s preceding backers, the chance to put more capital into a startup that was dealing with more demand than capacity likely wasn’t too hard a choice. Dean added that to make sure the round’s price was market-reasonable, he pitched around 10 venture capital firms, got three term sheets, and then went with his preceding investor group; if any VC reading this is irked by the move, this is the founder equivalent of private-market investors asking founders to come back to them after they find a lead.

But with the banking-as-a-service market growing, thanks to entrants like Stripe showing up in recent quarters, how does Treasury Prime expect to stay towards the front of its fintech niche? Per Dean, by bringing together banks that want fintech deal volume, and fintechs who need both technology and eventual banking partners. By courting both sides of its market, Treasury Prime hopes to be well-situated for long-term growth.

And its CEO is bullish on the scale of his market.

If you imagine the banking-as-a-service market as merely neobanks, he explained, it’s not that big. But his startup expects the number of companies that want to offer their customers the sort of banking capabilities that Treasury Prime and some competitors can offer will be broad. How broad? The best way I can summarize the company’s argument is that, a bit like how vertical SaaS has proven that building software for particular industries can be big business, Treasury Prime expects that banking tools will also be built for similar business categories. Vertical banking, perhaps, integrated into other services.

And it wants to be there, offering the back-end tech, and access to banks that the companies building those services will need.

Fintech is a big and expensive market, and Treasury Prime isn’t busy raising nine-figure rounds — yet, at least. According to PitchBook data, Treasury Prime was valued at just over $40 million at the time of its Series A; the company’s new valuation was presumably higher, though how much is not yet clear.

Let’s see how far it can get with $20 million more as it sheds some of its frugal DNA and looks to burn a little faster.
