Saturday, March 31, 2018

Red Hat looks beyond Linux

The Red Hat Linux distribution is turning 25 years old this week. What started as one of the earliest Linux distributions is now the most successful open-source company, and its success was a catalyst for others to follow its model. Today’s open-source world is very different from those heady days in the mid-1990s when Linux looked to be challenging Microsoft’s dominance on the desktop, but Red Hat is still going strong.

To put all of this into perspective, I sat down with the company’s current CEO (and former Delta Air Lines COO) Jim Whitehurst to talk about the past, present and future of the company, and open-source software in general. Whitehurst took the Red Hat CEO position 10 years ago, so while he wasn’t there in the earliest days, he definitely witnessed the evolution of open source in the enterprise, which is now more widespread than ever.

“Ten years ago, open source at the time was really focused on offering viable alternatives to traditional software,” he told me. “We were selling layers of technology to replace existing technology. […] At the time, it was open source showing that we can build open-source tech at lower cost. The value proposition was that it was cheaper.”

At the time, he argues, the market was about replacing Windows with Linux or IBM’s WebSphere with JBoss. And that defined Red Hat’s role in the ecosystem, too, which was less about technological innovation than about packaging. “For Red Hat, we started off taking these open-source projects and making them usable for traditional enterprises,” said Whitehurst.

Jim Whitehurst, Red Hat president and CEO (photo by Joan Cros/NurPhoto via Getty Images)

About five or six years ago, something changed, though. Large corporations, including Google and Facebook, started open sourcing their own projects because they didn’t look at some of the infrastructure technologies they opened up as competitive advantages. Instead, having them out in the open allowed them to profit from the ecosystems that formed around them. “The biggest part is it’s not just Google and Facebook finding religion,” said Whitehurst. “The social tech around open source made it easy to make projects happen. Companies got credit for that.”

He also noted that developers now look at their open-source contributions as part of their resumé. With an increasingly mobile workforce that regularly moves between jobs, companies that want to compete for talent are almost forced to open source at least some of the technologies that don’t give them a competitive advantage.

As the open-source ecosystem evolved, so did Red Hat. As enterprises started to understand the value of open source (and stopped being afraid of it), Red Hat shifted from simply talking to potential customers about savings to how open source can help them drive innovation. “We’ve gone from being commoditizers to being innovators. The tech we are driving is now driving net new innovation,” explained Whitehurst. “We are now not going in to talk about saving money but to help drive innovation inside a company.”

Over the last few years, that included making acquisitions to help drive this innovation. In 2015, Red Hat bought IT automation service Ansible, for example, and last month, the company closed its acquisition of CoreOS, one of the larger independent players in the Kubernetes container ecosystem — all while staying true to its open-source roots.

There is only so much innovation you can do around a Linux distribution, though, and as a public company, Red Hat also had to look beyond that core business and build on it to better serve its customers. In part, that’s what drove the company to launch services like OpenShift, for example, a container platform that sits on top of Red Hat Enterprise Linux and — not unlike the original Linux distribution — integrates technologies like Docker and Kubernetes and makes them more easily usable inside an enterprise.

The reason for that? “I believe that containers will be the primary way that applications will be built, deployed and managed,” he told me, and argued that his company, especially after the CoreOS acquisition, is now a leader in both containers and Kubernetes. “When you think about the importance of containers to the future of IT, it’s a clear value for us and for our customers.”

The other major open-source project Red Hat is betting on is OpenStack. That may come as a bit of a surprise, given that popular opinion in the last year or so has shifted against the massive project that wants to give enterprises an open-source, on-premises alternative to AWS and other cloud providers. “There was a sense among big enterprise tech companies that OpenStack was going to be their savior from Amazon,” Whitehurst said. “But even OpenStack, flawlessly executed, put you where Amazon was five years ago. If you’re Cisco or HP or any of those big OEMs, you’ll say that OpenStack was a disappointment. But from our view as a software company, we are seeing good traction.”

Because OpenStack is especially popular among telcos, Whitehurst believes it will play a major role in the shift to 5G. “When we are talking to telcos, […] we are very confident that OpenStack will be the platform for 5G rollouts.”

With OpenShift and OpenStack, Red Hat believes that it has covered both the future of application development and the infrastructure on which those applications will run. Looking a bit further ahead, though, Whitehurst also noted that the company is starting to look at how it can use artificial intelligence and machine learning to make its own products smarter and more secure, but also at how it can use its technologies to enable edge computing. “Now that large enterprises are also contributing to open source, we have a virtually unlimited amount of material to bring our knowledge to,” he said.

 

Friday, March 30, 2018

As marketing data proliferates, consumers should have more control

At the Adobe Summit in Las Vegas this week, privacy was on many people’s minds. It was no wonder with social media data abuse dominating the headlines, GDPR just around the corner, and Adobe announcing the concept of a centralized customer experience record.

With so many high profile breaches in recent years, putting your customer data in a central record-keeping system would seem to be a dangerous proposition, yet Adobe sees so many positives for marketers, it likely sees this as a worthy trade-off.

Which is not to say that the company doesn’t see the risks. Executives speaking at the conference continually insisted that privacy is always part of the conversation at Adobe as they build tools — and they have built in security and privacy safeguards into the customer experience record.

Offering better experiences

The point of the exercise isn’t simply to collect data for data’s sake, it’s to offer consumers a more customized and streamlined experience. How does that work? There was a demo in the keynote illustrating a woman’s experience with a hotel brand.

Brad Rencher, EVP and GM at Adobe Experience Cloud explains Adobe’s Cloud offerings. Photo: Jeff Bottari/Invision for Adobe/AP Images

The mythical woman started a reservation for a trip to New York City, got distracted in the middle and was later “reminded” to return to it via a Facebook ad. She completed the reservation and was later issued a digital key to her room, allowing her to bypass the front desk check-in. Further, there was a personal greeting on the television in her room with a custom message and suggestions for entertainment based on her known preferences.

As one journalist pointed out in the press event, this level of detail from the hotel is not something that would thrill him (beyond the electronic check-in). Yet there doesn’t seem to be a way to opt out of that data (unless you live in the EU and are subject to GDPR rules).

Consumers may want more control

As it turns out, that reporter wasn’t alone. According to a survey conducted last year by The Economist Intelligence Unit in conjunction with ForgeRock, an identity management company, consumers are not just the willing sheep that tech companies may think we are.

The survey was conducted last October with 1,629 consumers participating from eight countries: Australia, China, France, Germany, Japan, South Korea, the UK and the US. It’s worth noting that survey questions were asked in the context of Internet of Things data, but it seems that the results could be more broadly applied to any type of data collection activity by brands.

There are a couple of interesting data points that brands should perhaps heed as they collect customer data in the fashion outlined by Adobe. In particular, as it relates to the central customer profile that Adobe and other marketing software companies are trying to build: when asked to rate the statement, “I am uncomfortable with companies building a ‘profile’ of me to predict my consumer behaviour,” 39 percent strongly agreed and another 35 percent somewhat agreed. That would suggest that consumers aren’t necessarily thrilled with this idea.

When presented with the statement, “Providing my personal information may have more drawbacks than benefits,” 32 percent strongly agreed and 41 percent somewhat agreed.

That would suggest that it is on the brand to make it clearer to consumers that they are collecting that data to provide a better overall experience, because it appears that consumers who answered this survey are not necessarily making that connection.

Perhaps it wasn’t a coincidence that at a press conference after the Day One keynote announcing the unified customer experience record, many questions from analysts and journalists focused on notions of privacy. If Adobe is helping companies gather and organize customer data, what role does it have in how its customers use that data, what role does the brand have and how much control should consumers have over their own data?

These are questions we seem to be answering on the fly. The technology is here now or very soon will be, and wherever the data comes from, whether the web, mobile devices or the Internet of Things, we need to get a grip on the privacy implications — and we need to do it quickly. If consumers want more control as this survey suggests, maybe it’s time for companies to give it to them.

Asana introduces Timeline, lays groundwork for AI-based monitoring as the “team brain” for productivity

When workflow management platform Asana announced a $75 million round of funding in January led by former Vice President Al Gore’s Generation Investment Management, the startup didn’t give much of an indication of what it planned to do with the money, or what it was that won over investors to a new $900 million valuation (a figure we’ve now confirmed with the company).

Now, Asana is taking the wraps off the next phase of its strategy. This week, the company announced a new feature it’s calling Timeline — composite, visual, and interactive maps of the various projects assigned to different people within a team, giving the group a wider view of all the work that needs to be completed, and how the projects fit together, mapped out in a timeline format.

Timeline is a new premium feature: Asana’s 35,000 paying users will be able to access it at no extra charge, while those among Asana’s millions of free users will have to upgrade to the premium tier to access it.

Timeline is intended to be used in scenarios like product launches, marketing campaigns and event planning, and it isn’t a new piece of software that forces you to duplicate work: each project automatically becomes a new segment on a team’s Timeline. Viewing projects through the Timeline allows users to identify whether different segments overlap and adjust them accordingly.

Perhaps one of the most interesting aspects of the Timeline, however, is that it’s the first instalment of a bigger strategy that Asana plans to tackle over the next year to supercharge and evolve its service, making it the go-to platform for helping keep you focused on work, when you’re at work.

While Asana started out as a place where people go to manage the progress of projects, its ambition going forward is to become a platform that, with a machine-learning engine at the back end, will aim to manage a team’s and a company’s wider productivity and workload, regardless of whether they are actively in the Asana app or not.

“The long term vision is to marry computer intelligence with human intelligence to run entire companies,” Asana co-founder Justin Rosenstein said in an interview. “This is the vision that got investors excited.”

The bigger product — the name has not been revealed — will include a number of different features. Some that Rosenstein has let me see in preview include the ability for people to have conversations about specific projects — think messaging channels but less dynamic and more contained. And it seems that Asana also has designs to move into the area of employee monitoring: it has also been working on a widget of sorts that installs on your computer and watches you work, with the aim of making you more efficient.

“Asana becomes a team brain to keep everyone focused,” said Rosenstein.

Given that Asana’s two co-founders, Dustin Moskovitz and Rosenstein, previously had close ties to Facebook — Moskovitz as a co-founder and Rosenstein as its early engineering lead — you might wonder if Timeline and the rest of its new company productivity engine might be bringing more social elements to the table (or desk, as the case may be).

In fact, it’s quite the opposite.

Rosenstein may have to his credit the creation of the “like” button and other iconic parts of the world’s biggest social network, but he has in more recent times become a very outspoken critic of the distracting effects of services like Facebook’s. It’s part of a bigger trend hitting Silicon Valley, where a number of leading players have, in a wave of mea culpa, turned against some of the bigger innovations particularly in social media.

Some have even clubbed together to form a new organization called the Center for Humane Technology, whose motto is “Reversing the digital attention crisis and realigning technology with humanity’s best interests.” Rosenstein is an advisor, although when I tried to raise the issue of the backlash that has hit Facebook on multiple fronts, he responded pretty flatly, “It’s not something I want to talk about right now.” (That’s what keeping focussed is all about, I guess.)

Asana, essentially, is taking the belief that social can become counterproductive when you have to get something done, and applying it to the enterprise environment.

This is an interesting twist, given that one of the bigger themes in enterprise IT over the last several years has been how to turn business apps and software more “social” — tapping into some of the mechanics and popularity of social networking to encourage employees to collaborate and communicate more with each other even when (as is often the case) they are not in the same physical space.

But social working might not be for everyone, all the time. Slack, the wildly popular workplace chat platform that interconnects users with each other and just about every enterprise and business app, is notable for producing “a gazillion notifications”, in Rosenstein’s words, leading to distraction from actually getting things done. “I’m not saying services like Slack can’t be useful,” he explained. (Slack is also an integration partner of Asana’s.) “But companies are realising that, to collaborate effectively, they need more than communication. They need content and work management. I think that Slack has a lot of useful purposes but I don’t know if all of it is good all the time.”

The “team brain” role that Asana envisions may be all about boosting productivity by learning about you and reducing distraction: you will get alerts, but you (and presumably the brain) prioritise which ones you get, if any at all. Interestingly, though, it has kept another feature characteristic of a lot of social networking services: amassing data about your activities and using that to optimise engagement. As Rosenstein described it, Asana will soon be able to track what you are working on, and how you work on it, to figure out your working patterns.

The idea is that, by using machine learning algorithms, the system can learn what a person does quickly and what takes that person longer, plan that person’s tasks better, and ultimately make that person more productive. Eventually, the system will be able to suggest what you should be working on and when.
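To make that concrete, here is a toy sketch of the estimate-and-plan loop described above. The task names, durations, and the simple per-type average are all invented for illustration; Asana has not described its actual model.

```python
from collections import defaultdict

# Hypothetical history of completed work: (task_type, hours_taken) pairs.
history = [
    ("code-review", 0.5), ("code-review", 1.5),
    ("design-doc", 4.0), ("design-doc", 6.0),
    ("bug-fix", 2.0),
]

def learn_durations(history):
    """Average observed duration per task type -- a crude stand-in for
    the machine-learned model the article describes."""
    observed = defaultdict(list)
    for task_type, hours in history:
        observed[task_type].append(hours)
    return {t: sum(h) / len(h) for t, h in observed.items()}

def plan(todo, model, default=3.0):
    """Order today's tasks quickest-first; unseen types get a default estimate."""
    return sorted(todo, key=lambda t: model.get(t, default))

model = learn_durations(history)
print(plan(["design-doc", "bug-fix", "code-review"], model))
# -> ['code-review', 'bug-fix', 'design-doc']
```

A real system would obviously use far richer signals than a per-type mean, but even this illustrates how duration estimates turn directly into a suggested working order.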

All of that might sound like music to managers’ ears, but for some, employee monitoring programs sound a little alarming for how closely they monitor your every move. Given the recent wave of attention that social media services have had for all the data they collect, it will be interesting to see how enterprise services like this get adopted and viewed. It’s also not at all clear how these sorts of programs will sit in respect of new directives like GDPR in Europe, which put into place a new set of rules for how any provider of an internet service needs to inform users of how their data is used, and any data collecting needs to have a clear business purpose.

Still, with clearly a different aim in mind — helping you work better — the end could justify the means for some, not just for bosses, but for people who might feel overwhelmed with what is on their work plate every day. “When you come in in the morning, you might have a list [many things] to do today,” Rosenstein said. “We take over your desktop to show the one thing you need to do.”

Azure’s availability zones are now generally available

No matter what cloud you build on, if you want to build something that’s highly available, you’re always going to opt to put your applications and data in at least two physically separated regions. Otherwise, if a region goes down, your app goes down, too. All of the big clouds also offer a concept called ‘availability zones’ in their regions to give developers the option to host their applications in two separate data centers in the same region for a bit of extra resilience. All big clouds, that is, except for Azure, which is only launching its availability zones feature into general availability today after first announcing a beta last September.

Ahead of today’s launch, Julia White, Microsoft’s corporate VP for Azure, told me that the company’s design philosophy behind its data center network was always about servicing commercial customers with the widest possible range of regions to allow them to be close to their customers and to comply with local data sovereignty and privacy laws. That’s one of the reasons why Azure today offers more regions than any of its competitors, with 38 generally available regions and 12 announced ones.

“Microsoft started its infrastructure approach focused on enterprise organizations and built lots of regions because of that,” White said. “We didn’t pick this regional approach because it’s easy or because it’s simple, but because we believe this is what our customers really want.”

Every availability zone has its own network connection and power backup, so if one zone in a region goes down, the others should remain unaffected. A regional disaster could shut down all of the zones in a single region, though, so most businesses will surely want to keep their data in at least one additional region.
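The failover behavior described here can also be sketched from the application’s side: keep an ordered preference list of zone endpoints, with a backup region last, and fall through on outages. The endpoint names below are hypothetical, not real Azure hosts:

```python
class EndpointDown(Exception):
    pass

def call_endpoint(endpoint, healthy):
    """Stand-in for a real network call; fails if the endpoint is down."""
    if endpoint not in healthy:
        raise EndpointDown(endpoint)
    return f"served by {endpoint}"

def resilient_request(endpoints, healthy):
    """Walk the preference list until one endpoint answers."""
    for endpoint in endpoints:
        try:
            return call_endpoint(endpoint, healthy)
        except EndpointDown:
            continue  # this zone (or region) is down -- try the next one
    raise RuntimeError("all endpoints down")

# Two zones in the primary region, plus one zone in a backup region.
targets = ["eastus-zone1", "eastus-zone2", "westus2-zone1"]

# A single-zone outage is absorbed by the peer zone in the same region...
print(resilient_request(targets, healthy={"eastus-zone2", "westus2-zone1"}))
# ...while a whole-region outage falls through to the second region.
print(resilient_request(targets, healthy={"westus2-zone1"}))
```

This is the logic behind the advice above: zones protect you from a single data center failing, while a second region protects you from losing all zones at once.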

IoT devices could be next customer data frontier

At the Adobe Summit this week in Las Vegas, the company introduced what could be the ultimate customer experience construct, a customer experience system of record that pulls in information, not just from Adobe tools, but wherever it lives. In many ways it marked a new period in the notion of customer experience management, putting it front and center of the marketing strategy.

Adobe was not alone, of course. Salesforce, with its three-headed monster of sales, marketing and service clouds, was also thinking along similar lines. In fact, it spent $6.5 billion last week to buy MuleSoft to act as a data integration layer to access customer information from across the enterprise software stack, whether on prem, in the cloud, or inside or outside of Salesforce. And it announced the Salesforce Integration Cloud this week to make use of its newest company.

As data collection takes center stage, we actually could be on the edge of yet another data revolution, one that could be more profound than even the web and mobile were before it. That is…the Internet of Things.

Here comes IoT

There are three main pieces to that IoT revolution at the moment from a consumer perspective. First of all, there is the smart speaker like the Amazon Echo or Google Home. These provide a way for humans to interact verbally with machines, a notion that is only now possible through the marriage of all this data, sheer (and cheap) compute power and the AI algorithms that fuel all of it.

Next, we have the idea of a connected car, one separate from the self-driving car. Much like the smart speaker, humans can interact with the car to find directions and recommendations, and that interaction leaves a data trail in its wake. Finally, we have sensors like iBeacons sitting in stores, providing retailers with a world of information about a customer’s journey through the store — what they like or don’t like, what they pick up, what they try on and so forth.

There are very likely a host of other categories, too. All of this information is data that needs to be processed and understood just like any other signals coming from customers, but it also has unique characteristics around volume and velocity: it is truly big data, with all of the issues inherent in processing data at that scale.

That means it needs to be ingested, digested and incorporated into that central customer record-keeping system to drive the content and experiences you need to create to keep your customers happy — or so the marketing software companies tell us, at least. (We also need to consider the privacy implications of such a record, but that is the subject for another article.)

Building a better relationship

Regardless of the vendor, all of this is about understanding the customer better to provide a central data-gathering system with the hope of giving people exactly what they want. We are no longer a generic mass of consumers. We are instead individuals with different needs, desires and requirements, and the best way to please us, they say, is to understand us so well that the brand can deliver the perfect experience at exactly the right moment.

Photo: Ron Miller

That involves listening to the digital signals we give off without even thinking about it. We carry mobile, connected computers in our pockets and they send out a variety of information about our whereabouts and what we are doing. Social media acts as a broadcast system that brands can tap into to better understand us (or so the story goes).

Part of what Adobe, Salesforce and others can deliver is a way to gather that information, pull it together into this uber record-keeping system and apply a level of machine learning and intelligence to help further the brand’s ultimate goals of serving a customer of one and delivering an efficient (and perhaps even pleasurable) experience.

Getting on board

At an Adobe Summit session this week on IoT (which I moderated), the audience was polled a couple of times. In one show of hands, they were asked how many owned a smart speaker and about three quarters indicated they owned at least one, but when asked how many were developing applications for these same devices only a handful of hands went up. This was in a room full of marketers, mind you.

Photo: Ron Miller

That suggests that there is a disconnect between usage and tools to take advantage of them. The same could be said for the other IoT data sources, the car and sensor tech, or any other connected consumer device. Just as we created a set of tools to capture and understand the data coming from mobile apps and the web, we need to create the same thing for all of these IoT sources.

That means coming up with creative ways to take advantage of another interaction (and data collection) point. This is an entirely new frontier with all of the opportunity involved in that, and that suggests startups and established companies alike need to be thinking about solutions to help companies do just that.

Wednesday, March 28, 2018

Hewlett Packard Enterprise to move HQ to San Jose

Hewlett Packard Enterprise is moving north from Palo Alto to San Jose. The company will relocate 1,000 employees to a 220,000-square-foot space in late 2018. HPE was spun off from Hewlett-Packard in 2015 and is focused on servers and storage.

This news comes months after HPE announced a different plan in which the company was moving to Santa Clara where Aruba Networks, a company it previously acquired, is headquartered.

HPE is going to occupy six floors in San Jose’s America Center, which is located near a forthcoming Berryessa BART station.

This move is the latest win for San Jose. Google recently announced it would move in the coming years. According to a report in Mercury News, the city of San Jose did not offer HPE any financial incentives.

Microsoft can ban you for using offensive language

A report by CSOOnline presented the possibility that Microsoft would be able to ban “offensive language” from Skype, Xbox, and, inexplicably, Office. The post, which cites Microsoft’s new terms of use, said that the company would not allow users to “publicly display or use the Services to share inappropriate content or material (involving, for example, nudity, bestiality, pornography, offensive language, graphic violence, or criminal activity)” and that you could lose your Xbox Live Membership if you curse out a kid in Overwatch.

“We are committed to providing our customers with safe and secure experiences while using our services. The recent changes to the Microsoft Service Agreement’s Code of Conduct provide transparency on how we respond to customer reports of inappropriate public content,” said a Microsoft spokesperson. The company notes that “Microsoft Agents” do not watch Skype calls and that they can only respond to complaints with clear evidence of abuse. The changes, which go into effect May 1, allow Microsoft to ban you from its services if you’re found passing “inappropriate content” or using “offensive language.”

These new rules give Microsoft more power over abusive users and it seems like Microsoft is cracking down on bad behavior on its platforms. This is good news for victims of abuse in private communications channels on Microsoft products and may give trolls pause before they yell something about your mother on Xbox. We can only dare to dream.

GoDaddy to move most of its infrastructure to AWS, not including domain management for its 75M domains

It really is Go Time for GoDaddy. Amazon’s cloud services arm AWS and GoDaddy, the domain registration and management giant, may have competed in the past when it comes to working with small businesses to provide them with web services, but today the two took a step closer together. AWS said that GoDaddy is now migrating “the majority” of its infrastructure to AWS in a multi-year deal that will also see AWS become a partner in selling some of GoDaddy’s products — namely Managed WordPress and GoCentral, for managing domains and building and running websites.

The deal — financial terms of which are not being disclosed — is wide-ranging, but it will not include taking on domain management for GoDaddy’s 75 million domains currently under management, a spokesperson for the company confirmed to me.

“GoDaddy is not migrating the domains it manages to AWS,” said Dan Race, GoDaddy’s VP of communications. “GoDaddy will continue to manage all customer domains. Domain management is obviously a core business for GoDaddy.”

The move underscores Amazon’s continuing expansion as a powerhouse in cloud hosting and related services, providing a one-stop shop for customers who come for one product and stay for everything else (not unlike its retail strategy in that regard). Also, it is a reminder of how the economies of scale in the cloud business make it financially challenging to compete if you are not already one of the big players, or lack deep pockets to sustain your business as you look to grow. GoDaddy has been a direct victim of those economics: just last summer, GoDaddy killed off Cloud Servers, its AWS-style business for building, testing and scaling cloud services on GoDaddy infrastructure.

The AWS deal also highlights how GoDaddy is trimming operational costs to improve its overall balance sheet under Scott Wagner, the COO who took over as CEO from Blake Irving at the beginning of this year. 

“As a technology provider with more than 17 million customers, it was very important for GoDaddy to select a cloud provider with deep experience in delivering a highly reliable global infrastructure, as well as an unmatched track record of technology innovation, to support our rapidly expanding business,” said Charles Beadnall, CTO at GoDaddy, in a statement.

“AWS provides a superior global footprint and set of cloud capabilities, which is why we selected them to meet our needs today and into the future. By operating on AWS, we’ll be able to innovate at the speed and scale we need to deliver powerful new tools that will help our customers run their own ventures and be successful online,” he continued.

AWS said that GoDaddy will be using AWS’s Elastic Container Service for Kubernetes and Elastic Compute Cloud P3 instances, as well as machine learning, analytics, and other database-related and container technology. Race told TechCrunch that the infrastructure components that the company is migrating to AWS currently run at GoDaddy but will be gradually moved away as part of its multi-year migration.

“As a large, high-growth business, GoDaddy will be able to leverage AWS to innovate for its customers around the world,” said Mike Clayville, VP, worldwide commercial sales at AWS, in a statement. “Our industry-leading services will enable GoDaddy to leverage emerging technologies like machine learning, quickly test ideas, and deliver new tools and solutions to their customers with greater frequency. We look forward to collaborating with GoDaddy as they build anew in the cloud and innovate new solutions to help people turn their ideas into reality online.”

 

Spoke looks to create a simpler workplace requests management tool

When Jay Srinivasan’s last company got acquired by Google, he and his co-founders were ready to get going right away — but they couldn’t figure out how to get ramped up or where things were.

That’s sometimes a refrain you’ll hear from employees of companies that are acquired, or any employees really, who suddenly have to get used to a new system of doing things. It can go all the way down to just getting a new laptop with the right software on it. And it’s a pain point that convinced Srinivasan and his co-founders Pratyus Patnaik and David Kaneda to start Spoke, a new tool for handling those workplace management and request tickets — and, finally, for getting your laptop ready so you can get to work. Spoke is launching for general availability today, and the company says it has raised $28 million to date from investors like Accel, Greylock, and Felicis Ventures.

“Some internal ticketing systems you can use are searchable — as you imagine, it finds all the answers; the problem is when you have that many people, you get 10,000 results,” Srinivasan said. “There’s too much to look at. In a larger company, the breaking point tends to be that there are probably a bunch of relevant answers, but there’s no way to find the needle in the haystack. So I really wanted to figure stuff out from scratch.”

With many companies switching to internal collaboration tools like Slack, the theory is that these kinds of requests should be made wherever the employee is. So part of Spoke is an actual bot that exists in Slack, looking to surface the right answers right away from a database of employee knowledge that’s built up over time. But Spoke’s aim, like many workplace tools that look to be simple, is to hide a lot of complex processes behind that chat window in terms of creating request tickets and other employee queries so they can pop in and pop out quickly enough.

The other side of Spoke is for the managers, who then need to handle all of these requests. Spoke converts the requests made through Slack (and, theoretically, other platforms) and streams them into a feed of tickets that managers can then tackle one by one. Rather than a complex interface, Spoke aims to present a simple array of buckets that managers can dip into in order to plow through those requests as quickly as possible. As Spoke gets more and more data about how those requests are initiated — and solved — it can over time get smarter about optimizing that ticketing flow.
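The request-to-ticket flow described above can be sketched roughly like this. Everything here is hypothetical for illustration — the routing rules, team names and data structures are invented, not drawn from Spoke's actual product or API:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative keyword rules that route a chat request to a team "bucket".
ROUTING_RULES = {
    "it": ["laptop", "password", "vpn"],
    "facilities": ["desk", "badge", "parking"],
}

@dataclass
class Ticket:
    requester: str
    text: str
    team: str

@dataclass
class TicketQueue:
    tickets: List[Ticket] = field(default_factory=list)

    def file_request(self, requester: str, text: str) -> Ticket:
        """Convert a chat message into a routed ticket."""
        lowered = text.lower()
        team = next(
            (t for t, words in ROUTING_RULES.items()
             if any(w in lowered for w in words)),
            "general",  # fall back to a catch-all bucket
        )
        ticket = Ticket(requester, text, team)
        self.tickets.append(ticket)
        return ticket

    def bucket(self, team: str) -> List[Ticket]:
        """The per-team feed a manager would work through."""
        return [t for t in self.tickets if t.team == team]

queue = TicketQueue()
queue.file_request("jay", "Hey, I need a new laptop")
queue.file_request("dana", "My badge stopped working")
print([t.team for t in queue.tickets])  # ['it', 'facilities']
```

The point of the design, as the article describes it, is that the employee only ever types a sentence into chat; the bucketing and ticket bookkeeping happen behind the chat window.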

“If I’m the IT manager, I don’t want you to have to log into a ticketing system,” Srinivasan said. “We allow you to make a request through Slack. You’re in Slack and talk to Spoke and say, hey, I need a new laptop. I want you to stay in Slack or Teams. And a lot of time is spent on a specialized tool like a ticketing tool — it’s the same thing as a salesperson spending time in a CRM. Slack is a good way to get an input to that tool, but I still need a specialized standalone tool.”

You could consider Spoke as one interpretation of a couple of approaches to make data about the workplace more accessible. While Spoke is going after the bot-ish, come-to-me results route, there are others looking to create more of a centralized Wiki that’s easy to find and search. At the end of the day, both of these are trying to compress the amount of time it takes for employees to find answers to the information that they need, in addition to making it less frustrating. For the latter, there are some startups like Slab that have also raised venture financing.

For Spoke, the more challenging parts may actually come from the platforms where it lives. Slack, for example, is working on tools to make information much more searchable and accessible. It’s investing in tools to, for example, help users find the right person to ask a question in order to get information as fast as possible. As Slack — and other platforms — get more and more data, they can tune those tools themselves and potentially create something in-house that could be more robust. Srinivasan said the goal is to target the whole process of the workplace request, not just the search problem, which he hopes will make Spoke more defensible.

“You’re not looking for knowledge, you’re looking for services,” he said. “Let’s say I need a new laptop — by all means you can search Slack to get the answer of who you need to contact. But you still need to follow up and essentially create a request with them. Slack sometimes could solve the information access problem, but even then it doesn’t solve the service issue. Ticketing and request management consists of requests and responses with accountability. You have to make sure nothing falls through the cracks.”

Silver Lake is buying a $500M stake in Credit Karma in a massive secondary round

Credit Karma, which once started as a simple credit report system and is now looking to expand into a true financial assistant, announced today it is getting a massive $500 million secondary investment from Silver Lake.

As part of the investment, Credit Karma says it is getting a 23% bump in valuation from its last secondary round, which valued it at around $3.25 billion. That means the company is now worth roughly $4 billion, while founder and CEO Kenneth Lin will remain its largest shareholder. That last point is likely important: investors and early employees get some liquidity, while still seeing a signal that the founder intends to see the company all the way to the end. Silver Lake’s Mike Bingle is joining the company’s board of directors as part of this deal.

As companies stay private longer, those early employees that spend years at a startup before it hits that huge exit may have to wait longer for some kind of payout for their work. Investors, too, face the same dilemma, especially as the early bets are often just taken on a founder and an idea. And compensation packages early on also typically include equity as a significant portion as companies try to use the financing they raise for growth or other purposes. That makes these kinds of secondary rounds important, as they shorten the window for at least some liquidation, which could help employees and investors be a little more patient.

Silver Lake is buying common stock in the company, which is now more than a decade old. With some liquidity for shareholders, Credit Karma can likely hold off on an IPO for a little longer. It’s still building out its cachet as a financial advisory tool, so it may be that the company sought to stay private, and not be beholden to the quarterly pressures of a public company, while it continues to build out that suite of tools.

Credit Karma is increasingly trying to build a suite of tools that will help it expand beyond just a simple credit score notifier. Late last year, it rolled out a tool to be the hub for handling everything related to your cars. All of this adds up to its goal of being a financial assistant, and not just a credit report.

Salesforce introduces Integration Cloud on heels of MuleSoft acquisition

Salesforce hasn’t wasted any time turning the MuleSoft acquisition into a product of its own, announcing the Salesforce Integration Cloud this morning.

While in reality it’s too soon to really take advantage of the MuleSoft product set, the company is laying the groundwork for MuleSoft’s eventual integration into the Salesforce family with this announcement, which really showcases why Salesforce was so interested in the company that it was willing to fork over $6.5 billion.

The company has decided to put its shiny new bauble front and center in the Integration Cloud announcement, so that when the MuleSoft team is in the fold, it will have a place to hit the ground running.

The Integration Cloud itself consists of three broad pieces: the Integration Platform, which will eventually be based on MuleSoft; Integration Builder, a tool that lets you bring together a complete picture of a customer from Salesforce tools as well as other enterprise data repositories; and finally Integration Experiences, which is designed to help brands build customized experiences based on all the information you’ve learned from the other tools.

For now, it involves a few pieces that are independent of MuleSoft, including a workflow tool called Lightning Flow, a new service that is designed to let Salesforce customers build workflows using the customer data in Salesforce CRM.

It also includes a dash of Einstein, Salesforce’s catch-all brand for the intelligence layer that underlies the platform, to build Einstein intelligence into any app.

Salesforce also threw in some Trailhead education components to help customers understand how to best make use of these tools.

But make no mistake, this is a typical Salesforce launch. It is probably earlier than it should be, but it puts the idea of integration out there in the minds of its customers and lays a foundation for a much deeper set of products and services down the road when MuleSoft is more fully integrated into the Salesforce toolset.

For now, it’s important to understand that this deal is about using data to fuel the various pieces of the Salesforce platform and provide the Einstein intelligence layer with information from across the enterprise wherever it happens to live, whether that’s in Salesforce, another cloud application or some on-prem legacy systems.

This should sound familiar to folks attending the Adobe Summit this week in Las Vegas, since it’s eerily similar to what Adobe announced on stage yesterday at the Summit keynote. Adobe is calling it a customer experience system of record, but the end game is pretty much the same: bringing together data about a customer from a variety of sources, building a single view of that customer, and then turning that insight into a customized experience.

That they chose to make this announcement during the Adobe Summit, where Adobe has announced some data integration components of its own, could be a coincidence, but probably not.

Tuesday, March 27, 2018

Pure Storage teams with Nvidia on GPU-fueled Flash storage solution for AI

As companies gather increasing amounts of data, they face a choice over bottlenecks: the bottleneck can sit in the storage layer or in the back-end compute system. Some companies have attacked the problem by using GPUs to speed up the compute side or Flash storage to speed up the storage side. Pure Storage wants to give customers the best of both worlds.

Today it announced Airi, a complete data storage solution for AI workloads in a box.

Under the hood, Airi starts with a Pure Storage FlashBlade, a storage solution that Pure created specifically with AI and machine learning processing in mind. Nvidia contributes the raw compute power with four Nvidia DGX-1 supercomputers, delivering four petaFLOPS of performance with Nvidia Tesla V100 GPUs. Arista provides the networking hardware to make it all work together with Arista 100GbE switches. The software glue layer comes from the Nvidia GPU Cloud deep learning stack and the Pure Storage AIRI Scaling Toolkit.

Photo: Pure Storage

One interesting aspect of this deal is that the FlashBlade product operates as a separate unit inside the Pure Storage organization. The company has put together a team of engineers with AI and data pipeline expertise, with a mandate to move beyond the traditional storage market and figure out where the market is going.

This approach certainly does that, but the question is whether companies want to chase the on-prem hardware approach or take this kind of data to the cloud. Pure would argue that the data gravity of AI workloads makes this difficult to achieve with a cloud solution, but we are seeing increasingly large amounts of data moving to the cloud, with the cloud vendors providing tools for data scientists to process that data.

If companies choose to go the hardware route over the cloud, each vendor in this equation — whether Nvidia, Pure Storage or Arista — should benefit from a multi-vendor sale. The idea ultimately is to provide customers with a one-stop solution they can install quickly inside a data center if that’s the approach they want to take.

On-demand shipping startup Shyp is shutting down

After rocketing to a $250 million valuation in 2015 amid a massive hype cycle for on-demand companies, on-demand startup Shyp is shutting down today.

CEO Kevin Gibbon announced the shutdown in a blog post this afternoon. The company is ending operations immediately after struggling, like many on-demand companies, to find a scalable model beyond its launching point in San Francisco. Shyp missed targets for expanding to cities beyond its core base and pulled back from Miami. In July, Shyp said it would be reducing its headcount and shutting down all operations beyond San Francisco.

The company raised $50 million in a deal led by John Doerr at Kleiner Perkins back in 2015, one of his last huge checks as a variety of firms jumped onto the on-demand space. The thesis at the time was pretty sound: look at a strip mall, and see which businesses can come to you first. Shipping was a natural one, but there was also food, and eventually groceries. Today, there are only a few left standing, with Postmates, Instacart and DoorDash among the most prominent ones. Even then, Instacart is now under threat from Amazon, which is ramping up its own two-hour delivery after buying Whole Foods.

“At the time, I approached everything I did as an engineer,” Gibbon wrote. “Rather than change direction, I tasked the team with expanding geographically and dreaming up innovative features and growth tactics to further penetrate the consumer market. To this day, I’m in awe of the vigor the team possessed in tackling a 200-year-old industry. But, growth at all costs is a dangerous trap that many startups fall into, mine included.”

Shyp is now a casualty of the delivery space. Where it originally sought to make up the cost of delivery in the form of cheaper bulk costs for those deliveries, Shyp’s one-size-fits-all delivery — where you could deliver a computer or a bike — eventually ended up being one of the most challenging and frustrating elements of its business. It began adding fees to its online returns business and changing prices for its bulk shipments. As it turns out, a $5 carte blanche for delivery was not a model that really made sense.

Indeed, that growth-at-all-costs directive has cost many startups dearly, with companies like Sprig shutting down and others getting slapped on the wrist for aggressive growth tactics like text spamming. It also meant that startups had to very quickly develop an effective playbook that, in the end, might not actually translate to markets beyond their core competency. Shyp pivoted to focusing on businesses toward the tail end of its lifetime, including a big deal with eBay, which we had heard at the time was doing well.

“We decided to keep the popular-but-unprofitable parts of our business running, with small teams of their own behind them,” he wrote. “This was a mistake—my mistake. While large, established companies have the financial freedom to explore new product categories for the sake of exploring, for startups it can be irresponsible.”

But Gibbon said the company kept parts of its popular but challenged models online, which may have also contributed to its eventual shutdown. The company expected to be in cities like Boston, Seattle and Philadelphia in early 2016, but that didn’t end up panning out. And Shyp increasingly felt the challenges of an on-demand model, trying to push the cost to the consumer as low as possible while handling the overhead and logistical headaches of a delivery business.

“My early mistakes in Shyp’s business ended up being prohibitive to our survival,” Gibbon wrote. “For that, I am sorry.”

Rackspace may reportedly go public again after a $4.3B deal took it private in 2016

Rackspace, which was taken private in a $4.3 billion deal in August 2016 by private equity firm Apollo Global Management, is reportedly being considered for an IPO by the firm, according to Bloomberg.

The company could have an enterprise value of up to $10 billion, according to the report. Rackspace opted to go private amid an increasingly challenging climate, facing competition on all sides from much better capitalized companies like Amazon, Microsoft and Google. Despite getting an early start in the cloud hosting space, Rackspace found itself quickly focusing on services in order to continue to gain traction. But under scrutiny from Wall Street as a public company, it’s harder to make that kind of a pivot.

Bloomberg reports that the firm has held early talks with advisers and may seek to begin the process by the end of the year, and these processes can always change over time. Rackspace offers managed services, including data migration, architecture to support on-boarding, and ongoing operational support for companies looking to work with cloud providers like AWS, Google Cloud and Azure. Since going private, Rackspace acquired Datapipe, and in July said it would begin working with Pivotal to continue to expand its managed services business.

Rackspace isn’t alone among companies that have opted to go private. Dell, for example, went private in 2013 in a $24.4 billion deal in order to resolve issues with its business model without the quarter-to-quarter fiduciary obligations to public investors. Former Qualcomm executive chairman Paul Jacobs, too, expressed some interest in buying out Qualcomm in a process that would take the company private. There are different motivations for all these operations, but each has the same underlying principle: make some agile moves under the purview of a private owner rather than release financial statements every three months or so and watch the stock tumble.

Should Rackspace actually end up going public, it would catch both a wave of successful IPOs like Zscaler and Dropbox (though things could definitely change by the end of the year) and an increased need by companies to manage their services in cloud environments. So it makes sense that the private equity firm would consider taking it public to capitalize on Wall Street’s interest in the latter half of the year.

A spokesperson for Rackspace said the company does not comment on rumors or speculation. We also reached out to Apollo Global Management and will update the post when we hear back.

Stride, Atlassian’s Slack competitor, hits general availability

Last September, Atlassian launched Stride, its take on a Slack-like real-time communications platform for text, audio and video chats, into beta. Six months later, Stride is now generally available to any and all teams that want to give it a try.

Atlassian is a bit cagey about providing exact user numbers, so the numbers it actually shared aren’t all that useful for gauging the service’s success. What the company was willing to say is that its users have now spent a quarter of a million hours in Stride’s Focus Mode, which is meant to allow workers to reclaim a bit of sanity in today’s notification-driven world by letting you turn off all incoming messages and notifications. As Atlassian’s head of communications products Steve Goldsmith told me, the company is happy with the state of Stride and says it’s growing quickly.

Since the closed beta launch, Atlassian has added about 50 new features and improvements to the service, including better ways to organize chat lists, better search and a number of improvements to the service’s video meeting features. Indeed, it’s these video chat features that the team is maybe the most proud of. “Small impromptu meetings don’t just happen when you have to switch context,” Goldsmith told me, but he declined to give us any numbers for how much time users spend in these chats beyond that “it’s a lot.”

Goldsmith also stressed that this is far from the final version of Stride. The team still has quite a roadmap of features it wants to implement. But by taking away the beta label, the company is signaling that it has worked out most of the kinks and that Stride is now ready for full enterprise deployments.

About a month ago, the Stride team also opened up its API to outside developers. Goldsmith was pretty open about the fact that he’s very happy with the final result but that he would’ve liked to see it happen a bit earlier. Stride’s API is the first product that sits on top of Atlassian’s new API platform. That probably made building the API a bit harder, but Goldsmith noted that it now makes integrating with Stride easier for other Atlassian teams.


MariaDB acquires big analytics company MammothDB

MariaDB is best known as a drop-in replacement for the popular MySQL database. But the MariaDB Corporation, which was founded by MySQL founder Monty Widenius and which offers all of its software under an open source license, clearly has its sights set on a bigger market and is looking to expand and better challenge the likes of Oracle. Today the company announced that it has acquired MammothDB, a big data business analytics service based in Bulgaria.

With MariaDB AX, MariaDB already offers an analytics and data warehousing system. The service launched in 2017 and, unsurprisingly, the company plans to bring MammothDB’s expertise in this area to bear on MariaDB AX.

“The MammothDB team joins MariaDB at a critical point in our growth, bringing with them an impressive track record of delivering big data solutions,” said MariaDB CEO Michael Howard. “Over the past year, we’ve seen a major increase in demand for MariaDB AX as organizations seek to fill an open source analytics gap left by proprietary offerings such as Oracle and Teradata. The addition of MammothDB’s deep analytics expertise will be invaluable to helping MariaDB meet this growing need and continue to innovate our analytic products.”

The companies did not disclose the price of the acquisition. MammothDB raised a $1.8 million seed round led by 3TS Capital Partners and Empower Capital in 2015 but it doesn’t look like the company ever raised any additional funding. MariaDB, on the other hand, closed a $54 million Series C round led by Alibaba Group and the European Investment Bank in late 2017, which was surely a factor in being able to make today’s acquisition.

Kloudless raises $6M for its integrations solution

Kloudless makes it easier for developers to connect their applications to a variety of third-party tools for file storage, customer management, calendaring and other services through a unified API. It’s a bit like an IFTTT for developers. Today, the company announced that it has raised a $6 million Series A round led by Aspect Ventures, with participation by Bow Capital, Alibaba Taiwan Entrepreneurs Fund, Heavybit, and Ajay Shah. These new investors join existing investors David Sacks and Tim Draper.

The company says that it saw 200 percent revenue growth over the course of the last year and that its platform now has over 15,000 registered developers who are making more than 15 million API calls every day. Kloudless’s monetization plan mostly focuses on providing developers with different degrees of service, SLAs and features like single sign-on support and rules. Interestingly, the company doesn’t charge based on API calls and offers a generous 50GB of free transfer volume, even for free accounts, with additional transfer costing $20 per 100GB.
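Given those numbers (50GB free, then $20 per additional 100GB), a customer's monthly transfer bill is easy to estimate. The article doesn't say whether overage is billed in full 100GB blocks or prorated per gigabyte; this sketch assumes full blocks, rounded up, purely as an illustration:

```python
import math

def monthly_transfer_cost(gb_used: float,
                          free_gb: float = 50.0,
                          block_gb: float = 100.0,
                          block_price: float = 20.0) -> float:
    """Estimate transfer cost: a free allowance, then a flat price
    per (assumed) full block of overage, rounding partial blocks up."""
    overage = max(0.0, gb_used - free_gb)
    blocks = math.ceil(overage / block_gb)
    return blocks * block_price

print(monthly_transfer_cost(40))   # 0.0  (inside the free tier)
print(monthly_transfer_cost(150))  # 20.0 (exactly one 100GB block)
print(monthly_transfer_cost(151))  # 40.0 (spills into a second block)
```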

“Our mission at Kloudless is to tie together the business software stack,” said Eliot Sun, CEO and co-founder of Kloudless. “While we’re starting with a solution for software vendors, this is just a small piece of an enormous opportunity to help all businesses make the most of the data and functionalities from their software investments.”

The company plans to use the new funding to expand its connector ecosystem to support a wider variety of third-party services and to launch new tools to enable automation and custom integration capabilities.

“In the past year, Kloudless has seen accelerating traction across all key developer metrics, as developers have increasingly realized the efficiencies of a ‘build once, integrate many’ approach to meeting customer demand for integrations,” said Mark Kraynak of Aspect Ventures, who will be joining the board as a part of the financing. “We’re excited to support Kloudless and their efforts to capture what figures to be a multi-billion opportunity in connecting businesses to the cloud.”

Adobe wants to be your customer experience record keeping system

For years, the goal of marketers was to understand the customer so well, they could respond to their every need, while creating content specifically geared to their wishes. Adobe Cloud Platform has long acted as a vehicle to collect and understand customer data inside the Adobe toolset, but today Adobe took that a step further.

The company hopes to transform Adobe Cloud Platform into a company’s experience record keeping system, a central place to collect all the data you may have about a customer from both the Adobe Cloud Platform and external data sources.

Suresh Vittal, vice president of platform and product at Adobe Experience Cloud, says tools like CRM were intended to provide a record keeping system for their times, and they were fine in a period when entering and retrieving data was state of the art, but he thinks there needs to be something more.

“A lot of investments for past generations of software evolution have been around batch-based operational systems. While they were necessary back then, they are not sufficient where these brands are going today,” he told TechCrunch.

Adobe Systems world headquarters in San Jose, California USA Photo: Getty Images Lisa Werner / Contributor

Over time, as companies gather more and more data, Adobe believes they need something that centers around the dynamic interactions brands are having with customers. “We believe every customer needs an experience system of record, a central [place to record] where the brand brings together experience data, content and a unified profile to power the next generation of experience,” he said.

To achieve this goal, the company is doing more than creating a new construct: it has built a new data model, along with tools for data scientists to build custom data models.

Of course, where there is data, there needs to be some machine learning and artificial intelligence to help process it, especially in a case where the goal is to pull disparate data into a central record. Adobe’s particular flavor of AI is called Sensei, and the company is giving developers access to some of the same AI algorithms it uses in-house to build its platform.

Any time you start pulling data together from a variety of sources to create a central record keeping system about a customer, there are huge privacy implications, even more so with GDPR coming online in the EU at the end of May. Vittal says the company has built a governance and compliance layer into the toolset to help companies comply with various regulations around sharing data.

“You cannot turn all of this data into something useful without safeguards — semantics and control.” He says this involves creating a data catalog, labeling data in the record and associating rules with each type. That way, data emanating from the EU will need to be handled a certain way, just as any personally identifiable information needs to be safeguarded.

This is where the machine learning comes in. “When you create data across the experience system of record, the data catalog recognizes [certain types of] data and recommends labels based on types of data using machine learning.”
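The catalog behavior Vittal describes — recognize a type of data, recommend a label, and attach handling rules to that label — can be sketched with simple pattern rules. Everything below is invented for illustration (the labels, patterns and policies are not Adobe's actual schema, and Sensei presumably uses learned models rather than regexes):

```python
import re

# Hypothetical label-recognition rules: pattern match stands in for
# the ML-based label recommendation the article describes.
LABEL_RULES = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "phone": re.compile(r"\+?\d[\d\- ]{7,}\d"),
}

# Hypothetical handling rules associated with each label.
HANDLING = {
    "email": {"pii": True, "policy": "mask-before-export"},
    "phone": {"pii": True, "policy": "mask-before-export"},
    "unlabeled": {"pii": False, "policy": "none"},
}

def label_field(value: str) -> str:
    """Recommend a label for a field value based on the rules above."""
    for label, pattern in LABEL_RULES.items():
        if pattern.search(value):
            return label
    return "unlabeled"

record = {"contact": "ana@example.com", "note": "prefers morning calls"}
labels = {k: label_field(v) for k, v in record.items()}
print(labels)  # {'contact': 'email', 'note': 'unlabeled'}
print(HANDLING[labels["contact"]]["policy"])  # mask-before-export
```

The design point is that policy hangs off the label, not the raw data, so once a field is labeled PII (or EU-origin), every downstream tool can enforce the same handling rule.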

All of this is very likely an attempt to compete with Salesforce, which provides sales, marketing and customer service tools stitched together with its own artificial intelligence layer, Einstein. The recent $6.5 billion MuleSoft purchase will also help in terms of pulling data out of disparate enterprise systems and into the various Salesforce tools.

The tools and services announced today give Adobe a fully intelligent, machine learning-driven solution of its own. The whole notion of a customer experience record, while a bit of marketing speak, also serves to help differentiate Adobe from the pack.

Monday, March 26, 2018

Dropbox up another 7% on day two

Dropbox’s surge on the stock market has continued, with the stock gaining another 7% on its second day of trading.

The company saw its shares close at $30.45, giving the company about a $13 billion market cap, fully diluted.

When it priced its IPO, there was a question as to whether Dropbox would surpass the $10 billion valuation it achieved in its last private round. It eliminated those concerns overnight.

The first few days have been a strong indicator of investor demand for the cloud storage company.

To recap, Dropbox initially hoped to price its IPO between $16 and $18 per share, then raised the range to $18 to $20. It ultimately priced the IPO at $21 and closed its first day above $28. And the stock still continues to go up.

Investors like Dropbox’s improving financials.

It brought in $1.1 billion in revenue in its most recent year. This is up from $845 million in revenue the year before and $604 million for 2015.

Yet while it’s been cash flow positive since 2016, it is not profitable. Dropbox lost nearly $112 million last year. But its margins are looking better when compared with losses of $210 million for 2016 and $326 million for 2015.

Although Dropbox is very different from Spotify, which intends to list next week, investors will view this favorable debut as a sign that the IPO window is “open,” meaning that there is strong demand for newly public tech companies.

The Linux Foundation launches a deep learning foundation

Despite its name, the Linux Foundation has long been about more than just Linux. These days, it’s a foundation that provides support to other open source foundations and projects like Cloud Foundry, the Automotive Grade Linux initiative and the Cloud Native Computing Foundation. Today, the Linux Foundation is adding yet another foundation to its stable: the LF Deep Learning Foundation.

The idea behind the LF Deep Learning Foundation is to “support and sustain open source innovation in artificial intelligence, machine learning, and deep learning while striving to make these critical new technologies available to developers and data scientists everywhere.”

The founding members of the new foundation include Amdocs, AT&T, B.Yond, Baidu, Huawei, Nokia, Tech Mahindra, Tencent, Univa and ZTE. Others will likely join in the future.

“We are excited to offer a deep learning foundation that can drive long-term strategy and support for a host of projects in the AI, machine learning, and deep learning ecosystems,” said Jim Zemlin, executive director of The Linux Foundation.

The foundation’s first official project is the Acumos AI Project, a collaboration between AT&T and Tech Mahindra that was already hosted by the Linux Foundation. Acumos AI is a platform for developing, discovering and sharing AI models and workflows.

Like similar Linux Foundation-based organizations, the LF Deep Learning Foundation will offer different membership levels for companies that want to support the project, as well as a membership level for non-profits. All LF Deep Learning members have to be Linux Foundation members, too.

Friday, March 23, 2018

Drew Houston on wooing Dropbox’s IPO investors: “We don’t fit neatly into any one mold”

Dropbox went public this morning to great fanfare, with the stock shooting up more than 40% in the initial moments of trading as the enterprise-slash-consumer company looked to convince investors that it could be a viable publicly-traded company.

And for a company that Steve Jobs famously called a feature, and not a company, it certainly was an uphill battle to convince the world that it was worth even the $10 billion its last private financing round set. It’s now worth more than that, but that follows a long series of events, including an increased focus on enterprise customers and finding ways to make its business more efficient, like building out its own infrastructure. Dropbox CEO Drew Houston acknowledged a lot of this, as well as the fact that the company is going to continue to face the challenge of ensuring that users and enterprises will trust Dropbox with some of their most sensitive files.

We spoke with Houston on the day of the IPO to talk a little bit about what it took to get here during the road show and even prior. Here’s a lightly-edited transcript of the conversation:

TC: In light of the problems that Facebook has had surrounding user data and user trust, how has that changed how you think about security and privacy as a priority?

DH: Our business is built on our customers’ trust. Whether we’re private or public, that’s super important to us. I think, to our customers, whether we’re private or public doesn’t change their view. I wouldn’t say that our philosophy changes as we get to bigger and bigger scale. As you can imagine we make big investments here. We have an awesome security team, our first cultural principle is be worthy of trust. This is existential for us.

TC: How’s the vibe now that longtime employees are going to have an opportunity to get rewarded for their work now that you’re a public company?

DH: I think everyone’s just really excited. This is the culmination of a lot of hard work by a lot of people. We’re really proud of the business we’ve built. I mean, building a great company or doing anything important takes time.

TC: Was there something that changed that convinced you to go public after more than a decade of going private, and how do you feel about the pop?

DH: We felt that we were ready. Our business was in great shape. We had a good balance of scale and profitability and growth. As a private company, there are a lot of reasons why it’s been easier to stay private for longer. We’re all proud of the business we’ve built. We see the numbers. We think we’re on to not just a great business, but pioneering a whole new model. We’re taking the best of our consumer roots, combining them with the best parts of software as a service, and it was really gratifying to see investors be excited about it and for the rest of the world to catch on.

TC: As you were on your road show, what were some of the big questions investors were asking?

DH: We don’t fit neatly into any one mold. We’re not a consumer company, and we’re not a traditional enterprise company. We’re basically taking that consumer internet playbook and applying it to business software, combining the virality and scale. Over the last couple years, as we’ve been building that engine, investors are starting to understand that we don’t fit into a traditional mold. The numbers speak for themselves, and they can appreciate the unusual combination.

TC: What did you tell them to convince them?

DH: We’re just able to get adoption. Just the fact that we have hundreds of millions of users and we’ve found Dropbox is adopted in millions of companies [was enough evidence]. More than 300,000 of those are Dropbox Business companies. We spend about half as much on sales and marketing, as a percentage of revenue, as a typical software-as-a-service company. Efficiency and scale are the distinctive elements, and investors zero in on that. To be able to acquire customers at that scale and also really efficiently, that’s what makes us stand out. They’ve seen Atlassian be successful with self-serve products, but you can layer on top of that leveraging our freemium and viral elements and our focus on design and building great products.

TC: How do you think about deploying the capital you’ve picked up from the IPO?

DH: So, we’re public because they wanted us to be a public company. But our approach is still the same. First, it’s about getting the best talent in the building and making sure we build the best products, and if you do those things, make sure customers are happy, that’s what works.

TC: What about recruiting?

DH: It’s a big day for Dropbox. We’re all really excited about it and hopefully a lot of other people are too.

TC: When you look at your customer acquisition ramp, what does that look like?

DH: I mean, we’ve been making a lot of progress in the past couple of years if you look at growth in subscribers. That will continue. We look at numbers, we have 11 million subscribers, 80% use Dropbox for work. But at the same time, we look at the world, there’s 1 billion knowledge workers and growing. We’re not gonna run out of people who need Dropbox.

TC: What about convincing investors about the consumer part of the business? How did you do that?

DH: I think, when you explain that our consumer and cloud storage roots have really become a way for us to efficiently acquire business customers at scale, that helps them understand. Second, it’s easy to focus on how in the consumer realm that the business has been commoditized. There’s all this free space and all this competition. On the other hand, we’ve never lowered prices, we’ve never even given more free space, we know that what our customers really value is the sharing and collaboration, not just the storage. It’s been good to move investors beyond the 2010 understanding of our business.

TC: How did creating your own infrastructure play into your readiness to go public?

DH: When I say that today is the culmination of a lot of events, that’s a great example. We made a many-year investment to migrate off the public cloud. Certainly one of the more eye-popping things for investors was watching our gross margins literally double over the last couple of years as we went from burning cash to being cash flow positive. We’ll continue reaching larger and larger scale, and those investments will keep paying off.

TC: Getting a new guitar any time soon?

DH: I probably should.

Dropbox CEO Drew Houston emphasizes user trust on IPO day amid Facebook’s troubles

Dropbox made its public debut today, with the stock soaring nearly 40% on its first day of trading — meaning the company will now be beholden to the same shareholders that sent the company’s valuation well north of $10 billion.

As a file-sharing and collaboration service, Dropbox’s first principle is going to be user trust, CEO Drew Houston told TechCrunch after the company made its debut. This comes amid a tidal wave of information throughout the week indicating that data on 50 million Facebook users ended up in the hands of Cambridge Analytica several years ago through access gained via an app on the Facebook platform. While not a breach in the core sense of the word, the leaked data was a considerable breach of trust among Facebook’s users — and as Dropbox looks to crack into the enterprise and continue to win over consumers, it will likely have to increasingly emphasize security and privacy going forward.

“Our business is built on our customers’ trust,” Houston said when asked about security. “Whether we’re private or public, that’s super important to us. I think, to our customers, whether we’re private or public doesn’t change their view. I wouldn’t say that our philosophy changes as we get to bigger and bigger scale. As you can imagine we make big investments here. We have an awesome security team, our first cultural principle is be worthy of trust. This is existential for us.”

Houston, and Dropbox, are no strangers to the challenges that come with securing a service that has more than 500 million registered users. Dropbox in 2016 disclosed that a chunk of user credentials obtained in 2012 had been circulating on the Internet after an employee’s password was acquired and used to access user information. Dropbox has clearly recovered from that stumble and pulled off a successful IPO, but the episode underscores the challenge of maintaining not only security, but also the user trust and political capital needed to actually get the business going.

In the end, that may come down to the trust of individual users. A large portion of Dropbox’s 11 million paying customers are, or started off as, the typical consumer. Dropbox’s playbook is a familiar one, first getting consumer adoption and using that to slowly creep into teams that use the tool because it’s easier than existing ones. Those teams adopt it, leading to further adoption, to the point that Dropbox in theory locks in a customer without having to pick up those direct partnerships or spend a ton of money on marketing. Should it stumble at step one, it would have a much steeper ramp to start acquiring the kind of enterprise companies that will help it build a much more robust business.

“We have this set of stated values in the company, and the number one value is literally, be worthy of trust,” Dropbox SVP of engineering, product, and design Quinton Clark said. “I have observed and experienced that the protection of our users is very deeply woven into the DNA of our company. This is why we encrypt the data at rest, in transit, and it’s why our user experience is designed to keep people on the path of keeping things secure by default. You see it in the tools we give admins and the events they look through. We’re very deeply committed to their privacy and security. We’ve never sold data, it’s not in our business model, it’s about the value people get in software.”

While Dropbox at its heart was born as a consumer company — and there are, indeed, hundreds of millions of consumers — it’s also morphed over time into one with an arm looking to crack big businesses. And now that it’s a public company, it will have more intense oversight from public investors who will be scrutinizing its every move and calibrating its valuation as a result. Dropbox, too, is moving onto its own infrastructure in order to improve its margins and show it can be an operationally efficient business. All this means that, if it’s going to be a successful company, it has to ensure that snafus like Cambridge Analytica, which sent Facebook’s stock off a cliff, don’t happen.

Dropbox finishes up 36% on first day of trading, valuing company above $11 billion

Dropbox was off to the races on its first day as a public company.

After pricing above the range at $21 per share, raising $756 million, Dropbox kicked off its first day soaring to $31.60 before closing at $28.48 — up almost 36%.

It’s surely a sign of public investor enthusiasm for the cloud storage business, which had initially hoped to price its IPO between $16 and $18 and then raised that range to $18 to $20.

It also means that Dropbox closed well above the $10 billion it was valued at in its last private round. Its market cap is about $11.1 billion.

Dropbox brought in $1.1 billion in revenue for the last year. This compares to $845 million in revenue the year before and $604 million for 2015.

While it’s been cash flow positive since 2016, it is not yet profitable, having lost nearly $112 million last year. But that is a significant improvement over its losses of $210 million for 2016 and $326 million for 2015.

Its average revenue per paying user is $111.91.

There has been a debate about whether to value Dropbox, which has a freemium model, as a consumer company or an enterprise business. It has convinced just 11 million of its 500 million registered users to pay for its services.

Dropbox “combines the scale and virality of a consumer company with the recurring revenue of a software company,” said Bryan Schreier, a general partner at Sequoia Capital and board member at the company. He said that now was the time for Dropbox to list because “the business had reached a level of scale and also cash flow that warranted a public debut.”

He also talked about the early days of Dropbox pitching at a TechCrunch event in 2008 and how disappointed they were that the slides stopped working during the presentation. The company has come a long way.

Sequoia Capital owned 23.2% of the overall shares outstanding at the time of the IPO. They shared Dropbox’s original seed pitch from 2007. 

Accel was the next largest shareholder, owning 5% overall. Sameer Gandhi made the investment at Sequoia and then invested in Dropbox again when he went over to Accel.

Founder and CEO Drew Houston owned 25.3% of the company.

Greylock Partners also had a small stake. John Lilly, a general partner there, said he “invested in Dropbox because Drew and the team had an exceptionally clear vision of what the future of work would look like and built a product that would meet the demands of the modern workforce.”

The prospectus warned of the competitive landscape.

“The market for content collaboration platforms is competitive and rapidly changing. Certain features of our platform compete in the cloud storage market with products offered by Amazon, Apple, Google, and Microsoft, and in the content collaboration market with products offered by Atlassian, Google, and Microsoft. We compete with Box on a more limited basis in the cloud storage market for deployments by large enterprises.”

Note that they downplayed their competition with Box, a company that’s often mentioned in the same sentence as Dropbox. While the products are similar, the two have different business models, and Dropbox was hoping that this distinction would be rewarded with a better revenue multiple. If the first day is any indication, it looks like that strategy worked.

The company listed on the Nasdaq, under the ticker “DBX.”

We talked about Dropbox’s first day and the outlook for upcoming public debuts like Spotify on our “Equity” podcast episode below. We were joined by Eric Kim at Goodwater Capital.

Storytelling for B2B startups: Avoiding ‘buzzword bingo’ to make your wonky enterprise company worth talking about

If there’s one thing I learned from my time as both a journalist at The Wall Street Journal and Forbes and, now, advising a global venture capital firm on communications, it’s that storytelling can make or break a company.

This is especially true the more complicated and arcane a company’s technology is. Stories about online-dating and burrito-delivery apps are easily understood by most people. But if a company specializes in making technology for hybrid-cloud data centers, or parsing specialized IT alerts and cybersecurity warnings, the storytelling task becomes much harder — but, I would argue, even more important.

Sure, a wonky company will still be able to talk easily to its customers and chat up nerdy CIOs at trade shows. But what happens when they raise a Series C or D round of financing and actually need to reach a broader audience — like really big, potential business partners, potential acquirers, public investors or high-level business reporters? Often, they’re stuck.

It can be painful to watch. When I was a reporter, I was amazed at the buzzwords thrown at me by some technology companies trying to get me to write about them. For fun, my colleagues and I would put some of these terms into online “buzzword bingo” websites just to see what indecipherable company descriptions they would spit out. (Example: “An online, cloud-based, open-source hyperconverged Kubernetes solution.”) Often, when pressed, PR representatives couldn’t explain to me what these companies actually did.

These companies obviously never made it into my stories. And I would argue that many of them suffered more broadly from their overall lack of high-profile press coverage; large business publications like the ones for which I worked target the very big-company executives and investors these later-stage startups were trying to reach.

Now, of course, I’m on the other side of that reporter/company equation — and I often feel like a big chunk of my job is working as a technology translator.

A natural-born storyteller

So why is this B2B storytelling problem so common, and arguably getting worse? Lots of reasons. Many of these hard-to-understand companies are founded by highly technical engineers for whom storytelling is (not surprisingly) not a natural skill. In many cases, their marketing departments are purely data-driven, focused on demand generation, ROI and driving prospects to an online sales funnel — not branding and high-level communications. As marketing technology has gotten more and more advanced and specialized, so have marketing departments.

As a result, many B2B and enterprise-IT companies are often laser-focused on talking about their products’ specific bells and whistles, staying in “sell mode” for a technical audience and cranking out wonky whitepapers and often-boring product press releases. They’re less adept at taking a step back to address the actual business benefits their product enables. Increasingly, this tech-talk also plays well with the legions of hyper-specialized, tech-news websites that have proliferated to serve every corner of the technology market, making some executives think there’s no need to target higher-level press.

Everyone has a story to tell. It’s up to you to figure out what your company’s is, and how to tell that story in a compelling, understandable fashion.

One prominent marketing and PR consultant I know, who has worked with hundreds of Silicon Valley startups since the 1980s, says she is “shocked” by how poorly many senior tech industry CEOs today communicate their companies’ stories. Many tend to “shun” communications, considering it too “soft” in this new era of data-obsessed marketing, the consultant, Jennifer Jones, recently told me. But in the end, poor communications and storytelling can create or exacerbate business problems, and often affect a company’s valuation.

So how do you get to a point where you can talk about your company in plain terms, and reach the high-level audiences you’re targeting?

One tactic, obviously, is to ditch the jargon when you need to. The pitch you use on potential customers — who likely already have an intimate understanding of your market and the specific problems you’re trying to solve — is not as relevant for other audiences.

A big fund manager at Fidelity or T. Rowe Price, or a national business journalist, probably knows, for example, that cloud computing is a big trend now, or that companies are buying more technology to battle complex cybersecurity attacks. But do they really understand the intricacies of “hybrid-cloud” data center setups? Or what a “behavioral attack detection solution” does? Probably not.

The David versus Goliath angle

Another tip is to put your company story in a larger, thematic context. People can better understand what you do if you can explain how you fit into larger technology and societal trends. These might include the rise of free, open-source software, or the growing importance of mobile computing.

It’s also helpful to talk about what you do in relation to larger, more established players. Are you nipping away at the slow-growing, legacy business of Oracle/EMC/Dell/Cisco? As a journalist, I once wrote a story about a small public networking company called F5 Networks that specialized in making “application delivery controllers.” But the story mostly focused on F5’s battle with a much larger competitor; in fact, the editors titled the story “One-Upping Cisco.” That’s the angle most readers were likely to care about. Journalists, particularly, love these David versus Goliath type stories, and national business publications are full of them.

Start focusing on high-level storytelling earlier, not when you’ve already raised $100 million in venture funding and have several hundred employees.

Another key storytelling strategy is leveraging your customers. If your business is boring to the average person, try to get one of your household-name customers to talk publicly about how they use your technology. Does your supply-chain software help L’Oréal sell more lipstick, or UPS make faster package deliveries?

One of our portfolio companies had a nice business-press hit a few years ago by talking about how their software helped HBO stream “Game of Thrones” episodes. (The service had previously crashed because too many people were trying to watch the show.) You can leverage these highly visible customers for case studies on your website. These can be great fodder for your sales team as well as later press interviews, as long as they’re well-written and understandable. Try to get more customers to agree to this type of content when you sign the contract with them.

From “Mad Men” to math men

Finally, there’s the issue of marketing leadership inside tech companies. In my experience, most smaller, B2B or enterprise IT-focused startups have CMOs or VPs of marketing who are more focused on data and analytics than brand communications — more “math men” than “Mad Men.” This isn’t surprising, as these companies often sell data-rich products and have business models where PR and general advertising don’t directly drive sales (unlike, say, a company making a food-delivery app). The CEOs of these companies value data and analytics, too.

I encourage B2B tech CEOs to focus on hiring CMOs with some brand/communications experience, or at least a willingness to outsource it to competent partners who are experts in that area. After a couple of early rounds of funding, you should be outgrowing your highly specialized PR firm (if you even have one) that focuses on a narrow band of trade publications, for example. These firms usually don’t have contacts at the bigger, national business and technology outlets that are read by big mutual fund managers and the business development folks at Cisco or Oracle. Hiring ex-journalists — not technical experts — to write content and develop messaging can be a good idea, too.

In other words, start focusing on high-level storytelling earlier, not when you’ve already raised $100 million in venture funding and have several hundred employees. By that point, it can simply be too late: Your company has already been typecast by the trade press and written off by higher-level reporters, and sometimes even potential business partners, as too niche-y and hard to understand.

As a journalist, I learned that everyone has a story to tell. It’s up to you to figure out what your company’s is, and how to tell that story in a compelling, understandable fashion. If you do, I’m pretty sure the business benefits will follow.