The U.S. Government’s long road to adopting the cloud

The seven-year evolution of the government cloud, from “Cloud First” to FedRAMP Tailored.
Part of Issue 2, July 2017: Cloud

On December 9th, 2010, U.S. Federal Chief Information Officer Vivek Kundra told his government peers that they would never work the same way again. Nearly two years after President Obama signed a pair of directives on his first day in office promising a new era of government transparency and disclosure, Kundra gave a presentation reinforcing a new “Cloud First” policy that sought to harness the increasingly powerful remote processing model to cut down bloat and increase efficiency. It pushed each agency to transition some services to the cloud within the year and authorized an exchange program to borrow Silicon Valley’s best talent.

This was one of the first unified, top-down pushes encouraging all federal agencies to begin transitioning some of their services to cloud computing. Not that they were ready to jump on the cloud bandwagon and start offloading their computation and data storage to commercial vendors: agencies large and small still had to take a long look at their needs to decide how to shift their infrastructure from in-house IT to external providers. Kundra’s vision was more aspirational than immediately instructive; many agencies are still working through this process nearly seven years later.

“It was an early signal of the ultimate direction, but it was a little too early for the government to embrace. There was a pretty steep learning curve ahead for the federal government,” said Dr. Rick Holgate, an analyst at Gartner and former Chief Information Officer (CIO) for the Bureau of Alcohol, Tobacco, Firearms and Explosives. “It was aspirational more than watershed because it didn’t necessarily make it easier for [the] federal government to move to the cloud.”

Government agencies had been contracting third-party vendors for cloud computing services for years, but which agency was the first to do so may be lost to time, and to semantics. What we know as cloud computing today has changed and evolved over the years, ever since vendors sold the first remote processing services to government agencies.

“The government’s been involved in clouds for eight years, but when you get beyond eight years or so, it becomes a whole taxonomy discussion about what cloud is,” said Shawn P. McCarthy, Research Director at analysis firm IDC.

Cloud computing is still a young field, but its terminology has somewhat solidified. The definition, as finalized in 2011 by the National Institute of Standards and Technology (NIST) after 15 prior versions, is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

Today, government agencies contract external vendors for cloud computing solutions for the same reasons enterprise clients do: to entrust critical systems to providers who will maintain, modernize, scale, and reliably keep them online. There are a few more hoops to jump through if a company wants to sell its service to a government agency, and the hoops differ depending on whether the company is selling at the federal, state, or local level. But so long as they meet the requirements and win authorization, whether from a single agency or against a broad security baseline, vendors can (theoretically) pitch their products to agencies: single Software as a Service (SaaS) solutions, Platform as a Service (PaaS) offerings, all the way up to Infrastructure as a Service (IaaS) that agencies build entire systems on.

The potential for a cloud computing company to open up its business to the government market is enticing, but meeting the government’s security requirements can be a lengthy process. Is jumping through all these regulatory hoops worth it to businesses? Usually yes, McCarthy said, though earning the coveted authority to operate (ATO) doesn’t guarantee a company a single customer.

“A lot of people do it because, by going through the approval process and getting [General Services Administration] scheduled, you are listed as a serious player on the federal market. That said, government buyers look at price points and whether the product really meets their requirements—they could have tens to hundreds of boxes to check. They may never need the product you went and got approved,” said McCarthy. “On the flip side, if you have a wonderful solution, you have access to thousands of potential new clients. It’s not a magic bullet to get GSA scheduled, but for most people, we do recommend they jump through the hoops—but keep an open mind.”

From FISMA to FedRAMP: A decade and a half of requirements and acronyms

Where cloud computing started is fodder for another piece, but one of the earliest vendors to start selling to the government was RightNow, founded by Greg Gianforte (yes, the same newly elected Representative from Montana). By the recollection of one of the company’s managers, RightNow started hosting clients’ computing needs on its own servers shortly after 2000, in some of the nascent moments of cloud computing.

Then, at the tail end of 2002, Congress passed landmark legislation that codified digital security practices for government agencies’ IT setups. The E-Government Act of 2002 established the Office of Electronic Government under the Office of Management and Budget, along with an administrator to head it, a role now known as the Federal Chief Information Officer of the United States. Vivek Kundra, appointed in 2009, was the first to carry the Federal CIO title, and he held the position until 2011.

The E-Government Act had other provisions, but the most germane to cloud computing was the Federal Information Security Management Act of 2002 (FISMA), which required each agency to develop its own information security protocols according to how much risk it deemed acceptable. Agencies, or the vendors they contracted to build their systems, had to ensure that their services were FISMA-compliant, or, more specifically, compliant with the relevant Federal Information Processing Standards (FIPS). FIPS Publications 199 and 200, issued by NIST over the next few years, are two of the mandatory standards: in short, they define how a government body categorizes its systems by risk, with increasing levels of severity, and which security requirements vendors must then meet.
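To make that categorization step concrete, here is a minimal sketch, in Python and purely for illustration (the function and the example system are hypothetical, not an official tool), of the FIPS 199 “high water mark” rule: a system’s overall impact level is the highest of its confidentiality, integrity, and availability ratings, and that level in turn determines how stringent a set of controls the system and its vendors must meet.

```python
# Illustrative only: a simplified reading of FIPS 199, not an official tool.
# Each security objective (confidentiality, integrity, availability) gets an
# impact rating; the system's overall category is the "high water mark,"
# i.e., the highest rating across the three objectives.

IMPACT_ORDER = {"LOW": 0, "MODERATE": 1, "HIGH": 2}

def high_water_mark(confidentiality: str, integrity: str, availability: str) -> str:
    """Return the overall FIPS 199 impact level for a system."""
    ratings = (confidentiality, integrity, availability)
    return max(ratings, key=lambda level: IMPACT_ORDER[level])

# A hypothetical public-facing agency site: its data is already public
# (LOW confidentiality), but it must not be tampered with or go dark
# (MODERATE integrity and availability), so it categorizes as MODERATE
# and would be matched to the corresponding control baseline.
print(high_water_mark("LOW", "MODERATE", "MODERATE"))  # -> MODERATE
```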

As defined, these were requirements each department set for its own IT systems, which, through most of the 2000s, mostly ran in on-premises data centers. But cloud services were gaining traction in the enterprise sphere. Nearly two years after Obama was inaugurated, the Office of Electronic Government gave agencies an official nudge into the cloud computing game. In December 2010, Federal CIO Kundra presented the “25 Point Implementation Plan to Reform Federal IT Management,” which set bold goals to reduce tech infrastructure bloat, a problem that migrating operations to the cloud could partially solve. It challenged government bodies to cut or turn around a third of their underperforming projects within 18 months and, under a new “Cloud First” mentality, shift at least one system to the cloud in the next year. Meanwhile, the plan proposed that the Office of Management and Budget oversee the closure of 800 federal data centers government-wide by 2015. Soon, the government had a bona fide cloud computing strategy.

At the time, this was still somewhat early, a goal-setting aspiration rather than a comprehensive plan to organize existing activity. Agencies weren’t ready to immediately sign their operations over to a vendor: Logistically, there was plenty of work ahead to vet vendors, plan migration strategies, and prioritize which systems should make the jump first. By the next year, agencies had started following the IT reform plan in their own backyards. The Department of Defense had closed eight data centers, with 44 total slated to sunset by the OMB’s target date in 2015. The DoD had also started investing in Silicon Valley’s expertise by introducing its IT Exchange Program, borrowing professionals for three-month to year-long stints to learn industry best practices.

As 2011 came to a close, Kundra left his position as federal CIO, and his successor Steven VanRoekel introduced the second major security framework, the Federal Risk and Authorization Management Program (FedRAMP), which operates under the General Services Administration (GSA). While agencies had set their own IT security assessment methodologies following FISMA’s loose guidance, FedRAMP outlined security protocols specifically for agencies engaging in cloud services.

FedRAMP mandates that agencies, and any cloud service providers (CSPs) that want to sell services to them, build on the catalogue of security controls in NIST SP 800-53. Ergo, CSPs would now have to be “FedRAMP compliant,” an authorization that, once granted by the FedRAMP board, qualifies a cloud product for theoretically any agency, rather than getting FISMA-authorized on an agency-by-agency basis. This was a huge leap, made possible only because IT security is a universal concept. For the first time in government history, every agency had been made to abide by one set of security protocols.

CSPs aspiring to FedRAMP approval submit their products to a review board known as the Joint Authorization Board (JAB), made up of CIOs from the DHS, the DoD, and the GSA, which examines every potential service. Given FedRAMP’s small staff, it only authorizes 12 to 14 CSP services per year. But there is an alternative that is often faster: CSPs can have agencies themselves run through the same vetting process, and once that authorization is reviewed, the service receives the same FedRAMP seal of approval. About two-thirds of the 86 FedRAMP-approved services were authorized through agencies.

Companies that have earned FedRAMP compliance have their products listed in an online catalogue, a site that also tracks which requests for authorization are still under review. Notably, FedRAMP only applies to unclassified data: Agencies dealing with classified data, including those in the intelligence community, still retain their own, more secretive security protocols.

But the need for cloud services that traffic unclassified data is huge. The not-so-secret secret of government bodies’ cloud computing requirements is that they need many of the same things that commercial businesses do, and for the same reasons.

“Agencies tend to prefer an enterprise solution when they can, when it makes sense,” said McCarthy.

Offloading the responsibility for developing and maintaining IT to a cloud computing provider has the same appeal to a government client as to an enterprise one: flexible, customized solutions that can scale. But as with commercial clients, there’s no singular “government cloud” that agencies work within, and each FedRAMP-compliant vendor offers very different solutions.

“It really is an ecosystem of providers that are different from last year or even two years ago and that provide different levels of cloud service,” said Bryna Dash, Director of Cloud Services at IBM.

Smaller, more agile providers had been selling cloud services to the government for years, but the larger hosting providers were some of the first to be officially welcomed into the FedRAMP pantheon. Around 2013, the still-young Amazon Web Services became the first of them to clear FedRAMP’s NIST-based security requirements, McCarthy reckoned, and that sent a signal that moving to the cloud was worthwhile.

“When AWS became a robust, respected platform for government computing and nailed down its very stringent requirements, that sent a message to other agencies that cloud is here in a way that’s highly secure and tends to be cheaper,” said McCarthy. “Suddenly you have agencies with the most stringent security requirements you can imagine, and they’re suddenly getting what they need in AWS.”

The turning point: 2013

Amazon’s FedRAMP authorization was one of the main reasons analysts deem 2013 a real watershed year for the government’s involvement in cloud computing. Shortly thereafter, Amazon’s product dedicated to the government cloud began working with the federal intelligence community, and meeting its strict security requirements with a cloud product was an accomplishment in itself.

Amazon barely beat Microsoft in the race to pass government regulations. Others followed, including IBM, which was officially cleared to sell cloud computing services to government bodies in November 2013. By the next year, IBM had opened the first of several data centers built specifically for use by government agencies; dedicated facilities physically isolated from consumer and enterprise data are a potential requirement for some agencies.

The additional resources Amazon and Microsoft have invested in their government cloud offerings have likely given them an edge when competing for contracts: Back in early 2013, despite a competitive bid by IBM, which offered a less expensive solution, the CIA chose Amazon to build its cloud infrastructure because Amazon’s bid offered a “superior technical solution.” In this sense, what Amazon and Microsoft can build for agency clientele, thanks to their extensive investment, gives them an advantage. While the two are the dominant operators in the government cloud, they aren’t alone: between them they account for just nine of the 86 FedRAMP-approved offerings in the GSA catalogue. (You might wonder where Google is in all this: despite getting on board with Federal CIO Kundra’s attempt to launch the app marketplace Apps.gov back in 2009 and applying for FISMA approvals, Google’s portion of the government cloud computing market is a “distant third,” according to Gartner analyst Holgate.)

As befits a cybersecurity landscape that continues to evolve, the government’s cloud computing vendor requirements are also changing. FISMA, for example, was amended in 2012 and modernized in 2014. Whenever NIST updates SP 800-53, FedRAMP updates along with it. And security requirements haven’t just evolved; they’ve expanded. FedRAMP launched with Low and Moderate security categories, which require providers to satisfy 125 and 326 controls, respectively. The Department of Defense even has its own additional set of security controls, which it began transitioning from its in-house protocols to a FedRAMP-based authorization called FedRAMP Plus, launched in the middle of 2016 (though the DoD, not the JAB, still oversees these protocols).

A year ago, FedRAMP finally released its High security category, which requires CSPs to satisfy 421 security controls, an authorization that at last permitted commercial vendors to serve agencies looking to handle particularly sensitive data. Unsurprisingly, Amazon, Microsoft, and CSRA received approval to operate at the High level in June 2016, and they remain the only three companies with products in the FedRAMP catalogue at that level (though more are waiting in the FedRAMP review queue).
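For quick reference, the sketch below simply restates the control counts cited in this piece as a lookup table. It is illustrative only, based on the figures reported here circa 2016-2017, not data pulled from FedRAMP itself, and the counts may change as NIST revises SP 800-53.

```python
# Control counts for each FedRAMP baseline as cited in this article
# (circa 2016-2017); illustrative only, and subject to change as NIST
# revises SP 800-53 and FedRAMP updates its baselines.
FEDRAMP_BASELINE_CONTROLS = {
    "Low": 125,
    "Moderate": 326,
    "High": 421,
}

def controls_required(baseline: str) -> int:
    """Return the number of controls a CSP must satisfy at a given baseline."""
    return FEDRAMP_BASELINE_CONTROLS[baseline]

print(controls_required("High"))  # -> 421
```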

Despite the currently small number of providers and the extensive vetting required for approval, the government cloud is growing. Total “cloud spend” was expected to reach $37.1 billion in 2016, according to an IDC estimate; by 2020, the analysis firm forecasts, spending on cloud services will almost equal budget expenditures on traditional IT.

This year’s forecast of government spending on cloud computing, however, marked a surprising decrease from 2016: initial IDC estimates anticipate a 16 percent drop in budgeted expenditure for cloud solutions. McCarthy believes this is because new projects are already being developed on the cloud, while the easiest legacy systems to transition, like email, storage, and websites, have already made the move; the remaining systems are migrated on a case-by-case basis. But he doesn’t believe this necessarily spells an end to growth in government cloud spending.

“I consider that a temporary blip,” said McCarthy. “So, 16 percent budgeted less than last year—and in reality, probably less than that. Because the fiscal year ends in September, we won’t hear the real numbers until October or later.”

And then there are the cloud computing applications that haven’t been conceived yet.

“My perspective is that we’re still in the early phases of cloud adoption in general. The ability to take advantage of technologies that are forward-thinking, to take advantage of the cloud, plus the things we don’t know about that will emerge in two or three or five years—the long-term benefits of cloud adoption,” said Greg Souchack, IBM Federal Partner for Cloud and Managed Services.

Over the last few years, another concept has entered the government cloud conversation: open standards. It wasn’t always part of the discussion: after all, it’s easier to retain customers if your solution is proprietary and inflexible. But the tide has shifted. Open standards proponents like those at IBM champion the practice not just as unshackling clients from vendor dependency, but as embracing the free flow of information.

“Point is, if they move to another cloud provider, they aren’t reinventing the wheel,” said Lisa Meyer, Federal Communications Lead at IBM.

“If you build on one cloud provider, you should be able to move between cloud providers and different vendors, shift to where innovation is and, quite frankly, to the right price point,” agreed IBM’s Dash. “That maintains competition.”

But to get more competition, FedRAMP will need to move faster in approving requests. The small-staffed FedRAMP review board has typically taken 12 to 18 months to authorize a cloud service, and 67 CSPs are still waiting in the review queue. FedRAMP isn’t ignoring its slow approval process, and it has worked on (relatively) speeding things up: In the last round, FedRAMP required all applicants to submit by December 30th, 2016, announced those who’d earned review in late February 2017, and expects to confirm the qualified services within four to five months.

FedRAMP has also floated the idea of a new classification, FedRAMP Tailored, which would ease control requirements on a case-by-case basis for proposed systems deemed low-risk and cheap to implement: services like collaboration tools, project management, and open-source development. Once approved for consideration by the Joint Authorization Board, these low-risk services could be given an ATO in just over a month. Despite the lighter requirements, services approved under Tailored would still be granted FedRAMP’s pan-agency seal of approval, allowing the CSP to sell them to other government bodies.

Which is a good thing, since the government cloud’s appetite is increasing. Some 112 agencies are currently using FedRAMP-compliant cloud services in a mix of PaaS, IaaS, and SaaS products. And over 85 percent of the CSP products awaiting JAB review are SaaS, which will logically sit atop the IaaS and PaaS foundations that agencies have been building on for years.

Back in June, the White House held a summit on modernizing the government’s technology. Ahead of it, the administration’s director of strategic initiatives, Chris Liddell, noted that only “three to four percent” of government operations are currently on the cloud. Liddell, the former chief financial officer at Microsoft, said that the White House’s goal for the summit was to cultivate a “government tech” industry in the private sector. Whether that nudges disparate companies into a coalesced niche more visible to the main cloud market, and to the public at large, is anyone’s guess.

About the author

David J. Lumb is a New York-based journalist covering the intersection of technology, business, and culture. He is a mobile editor at TechRadar and has written for Engadget, Fast Company, Playboy, and Popular Mechanics in addition to Increment.

@OutOnALumb
