Going serverless? Rethink your data security approach

Sid Dutta, Sr. Data Security Products Executive, CyberRes
Usman Shafique, Software Engineer, Security R&D, CyberRes

Serverless and function as a service (FaaS) are hot terms in the software architecture world these days. All three major cloud service providers, or CSPs (Amazon, Microsoft, and Google), are heavily invested in serverless.

But what exactly constitutes serverless computing? Also, what is the difference between serverless and FaaS? Most importantly, how secure is serverless, and is your data safe within this architecture?

Here's what you need to understand about serverless and FaaS—and how to keep your data secure in these environments.

What is serverless?

Serverless is a cloud-native development model that enables enterprises to build applications faster by eliminating the need to manage infrastructure. With serverless applications, the CSP automatically provisions, scales, and manages the infrastructure required to run the code. The term serverless computing can be misleading: it doesn't mean computing without a computer. Rather, it means cloud users can consume computing power without explicitly allocating servers.

The tasks associated with infrastructure provisioning and management are abstracted away from the cloud user. The CSP handles the routine work of provisioning, maintaining, and scaling the server infrastructure, thereby enabling cloud users to focus only on their application's business logic. Serverless computing allows CSPs to monetize unused infrastructure while also enabling cloud users to quickly run code without paying for a provisioned VM allocated just for them. It's a win for the CSP and the customers.

Once deployed, serverless apps respond to demand and automatically scale up and down as needed. FaaS is metered on demand: you are charged while your code executes, and once execution completes, the charges stop.

An overview of serverless architecture

Under a serverless model, a cloud provider runs servers and dynamically allocates their resources to users who want to deploy code. Serverless computing typically falls into two groups, back end as a service (BaaS) and FaaS. BaaS gives cloud users access to various services and automates and manages the back-end side of a web or mobile application.

A cloud provider may offer authentication and encryption services, cloud-accessible databases, and usage data. With BaaS, serverless functions are called through application programming interfaces (APIs). More commonly, when cloud users refer to serverless, they're talking about a FaaS model.

FaaS is a platform that lets you run self-contained functions (code snippets) in the cloud. It's an event-driven computing execution model where developers write application logic that is deployed in containers fully managed by a platform, then executed on demand. FaaS is trigger-based and runs dynamically in response to an event.

The functions are (and should be) designed to do a single piece of work, making them lightweight and quickly executable. In contrast to BaaS, FaaS offers a greater degree of control to users, who create custom apps rather than relying on a library of prewritten services. The containers where these code snippets are deployed are:

  • Stateless, making data integration simpler
  • Ephemeral, allowing them to run for a short time
  • Event-triggered, so they can run automatically when needed
  • Fully managed by a CSP, so that you only pay for what is needed, not for always-on apps and servers
  • Inherently elastic, scaling automatically without needing to set up auto-scaling groups, etc.

In a FaaS environment, apps are launched only as needed. When an event triggers app code to run, the CSP dynamically allocates resources for that code. The user stops paying when the code finishes executing. In addition to the cost and efficiency benefits, serverless frees cloud users from routine tasks such as server provisioning, managing the operating system and file system, installing security patches, load balancing, capacity management, scaling, logging, and monitoring.
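
To make the model concrete, here is a minimal sketch of what such a function can look like, written in Python as an AWS Lambda-style handler. The event shape and the order-total logic are illustrative assumptions, not any particular platform's API.

    import json

    def handler(event, context):
        # The platform invokes this function only when a trigger fires
        # (an API call, a queue message, a file upload, etc.).
        # 'event' carries the trigger payload; 'context' carries runtime metadata.
        order = json.loads(event.get("body", "{}"))

        # Do one small, stateless piece of work and return.
        total = sum(item.get("price", 0) for item in order.get("items", []))

        return {
            "statusCode": 200,
            "body": json.dumps({"orderTotal": total}),
        }

Because the container is stateless and ephemeral, nothing computed here survives the invocation; anything that must persist has to be written to an external store.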

Why is serverless popular, and what is its value proposition?

The breakdown above of what serverless and FaaS are gives a clear picture of the advantages they provide to organizations:

  • Cost: You only pay for the service you use. No up-front billing costs to provision resources, and no up-front capacity planning required. Serverless offerings from CSPs are usually metered on demand through an event-driven execution model. As a result, when a serverless function is sitting idle, it doesn't cost anything.
  • Inherently scalable: Scaling is effortless and managed by CSPs.
  • Simplified code: With FaaS, you have the option to upload one function at a time or your entire application all at once.
  • Faster time to market: FaaS supports iterative development, getting applications up and running sooner, and allows you to make modifications more easily.

The security advantages of FaaS

There are also several security-related benefits to be gained from the FaaS model:

  • No need to manage OS patches: When using FaaS, the underlying platform handles the servers for you, offloading the need to provision, manage, and monitor all the running servers. Because the platform owns those servers, it also takes on the responsibility for patching them. However, note that CSPs patch vulnerabilities on their own schedule, so users should be aware of the environment and conditions their code is executing under.
  • Short-lived execution: One of the key requirements to run your code in a FaaS environment is that your function needs to be stateless and short-lived. In a FaaS environment, you don't know which server is assigned to run your function. The platform provisions and de-provisions servers as it sees fit, destroying them immediately after the function has executed. In most attacks, a vulnerability is first found and exploited, and malicious software is then installed on the target machine.

    Malicious software needs to work quietly to gain as much information as possible, which takes time. A serverless platform doesn't give attackers that luxury, since most FaaS environments enforce a short limit on how long a spun-up environment stays up. By constantly resetting, a serverless environment eliminates any compromised server. Stateless, short-lived systems, including all FaaS functions, are therefore inherently less likely to be compromised at any given point in time.
  • Resistance to denial-of-service (DoS) attacks: The same scalability that helps handle legitimate demand can also cope with its malicious equivalent. Attackers often try to take down systems by submitting a large volume of memory-intensive requests, saturating server capacity and keeping legitimate users from using the application. These DoS attacks are naturally thwarted by the (presumed) infinite capacity serverless offers: more requests, whether good or bad, simply make the platform provision more servers. However, there is a cost associated with all of those executions, so such activity should still be monitored.
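
One lightweight control along these lines is to cap how far a single function is allowed to scale, so that a flood of malicious invocations translates into throttled requests rather than an unbounded bill. Below is a sketch using AWS's boto3 SDK; the function name is a placeholder, and the other CSPs offer equivalent quota controls.

    import boto3

    lambda_client = boto3.client("lambda")

    # Reserve a fixed amount of concurrency for one function. Requests beyond
    # this limit are throttled instead of provisioning ever more capacity,
    # which bounds the cost impact of a request flood.
    lambda_client.put_function_concurrency(
        FunctionName="order-processor",      # placeholder function name
        ReservedConcurrentExecutions=100,    # tune to expected legitimate load
    )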

Downsides and security concerns of FaaS

Although a serverless environment provides a huge upside, the downsides to using serverless/FaaS need to be considered before going all-in on that model. Some general downsides of serverless/FaaS environments are:

  • Vendor lock-in: Building your application on a FaaS platform may make you reliant on that vendor and make it difficult to switch providers later.
  • Cold starts: FaaS execution environments are not kept on standby, so there is sometimes a delay before a function starts executing, which can adversely impact your application.
  • Short-lived: FaaS is designed to scale up and down in response to workload, which provides excellent cost savings. However, it is not meant for long-running processes, and therefore, the same cost advantages do not apply.
  • Security issues: FaaS environments across the three major CSPs continue to be targets of security attacks. Most recently, Azure Functions was exploited, allowing the attacker to escalate privileges and escape the Docker container running the code to the Docker host.

FaaS environments are also not without their security weak points. As mentioned, FaaS frees organizations from OS patching. Since the OS is unreachable, attackers will shift their attention to the areas that remain exposed, with the application itself being the primary target. With this approach in mind, here are some data-centric security concerns to consider when moving to a serverless environment:

  • Expanded attack surface: Serverless functions consume data from various event sources such as APIs, message queues, cloud storage, etc. The attack surface of your software environment consists of all the points through which an unauthorized user can enter or extract data. Serverless systems are composed of a large number of components. New entry points for malicious and unauthorized users are added with each new tool, service, or platform integrated into the ecosystem. Every time your architecture is scaled and condensed, the attack surface changes. Each of these different source types can contain untrusted or attacker-controlled input.
  • Stateless server security: Even though functions are stateless, an app's logic often requires data. In a stateful application, such information stays on the machine handling the request, sometimes even staying in memory and off disk. In a stateless function, however, external storage is used to persist it across calls. The performance implications of not having the data on the same machine are usually small, but storing sensitive data outside the server has significant security implications. The data is at risk at two points: while it is being transferred, and if the data store itself is compromised. Simply put, data stored outside the machine is at higher risk than data stored within it.
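
One way to reduce that exposure is to encrypt sensitive fields inside the function, before they ever leave it for external storage, so the data is protected both in transit and in the store itself. The sketch below uses the Python cryptography package's AES-GCM primitive; the save_to_store helper and the way the key is obtained are placeholders for whatever data store and key management service you actually use.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def protect_and_store(record: dict, key: bytes) -> None:
        # Encrypt the sensitive field before it leaves the function.
        aesgcm = AESGCM(key)            # in practice, fetch the key from your KMS
        nonce = os.urandom(12)          # unique nonce per encryption
        ciphertext = aesgcm.encrypt(nonce, record["ssn"].encode(), None)

        # Persist only the protected value.
        record["ssn"] = (nonce + ciphertext).hex()
        save_to_store(record)

    def save_to_store(record: dict) -> None:
        # Placeholder: write to a cloud database, object store, queue, etc.
        print("persisting protected record:", record)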

How to mitigate risk in FaaS environments, and why data-centric security is key

In the early days, security was a big reason organizations were reluctant to move to the cloud. Over time, it became apparent that CSPs can do a better job when it comes to securing infrastructure. However, the security model that has evolved for cloud services is a shared one: under this "shared responsibility model," the cloud service provider is responsible for the security of the cloud, while customers are responsible for security in the cloud and for the data they put in it.

Given this shared responsibility model, the security challenges mentioned above, and the growing importance of data security, how do enterprises ensure the security of their data across a hybrid approach spanning multiple CSPs? Enterprises should have a cloud data security strategy in place, preferably before any sensitive data is moved to the cloud, since under the shared responsibility model the customers themselves are ultimately responsible for their data's security.

Furthermore, since over 90% of enterprises follow a hybrid workload approach, an overall enterprise data security strategy is critical to securing data across all hosted platforms. That strategy should keep your data secure at rest, in transit, during migration, and in use.

At a minimum, such a strategy should meet the following criteria to avoid the data security limitations of FaaS environments:

  • Avoiding vendor lock-in: Select a solution that applies to all hosted locations. Specifically for FaaS environments, this means keeping your data protected while in use and/or in storage using key management and crypto services agnostic to any platform.
  • Control of encryption keys: Enterprises should have complete control of the encryption keys used to encrypt their data.
  • Stateless/platform-agnostic security services: The crypto and key management services should be usable across all platforms, regions, etc.
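
One pattern that satisfies these criteria is envelope encryption: each record is encrypted with a fresh data key, and the data key is in turn wrapped with a key-encryption key that the enterprise controls and that is not tied to any one CSP. The sketch below uses the Python cryptography package to show the shape of the pattern; in practice the key-encryption key would live in your own key management service rather than being passed around as raw bytes.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def envelope_encrypt(plaintext: bytes, key_encryption_key: bytes) -> dict:
        # 1. Generate a one-time data key and encrypt the payload with it.
        data_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)

        # 2. Wrap the data key with the enterprise-controlled key-encryption key.
        #    The wrapped key travels with the ciphertext, so the record can move
        #    between CSPs or regions without the data itself being re-encrypted.
        wrap_nonce = os.urandom(12)
        wrapped_key = AESGCM(key_encryption_key).encrypt(wrap_nonce, data_key, None)

        return {
            "ciphertext": nonce + ciphertext,
            "wrapped_key": wrap_nonce + wrapped_key,
        }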

The best way to mitigate the risk of sensitive data being compromised is to ensure the data is protected with industry-standard encryption over its entire lifecycle, i.e., from the moment it is captured. Look for integrations with native cloud services on AWS, Azure, and Google Cloud Platform to run data protection functions within serverless compute (FaaS). And demand full support for both hybrid and multi-cloud implementations.

Plus, look for solutions that tackle these issues:

  • Stateless key management: Scalable, portable, and secure key management service, allowing enterprises full control over the data encryption keys. Keys generated should work seamlessly across any platform (whether on premises or in the cloud), therefore not requiring data to be decrypted when migrating across different CSPs, regions, etc.
  • Data protected at point of ingestion: The solution should let enterprises run secure workloads within FaaS environments, so they can protect their data at the point of ingestion, thereby limiting any new security concerns that may arise as part of the data migration.
  • Feature-rich crypto solutions: Opt for crypto solutions that don't just provide a variety of data protection formats, such as pseudonymization, anonymization techniques such as tokenization, format-preserving encryption (FPE), and format-preserving hashing, but also support business use cases on the protected data. With feature-rich, cloud-agnostic crypto services, enterprises can protect their data at the point of ingestion and still use the protected data for downstream use cases, including data analytics or other FaaS-based functions. FPE, for example, lets enterprises use protected data without requiring decryption.
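
To illustrate why format preservation matters, the toy sketch below substitutes the digits of a card number while leaving its length and grouping intact, so downstream schemas and analytics that expect a 16-digit value keep working. This is a deliberately simplified keyed substitution for illustration only, not real FPE; production systems should rely on a vetted implementation of NIST FF1/FF3 or a commercial data protection service.

    import hashlib
    import random

    def toy_format_preserving_protect(value: str, key: bytes) -> str:
        # Derive a keyed permutation of the digits 0-9 (toy construction only;
        # real FPE uses NIST FF1/FF3 built on a block cipher).
        seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
        digits = list("0123456789")
        random.Random(seed).shuffle(digits)
        mapping = {str(i): digits[i] for i in range(10)}

        # Substitute digit for digit, leaving separators and length untouched,
        # so the protected value still looks like a card number.
        return "".join(mapping.get(ch, ch) for ch in value)

    print(toy_format_preserving_protect("4111-1111-1111-1111", b"enterprise-key"))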
