Securing GenAI: The Crucial Role of Security in Cloud Services


Posted by Bryan Wise

The advent of generative AI (GenAI) is contributing to a boom in cloud computing services — an industry that could reach $679 billion, a 20% jump from 2023, according to Gartner. The interplay between the cloud and GenAI is complex, and it presents businesses with new opportunities and challenges.

With 80% of enterprises expected to be using GenAI applications by 2026, it’s important that business leaders understand how GenAI and the cloud interact so they can ensure their data is secure, their AI output is trustworthy, and their security and compliance efforts evolve apace with this quickly changing field.

Why Cloud Services Are Increasingly Important

GenAI is accelerating the shift to cloud-centric infrastructure because it demands substantial computational resources for training and inference. Without the cloud, organizations would need to make large upfront investments in hardware and infrastructure.

Cloud services offer storage solutions for managing extensive datasets, AI services and APIs for integrating generative capabilities into applications, and cost-effective pricing models, making cloud computing indispensable for harnessing the full potential of GenAI across various domains and industries.

In addition, GenAI requires scalable infrastructure, since computational needs vary greatly throughout the training and use cycles. Cloud computing provides the flexibility to provision resources on-demand, ensuring organizations can efficiently train and deploy GenAI without the constraints of local hardware.

How AI and the Cloud Complement Each Other

GenAI and the cloud have a mutually beneficial relationship. When paired together, they enable new capabilities for content generation, personalization, resource optimization, and collaboration. 

Content generation: The most well-known and battle-tested use for GenAI is generating content — text, images, audio, and software code. This capability opens up countless possibilities in cloud applications, such as chatbots, content creation tools, data synthesis, and more. Cloud providers can offer these GenAI capabilities as services, allowing developers to integrate them into their applications without needing to develop or train their own models.

Personalization: GenAI can also personalize content and user experiences in cloud applications. It employs Large Language Models (LLMs), potentially in conjunction with Retrieval-Augmented Generation (RAG), to tailor experiences for end users. LLMs are trained on vast datasets and generate text-based responses, while RAG improves those responses by referencing external knowledge bases, keeping them accurate and relevant without retraining the model.
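
To make that pattern concrete, here is a minimal, illustrative sketch of the RAG flow described above. It is not any particular vendor's API: the retrieval step is a naive keyword match, and call_llm is a hypothetical stand-in for whichever hosted model endpoint your cloud provider exposes.

    # Minimal RAG sketch (illustrative only). Retrieval here is a naive
    # keyword-overlap ranking; call_llm is a hypothetical placeholder for a
    # hosted model API.

    def retrieve(query: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
        """Return the documents that share the most terms with the query."""
        query_terms = set(query.lower().split())
        ranked = sorted(
            knowledge_base,
            key=lambda doc: len(query_terms & set(doc.lower().split())),
            reverse=True,
        )
        return ranked[:top_k]

    def call_llm(prompt: str) -> str:
        # Placeholder: swap in your provider's chat/completions endpoint.
        return f"[model response grounded in a {len(prompt)}-character prompt]"

    def answer_with_rag(query: str, knowledge_base: list[str]) -> str:
        context = "\n".join(retrieve(query, knowledge_base))
        prompt = (
            "Answer the question using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}"
        )
        return call_llm(prompt)

In production, retrieval would typically query a vector database, and the knowledge base itself would sit behind access controls — which is where the security considerations below come in.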

Security First

As companies expand their reliance on the cloud, they face new vulnerabilities.

Here are some things to consider:

  • Understand your data and model training: Be aware of the privacy and security risks associated with data and model training when using LLMs and RAG techniques in the cloud. Robust encryption mechanisms are essential to protect sensitive data used for training the models, especially when it is stored or processed in cloud environments (a minimal client-side encryption sketch follows this list). Additionally, security measures such as access controls, monitoring, and auditing should be implemented to mitigate the risk of unauthorized access, data breaches, or misuse of AI-generated content.

  • Secure development practices: As with any development process, the Secure Software Development Lifecycle (SSDLC) should be part of building GenAI applications. This includes adhering to secure coding practices, conducting regular security assessments and code reviews, and keeping third-party libraries and dependencies up to date. Re-examine SSDLC practices frequently to ensure they account for the shifting GenAI landscape.

  • Network security: Secure network configurations and monitoring protect cloud-based GenAI applications from external threats. This includes implementing firewalls, intrusion detection and prevention systems (IDPS), and network segmentation to control and monitor traffic flow between different components of the application.

  • Compliance and governance: Companies must ensure that their use of GenAI in the cloud complies with relevant regulations and industry standards governing data privacy and security. This may include GDPR, HIPAA, CCPA, or industry-specific regulations. State laws may also apply if you’re processing data using AI. Establishing robust governance frameworks and conducting regular compliance audits can help ensure security and privacy requirements are met.

  • GenAI as safeguard: We hear a lot about how GenAI is making it easier for bad actors to breach security — for instance, by producing high volumes of contextual, convincing phishing emails — but defenders can put it to work as well. For example, it can increase efficiency in various phases of penetration testing, such as reconnaissance, discovery, and vulnerability analysis.
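
As a concrete illustration of the first point above, the sketch below encrypts training records client-side before they ever reach cloud storage. It assumes the open-source cryptography package; the bucket name and upload call are hypothetical placeholders for your provider's SDK, and in practice the key would live in a managed key-management or secrets service.

    # Illustrative sketch: encrypt sensitive training records client-side before
    # they reach cloud storage. Requires the third-party "cryptography" package
    # (pip install cryptography); the upload call is a hypothetical placeholder
    # for a cloud provider's storage SDK.

    from cryptography.fernet import Fernet

    def encrypt_record(record: bytes, key: bytes) -> bytes:
        """Return the ciphertext for a single training record."""
        return Fernet(key).encrypt(record)

    # In practice, generate and store the key in a managed KMS/secrets service,
    # never alongside the data or in source code.
    key = Fernet.generate_key()

    record = b'{"prompt": "example training record", "label": "example"}'
    ciphertext = encrypt_record(record, key)

    # upload_to_bucket("example-genai-training-bucket", ciphertext)  # provider-specific
    assert Fernet(key).decrypt(ciphertext) == record  # round-trip check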

Conclusion

As companies adapt to new technologies like GenAI, it is becoming clear that cloud services will be a foundational element of the tech stack for the long term. By addressing the security considerations above, leaders can mitigate risks and leverage the scalability, flexibility, and services of cloud platforms to unlock the full potential of GenAI. The field is changing fast, so security teams need to build processes for frequent review and iteration to keep up.

Contributors
Bryan Wise

Chief Information Officer, 6sense

Cloud Security

cloud security, Cloud Infrastructure, Artificial Intelligence / Machine Learning, data security, application security, content filtering, risk management, network security, governance risk & compliance


