Can Machines Dream of Secure Code? Let’s hack a Node.js and a React app

Description

Have you found yourself asking ChatGPT to help you write Fastify route logic, or letting GitHub Copilot auto-complete your React component code? Like all of us, you've jumped onto the GenAI bandwagon. But did you know that LLMs can hallucinate insecure code? We eagerly rush to adopt AI code generation tools, but what exactly do we trade off? Writing secure code is harder than it seems, and we humans get it wrong time and time again. Machines get it wrong too, and ChatGPT and GitHub Copilot are no exception. Let me show you how the machines fail to produce secure code. Whether it is Node.js, Python or React, they expose you to security vulnerabilities and introduce the need to secure your AI-generated code.
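As a taste of the kind of flaw the talk covers, here is a hypothetical sketch of the insecure pattern an AI assistant might produce when asked for a "look up a user" database query: user input concatenated straight into SQL. The function name and query are illustrative, not from any real generated output.

```javascript
// INSECURE pattern often seen in AI-generated code: user input is
// interpolated directly into the SQL string, enabling SQL injection.
function buildUserQuery(username) {
  return "SELECT * FROM users WHERE name = '" + username + "'";
}

// A classic injection payload changes the query's meaning so it
// matches every row instead of one user.
const payload = "' OR '1'='1";
console.log(buildUserQuery(payload));
// The injected OR clause now appears inside the query itself.

// The safer pattern is a parameterized query, e.g. with node-postgres:
//   client.query('SELECT * FROM users WHERE name = $1', [username])
```

The fix is not to sanitize strings by hand but to let the database driver bind parameters, so input can never be parsed as SQL.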

Session 🗣 Intermediate ⭐⭐ Track: FrontEnd (TypeScript, JavaScript, AngularJS, ReactJS, ....)

Security

GenAI

AppSec
