AI Commerce Supports a Composable Commerce Serverless Lambda Development Environment
Welcome to a comprehensive how-to article on AI Commerce's Serverless solution, which allows partners to develop their own logic and extensions on top of the AI Commerce core without having to modify the actual backend core code. This article is intended to be an in-depth guide: we'll explain the underlying architecture, why this model is useful, how to install and deploy, and what partners can do with it.
1. Background: Composable Commerce and why it matters
In the Composable Commerce model, each area (such as product management, customer management, order processing, search functionality, etc.) can be implemented as a separate “block” or microservice. These microservices communicate with each other through APIs. Instead of a traditional monolithic architecture, the Composable Commerce model offers:
- Flexibility: New features can be created faster when each part can be developed and tested independently.
- Extensibility: You can add third-party services (e.g. new payment solutions, search, or ERP integrations) more easily, without massive changes to the core system.
- Faster product development: Partners can develop and publish their own extensions independently, while AI Commerce manages security and infrastructure.
2. The core of the solution: AI Commerce's Serverless Lambda environment
We have created a separate serverless environment for partner use, located in the same VPC as the AI Commerce core code. This keeps data connections very fast, with no need to transfer data over the public internet between different cloud environments.
2.1 What does this mean for partners?
Access to expanded infrastructure
- Partners can deploy AWS Lambda-based microservices (functions) with direct, secure connectivity to AI Commerce's main backend (or its RDS database).
Self-managed code and deployment
- Partners receive their own AWS CLI keys that allow them to push their code directly into AI Commerce's managed environment, so they can flexibly develop and update their extensions without the AI Commerce team needing to intervene each time.
One GitHub repository per customer
- Each partner's customer (e.g. “Airbus” or “Boeing”) has its own GitHub repository: extensions-airbus, extensions-boeing, etc. The partner maintains and pushes their code there.
- AI Commerce takes care of the basic configuration and resource initialization, so the partner can focus solely on the business logic.
Shared domain and CloudFront
- AI Commerce configures the environment so that the end customer's domain (e.g. boeing.com) is fronted by CloudFront, from which requests are routed to API Gateway and on to the Lambda functions.
- This way, partner services appear as a unified part of the AI Commerce system while still operating as physically separate microservices.
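To make the request path above concrete: a Lambda function behind an API Gateway proxy integration receives the HTTP request as an event dictionary and returns a response dictionary. Below is a minimal sketch in Python (one of the supported runtimes); the handler name and response message are illustrative, not part of any AI Commerce API.

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    API Gateway delivers the HTTP request as `event` and expects a dict
    with `statusCode`, `headers`, and a string `body` in return.
    """
    path = event.get("path", "/")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"partner extension reached at {path}"}),
    }
```

Because the event is just a dictionary, handlers like this can be unit-tested locally before any deployment.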
3. Use cases: When does a partner need their own serverless backend?
- ERP integrations: If the customer already has an ERP system in use, AI Commerce's orders or product data can easily be synchronized to it via a custom Lambda function.
- Search and filter services: If AI Commerce's native search or search partners are not enough, the partner can bring in their own search engine (e.g. Elasticsearch, Solr, etc.) and replace or extend AI Commerce's search functions with it.
- New features: Partners can extend order, customer, or product processing with custom routes (webhooks, validations, additional logic) while maintaining full compatibility with core data structures.
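As a sketch of the webhook/validation use case above: assuming a Python runtime and a hypothetical order payload (the field names here are invented for illustration, not an AI Commerce schema), a validation webhook could look like this:

```python
import json

# Hypothetical required fields for an incoming order payload.
REQUIRED_FIELDS = {"order_id", "customer_id", "total"}

def validate_order(order: dict) -> list:
    """Return a list of validation errors (empty if the order is valid)."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - order.keys())]
    if "total" in order and not isinstance(order["total"], (int, float)):
        errors.append("total must be a number")
    return errors

def webhook_handler(event, context):
    """Webhook endpoint: validate the order JSON posted to this extension."""
    order = json.loads(event.get("body") or "{}")
    errors = validate_order(order)
    if errors:
        return {"statusCode": 400, "body": json.dumps({"errors": errors})}
    # At this point the order could be forwarded onward, e.g. to an ERP system.
    return {"statusCode": 200, "body": json.dumps({"status": "accepted"})}
```

The same pattern applies to any custom route: parse the event body, apply partner logic, and return an API Gateway-compatible response.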
4. Pricing and resource limitations
Memory limit for Lambda functions
- By default, we limit Lambda function memory to 2 GB (2048 MB). Larger needs require special pricing and a separate agreement with AI Commerce.
API Gateway usage limitations
- We can set limits on the number of calls (e.g. throttling, rate limit) to prevent overload. Higher call volumes are handled on a case-by-case basis.
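As an illustration only (the actual limits are agreed with AI Commerce, and the numbers below are placeholders), this kind of throttling can be expressed with the Serverless Framework's usage-plan settings for the REST API:

```yaml
provider:
  name: aws
  apiGateway:
    usagePlan:
      throttle:
        burstLimit: 200   # maximum request burst (placeholder value)
        rateLimit: 100    # steady-state requests per second (placeholder value)
```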
Partner's own AWS environment
- If the customer needs more extensive control or significantly larger capacity, a separate AWS account can be created as a “subuser”, which still has a direct VPC connection to AI Commerce.
5. Technical Implementation: Serverless Extensions in Practice
Below is an example project structure for a repository named extensions-clientName (replace clientName with your desired name). The purpose of the project is to provide one or more Lambda functions behind API Gateway.
5.1 Runtimes
We support the new AWS Lambda environment, which offers several different runtimes to choose from:
- .NET 8 (C#, F#, PowerShell)
- Java 21
- Node.js 22.x
- Python 3.13
- Ruby 3.3
- Amazon Linux 2023 (Go, Rust, C++, custom)
We also support other, older runtimes provided by AWS, such as Java 8 on Amazon Linux 2, Node.js 18.x, Python 3.9, etc. You can choose any of these runtimes in your serverless configuration.
5.2 Project folder structure
Typical structure, shown with a Java example (Node.js, Python, and .NET follow a similar pattern; only the build tools and directory names vary):
extensions-clientName/
├─ pom.xml           # Maven configuration (Java)
├─ serverless.yml    # Main Serverless Framework configuration
└─ src/
   └─ main/
      └─ java/
         └─ com/
            └─ clientName/
               └─ Handler.java   # Main logic
5.3 Installation and commissioning
Clone the repository
git clone https://github.com/petrosoft-fi/extensions-clientName.git
cd extensions-clientName
Install the necessary tools
- AWS CLI (configure your keys with the aws configure command).
- Node.js & npm (or Yarn).
- Serverless Framework globally: npm install -g serverless
- Maven, if you develop in Java:
# For Java projects
mvn clean package
- In other languages, the corresponding build process (npm, pip, dotnet build, etc.).
5.4 Edit serverless.yml
- Set the desired runtime (java21, nodejs22.x, etc.).
- Change package.artifact to point to the built file (e.g. for Java: target/extensions-clientName-1.0.0-shaded.jar).
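Putting the two settings above together with the Java example from earlier, the relevant parts of serverless.yml could look roughly like this (the function name, handler class, and endpoint path are illustrative):

```yaml
service: extensions-clientName

provider:
  name: aws
  runtime: java21          # or nodejs22.x, python3.13, etc.

package:
  artifact: target/extensions-clientName-1.0.0-shaded.jar

functions:
  api:
    handler: com.clientName.Handler
    events:
      - http:
          path: ext/your-endpoint
          method: get
```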
Deploy to a Serverless environment
serverless deploy
- This creates the Lambda function(s), API Gateway routes, and configures everything on AWS.
5.5 Test
- When the deployment completes, the command line will show the API Gateway endpoint.
- Send a test request with cURL:
curl https://example.com/ext/your-endpoint
- If everything is OK, you will receive an HTTP 200 response or another configured return code.
6. Why is this model useful?
Partners get the freedom to develop
- Because they have their own GitHub repos and AWS CLI permissions, partners can add new functionality and release it without the AI Commerce core team having to build every integration.
Security and management remain with AI Commerce team
- While partners can deploy code, AI Commerce retains full access and control over the infrastructure (VPC, IAM, RDS, etc.).
- Access rights are limited to only necessary resources.
Benefit from Composable Commerce architecture
- The partner can replace or extend AI Commerce 's features with independent microservices.
- The user experience remains consistent, because the Svelte-based user interface and AWS cloud services hide the internal distributed architecture.
Scalability
- Lambda functions automatically scale according to AWS's serverless model, so increasing traffic volumes from partners do not crash the core.
Cost-effectiveness
- You only pay for what you use: if the Lambda functions are not called, they incur no cost (apart from a small amount of storage, which is practically free).
7. Expansion and “overwriting”
Extensions: Partners can add new logic, such as webhook interfaces or entirely new routes, allowing them to bring features to AI Commerce that are not available in the core.
Override: In some cases, it is possible to bypass AI Commerce's native route, such as search or product filtering, by pointing the frontend directly at the partner's Lambda interface. This is the ultimate flexibility of the Composable Commerce model: the frontend chooses which microservice to use.
8. Recommendations for large projects: ERP and team integrations
A dedicated Lambda function for large projects
- If the project is large (such as an ERP integration), it often makes sense to use a dedicated Lambda function to keep the code clear and easier to manage.
Several smaller functions
- Sometimes it is better to divide a large piece of functionality into smaller parts, with each Lambda handling one area of responsibility, for example “ERPOrdersHandler”, “ERPProductsHandler”, etc.
GitHub directory structure
- You can store all the functionality in one repository (e.g. extensions-adidas/erp/…) or in several, depending on the size of the project and its release cycle.
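Following the function naming suggested above (“ERPOrdersHandler”, “ERPProductsHandler”), splitting an ERP integration into per-responsibility Lambdas might be sketched in serverless.yml like this (handler classes and paths are illustrative):

```yaml
functions:
  ERPOrdersHandler:
    handler: com.clientName.erp.OrdersHandler
    events:
      - http:
          path: ext/erp/orders
          method: post
  ERPProductsHandler:
    handler: com.clientName.erp.ProductsHandler
    events:
      - http:
          path: ext/erp/products
          method: post
```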
9. Summary
AI Commerce's new serverless Lambda environment enables partners to develop and maintain their own microservice code seamlessly alongside the AI Commerce core. This supports the Composable Commerce architecture, where each component is separate, yet securely and tightly connected to the core AI Commerce services:
- A fast VPC environment ensures low latency.
- A consistent domain structure ensures that from the end user's perspective, everything works as a unified system.
- The partner's own AWS CLI keys and GitHub repos provide flexibility and autonomy.
- Limited resources (API Gateway throttling, Lambda memory limits) ensure that the system remains within stable performance limits.
- The ability to “overwrite” or extend AI Commerce's native features opens the door to implementing even complex integrations.
Thank you for reading this article. We hope it provides a clear understanding of how AI Commerce can be extended and customized in our serverless environment. If you need further assistance, contact the AI Commerce support team or open a ticket directly in the partner repository's GitHub project. Good luck on your Composable Commerce journey!