AWS – API Gateway to create data in DynamoDB through a Lambda function in Java

By Nobukimi Sasaki (2023-04-24). Continued from “AWS – API Gateway + Lambda + DynamoDB (Configuration)” below.

Dependencies for AWS Lambda, the DynamoDB SDK, and Jackson:

        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-lambda-java-core</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-lambda-java-events</artifactId>
            <version>2.0.2</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-dynamodb</artifactId>
            <version>1.11.271</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.14.1</version>
        </dependency>

The main handler class receives the request and dispatches on APIGatewayProxyRequestEvent::getHttpMethod:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent;

public class LambdaHandler implements RequestHandler<APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent> {

    @Override
    public APIGatewayProxyResponseEvent handleRequest ( APIGatewayProxyRequestEvent apiGatewayRequest, Context context ) {

        EmployeeService employeeService = new EmployeeService();

        switch (apiGatewayRequest.getHttpMethod()) {

            case "POST":
                return employeeService.saveEmployee( apiGatewayRequest, context );

            case "GET":
                if (apiGatewayRequest.getPathParameters() != null) {
                    return employeeService.getEmployeeById( apiGatewayRequest, context );
                }
                return employeeService.getEmployees( apiGatewayRequest, context );
            case "DELETE":
                if (apiGatewayRequest.getPathParameters() != null) {
                    return employeeService.deleteEmployeeById( apiGatewayRequest, context );
                }
            default:
                throw new Error( "Unsupported Methods:::" + apiGatewayRequest.getHttpMethod() );

        }
    }
}

The 1st parameter, APIGatewayProxyRequestEvent, contains the request data. Its getHttpMethod returns the String httpMethod, i.e. “GET”, “POST”, “PUT”, and so on; there are similar getters for the path, path parameters, query string parameters, headers, and body.

The 2nd parameter, Context, contains runtime information and utilities such as the logger (context.getLogger()).

The response class “APIGatewayProxyResponseEvent” contains the statusCode, headers, and body returned to the client.

Saving data:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBScanExpression;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent;
import com.app.easy2excel.entity.Employee;

import java.util.List;
import java.util.Map;

public class EmployeeService {

    private DynamoDBMapper dynamoDBMapper;
    private static String jsonBody = null;

    public APIGatewayProxyResponseEvent saveEmployee ( APIGatewayProxyRequestEvent apiGatewayRequest, Context context ) {
        initDynamoDB();
        Employee employee = Utility.convertStringToObj( apiGatewayRequest.getBody(), context );
        dynamoDBMapper.save( employee );
        jsonBody = Utility.convertObjToString( employee, context );
        context.getLogger().log( "data saved successfully to dynamodb:::" + jsonBody );
        return createAPIResponse( jsonBody, 201, Utility.createHeaders() );
    }
....
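
The snippet also relies on initDynamoDB and a Utility helper class that are elided. A minimal sketch of what they might look like (only the method names come from the code above; the bodies are my assumption), using the Jackson dependency declared earlier:

    // Inside EmployeeService: builds the mapper on first use.
    private void initDynamoDB () {
        if (dynamoDBMapper == null) {
            AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard().build();
            dynamoDBMapper = new DynamoDBMapper( client );
        }
    }

and the Utility class (the Content-Type header is also an assumption):

import com.amazonaws.services.lambda.runtime.Context;
import com.app.easy2excel.entity.Employee;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.HashMap;
import java.util.Map;

public class Utility {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Deserialize the request body JSON into an Employee.
    public static Employee convertStringToObj ( String jsonBody, Context context ) {
        try {
            return MAPPER.readValue( jsonBody, Employee.class );
        } catch (JsonProcessingException e) {
            context.getLogger().log( "Error parsing request body:::" + e.getMessage() );
            return null;
        }
    }

    // Serialize an Employee back to JSON for the response body.
    public static String convertObjToString ( Employee employee, Context context ) {
        try {
            return MAPPER.writeValueAsString( employee );
        } catch (JsonProcessingException e) {
            context.getLogger().log( "Error serializing employee:::" + e.getMessage() );
            return null;
        }
    }

    // Common response headers.
    public static Map<String, String> createHeaders () {
        Map<String, String> headers = new HashMap<>();
        headers.put( "Content-Type", "application/json" );
        return headers;
    }
}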

Within createAPIResponse, the APIGatewayProxyResponseEvent is built and returned:

    private APIGatewayProxyResponseEvent createAPIResponse ( String body, int statusCode, Map<String, String> headers ) {
        APIGatewayProxyResponseEvent responseEvent = new APIGatewayProxyResponseEvent();
        responseEvent.setBody( body );
        responseEvent.setHeaders( headers );
        responseEvent.setStatusCode( statusCode );
        return responseEvent;
    }

The entity model:

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

@DynamoDBTable(tableName = "employee")
public class Employee {

    @DynamoDBHashKey(attributeName = "empId")
    private String empId;

    @DynamoDBAttribute(attributeName = "name")
    private String name;

    @DynamoDBAttribute(attributeName = "email")
    private String email;
....
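
Note that DynamoDBMapper also requires a public no-argument constructor and standard getters and setters on the entity; they are elided above.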

How to deploy the code:
Go to the IntelliJ terminal and run the command: mvn clean install

The jar file is created under target:

Go to Lambda > Functions > RestApiLambda (your function name) … there is an “Upload from” button; specify the built jar file, then upload.

Scroll down and under Runtime settings > Edit

Edit the handler info as:

1. Package path
2. Class name
3. Method name
Then Save! Now the API is ready to run
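
For example, assuming the handler class lives in a package like com.app.easy2excel (suggested by the entity import earlier, but an assumption), the handler string would be:

com.app.easy2excel.LambdaHandler::handleRequest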

In Postman, the POST request went successfully:
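
For reference, the request looks something like this (the invoke URL is a placeholder from the deployment step; the field names come from the Employee entity above, and the values are illustrative):

POST https://{api-id}.execute-api.{region}.amazonaws.com/dev/employee
Content-Type: application/json

{
    "empId": "101",
    "name": "John Doe",
    "email": "john.doe@example.com"
}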


Go to the DynamoDB dashboard > Tables

1. Tables
2. Check the table
3. Click the table name
4. Explore table items
5. You will see the data created

AWS – API Gateway + Lambda + DynamoDB (Configuration)

By Nobukimi Sasaki (2023-04-24)

Create DynamoDB table

In the AWS Management Console, pick DynamoDB. In this example, the table name is “employee”. Give the primary key the name “empId” with type String.

Create Lambda Function:

In the AWS Management Console, pick Lambda, then “Create function”. In this example, the function name is “RestApiLambda”, and the Runtime is “Java 11”.

Providing a role:

Go to the “IAM” console, create a role with “Lambda” as the trusted service, and attach the “AmazonDynamoDBFullAccess” and “AWSLambdaBasicExecutionRole” permissions.
Give the role a name; in this example, “crudAPIRole”. After creating the role, back in Lambda, check “Use an existing role” and click “Create function”.

Create API Gateway:

In AWS management console, pick “API Gateway”. Choose protocol “REST” and “New API”. In this example, API name is “EmployeeAPI”.

Specify the “Resources” for the API:

Under “API: EmployeeAPI”, click “Create Resource”, then set the name “employee”; this makes the path …/employee the URL.
Check “Enable API Gateway CORS”.


1. Click the name of the resource (“employee” in this example)
2. Create Method
3. Select “POST” (or whichever method type you need)
4. Confirm with the check mark
5. Integration type: “Lambda Function”
6. Check “Use Lambda Proxy integration”
7. Type the name of the function you created (“RestApiLambda” in this example)
8. Save


1. Actions
2. Deploy API
3. New Stage
4. dev (environment name)
5. Deploy


After deploying, the console shows the invoke URL for the API (of the form https://{api-id}.execute-api.{region}.amazonaws.com/dev).

Microservices with Kafka

By Nobukimi Sasaki (2023-04-23)

Introduction:
This is my study note, composed of three services (Spring Boot) with a message broker (Kafka).
The services are independent of each other and communicate asynchronously.

Entire project structure:
There are four Spring Boot projects under one folder.

Service A: order-service
Whenever it receives an HTTP request, the order-service creates an event and publishes the event to the message broker (Kafka).

Service B (stock-service) & C (email-service)
They just consume the message from the message broker (Kafka); Services B & C don’t know about Service A (order-service). All services are independent of each other.

Spring Boot project for DTO objects (base-domains)
The DTO classes for the message, referenced by Services A, B, and C.

Service A – order-service detail:

1. OrderController::placeOrder receives the request from the HTTP client
2. OrderController::placeOrder calls OrderProducer::sendMessage (the Kafka producer)
3. OrderProducer::sendMessage calls KafkaTemplate::send

spring.kafka.producer.bootstrap-servers: localhost:9092
spring.kafka.producer.key-serializer: org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.topic.name=order_topics

At Service A, this properties file configures the Kafka server (line 1) and the topic name (line 4).

		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>
		<dependency>
			<groupId>org.springframework.kafka</groupId>
			<artifactId>spring-kafka</artifactId>
		</dependency>
		<dependency>
			<groupId>net.javaguides</groupId>
			<artifactId>base-domains</artifactId>
			<version>0.0.1-SNAPSHOT</version>
		</dependency>

		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-test</artifactId>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>org.springframework.kafka</groupId>
			<artifactId>spring-kafka-test</artifactId>
			<scope>test</scope>
		</dependency>

At Services A, B, and C, the dependencies are Kafka (spring-kafka) and the DTO project (base-domains).

import net.javaguides.basedomains.dto.Order;
import net.javaguides.basedomains.dto.OrderEvent;
import net.javaguides.orderservice.kafka.OrderProducer;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.UUID;

@RestController
@RequestMapping("/api/v1")
public class OrderController {

    private OrderProducer orderProducer;

    public OrderController(OrderProducer orderProducer) {
        this.orderProducer = orderProducer;
    }

    // Rest End Point API for testing
    @PostMapping("/orders")
    public String placeOrder(@RequestBody Order order){

        order.setOrderId(UUID.randomUUID().toString());

        OrderEvent orderEvent = new OrderEvent();
        orderEvent.setStatus("PENDING");
        orderEvent.setMessage("order status is in pending state");
        orderEvent.setOrder(order);

        orderProducer.sendMessage(orderEvent);

        return "Order placed successfully ...";
    }
}

At Service A, OrderController::placeOrder receives the request from the HTTP client and calls the Kafka producer’s sendMessage.

import net.javaguides.basedomains.dto.OrderEvent;

import org.apache.kafka.clients.admin.NewTopic;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

@Service
public class OrderProducer {

    private static final Logger LOGGER = LoggerFactory.getLogger(OrderProducer.class);

    private NewTopic topic;

    private KafkaTemplate<String, OrderEvent> kafkaTemplate;

    // Parameter constructor in this class
    public OrderProducer(NewTopic topic, KafkaTemplate<String, OrderEvent> kafkaTemplate) {
        this.topic = topic;
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(OrderEvent event){
        LOGGER.info(String.format("Order event => %s", event.toString()));

        // create Message
        Message<OrderEvent> message = MessageBuilder
                .withPayload(event)
                .setHeader(KafkaHeaders.TOPIC, topic.name())
                .build();
        kafkaTemplate.send(message);
    }
}

At Service A, OrderProducer::sendMessage calls KafkaTemplate::send
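
One piece not shown above: OrderProducer is constructor-injected with a NewTopic bean, so Service A also needs a configuration class that defines it. A minimal sketch (the class and method names are my assumption; the property key comes from the application properties shown earlier):

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    @Value("${spring.kafka.topic.name}")
    private String topicName;

    // Provides the NewTopic that OrderProducer receives via constructor
    // injection; Spring Boot's KafkaAdmin also creates the topic on the
    // broker at startup if it doesn't already exist.
    @Bean
    public NewTopic topic() {
        return TopicBuilder.name(topicName).build();
    }
}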

Service B (stock-service) and C (email-service) detail:

Service B (stock-service) and C (email-service) each have a Kafka consumer to receive the message.

server.port=8081
spring.kafka.consumer.bootstrap-servers: localhost:9092
spring.kafka.consumer.group-id: stock
spring.kafka.consumer.auto-offset-reset: earliest
spring.kafka.consumer.key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
spring.kafka.topic.name=order_topics

At Service B (stock-service), line 1 sets the port to 8081, and line 3 sets the group id to “stock”.

server.port=8082
spring.kafka.consumer.bootstrap-servers: localhost:9092
spring.kafka.consumer.group-id: email
spring.kafka.consumer.auto-offset-reset: earliest
spring.kafka.consumer.key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
spring.kafka.topic.name=order_topics

At Service C (email-service), line 1 sets the port to 8082, and line 3 sets the group id to “email”.
At both Services B & C, line 2 registers the Kafka server on port 9092, and line 8 the topic name “order_topics”.

import net.javaguides.basedomains.dto.OrderEvent;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class OrderConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(OrderConsumer.class);

    @KafkaListener(
            topics = "${spring.kafka.topic.name}"
            ,groupId = "${spring.kafka.consumer.group-id}"
    )
    public void consume(OrderEvent event){
        LOGGER.info(String.format("Order event received in stock service => %s", event.toString()));

        // business logic
    }
}

Both Service B & C have this consumer class.
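
Because stock-service and email-service subscribe with different consumer group ids (“stock” and “email”), Kafka delivers each order event to both services independently.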

DTO objects (base-domains) details:

These DTO classes are referenced by Services A, B, and C.
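
The DTO definitions themselves are not listed in these notes. A minimal sketch consistent with the calls above (only orderId, status, message, and order are confirmed by the code; the remaining Order fields are assumptions):

package net.javaguides.basedomains.dto;

// Order.java
public class Order {
    private String orderId;
    private String name;   // assumption: not shown in the notes
    private int qty;       // assumption
    private double price;  // assumption

    public String getOrderId() { return orderId; }
    public void setOrderId(String orderId) { this.orderId = orderId; }
    // getters/setters for the remaining fields, plus toString(), omitted
}

// OrderEvent.java (separate file in the same package)
public class OrderEvent {
    private String message;
    private String status;
    private Order order;

    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
    public String getStatus() { return status; }
    public void setStatus(String status) { this.status = status; }
    public Order getOrder() { return order; }
    public void setOrder(Order order) { this.order = order; }
    // toString() omitted
}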


Example of a RESTful client:
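
For example (port 8080 for order-service and the Order field values are assumptions, matching the DTO sketch above):

POST http://localhost:8080/api/v1/orders
Content-Type: application/json

{
    "name": "MacBook Pro",
    "qty": 1,
    "price": 2000.00
}

The controller responds with “Order placed successfully ...”, and the order event shows up in the logs of both stock-service and email-service.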