Saturday, October 5, 2019

Decorator Design Pattern In Java

Greetings!

Decorator design pattern is a structural design pattern.
This is a pretty standard pattern in Java, especially in the input/output classes such as FileReader and BufferedReader.

Full source code for this blog post is at my github account. [source-code]

The GoF definition:
"Attach additional responsibilities to an object dynamically. Decorators provide a flexible alternative to subclassing for extending functionality."



This is a very powerful pattern for adding extra functionality to an object through composition and, interestingly, we do it at runtime.
This gives us an alternative to subclassing.

"Subclassing adds behaviour at compile time, decorator adds behaviour at run time."

When we use inheritance to add additional functionality, we may end up with a long hierarchy containing too many classes. With the decorator design pattern, we only need to create the base, and additional functionality can be added by decorating that base.
A decorator is sometimes called a wrapper. The term used by Joshua Bloch in Effective Java is forwarding.

"Inheritance is one form of extension, but not necessarily the best way to achieve flexibility in our designs."

Steps

  • Create a base interface.
  • Implement it to add the core functionality.
  • Create an abstract decorator class that implements the interface and also composes (wraps) an instance of the same interface.
  • Extend the abstract class to add extra functionality.

Working Example

Let's decorate what is above us. The Sky! The sky mainly has two forms, noon and night. Depending on the day's conditions, the sky is decorated with clouds or stars.

We can start with our Sky.

package com.slmanju.patterns;

public interface Sky {

    void draw();

}


The night sky has a dark appearance.

package com.slmanju.patterns;

public class NightSky implements Sky {

    @Override
    public void draw() {
        System.out.println("Drawing sky with black color");
    }

}


Now we can have our AbstractSky which acts as the base decorator.

package com.slmanju.patterns;

public abstract class AbstractSky implements Sky {

    protected Sky sky;

    public AbstractSky(Sky sky) {
        this.sky = sky;
    }

}


Let's add some stars into the Sky.

package com.slmanju.patterns;

public class StarSky extends AbstractSky {

    public StarSky(Sky sky) {
        super(sky);
    }

    @Override
    public void draw() {
        sky.draw();
        System.out.println("Drawing stars");
    }

}


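The App below also uses CloudySky, RainySky and NoonSky, which aren't listed here; they follow the same shape as NightSky and StarSky. CloudySky, for instance, could look like this sketch (the message text is my guess, the originals are in the repository).

package com.slmanju.patterns;

public class CloudySky extends AbstractSky {

    public CloudySky(Sky sky) {
        super(sky);
    }

    @Override
    public void draw() {
        sky.draw();
        System.out.println("Drawing clouds");
    }

}
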
Now it is time to decorate our Sky.

package com.slmanju.patterns;

public class App  {

    public static void main(String[] args) {
        Sky night = new NightSky();
        night = new StarSky(night);
        night = new CloudySky(night);
        night.draw();

        System.out.println("---------");

        Sky sky = new RainySky(new CloudySky(new NoonSky()));
        sky.draw();
    }

}


Remember;
  • Classes should be open for extension but closed for modification.
  • You can extend your core functionality by decorating it.

Happy coding :)

Friday, October 4, 2019

Template Method Design Pattern In Java

Greetings!

Template Method pattern is a behavioral pattern.
We put our business logic in an abstract superclass and let the subclasses override specific steps without changing the overall structure.

Source code for this post can be found in my github account. [source-code]

When we have common steps to do something but some of the steps vary, we could handle it with a bunch of if-else statements. The problem is that things become tightly coupled, and when new behaviour needs to be added we have to change our class. This makes the class difficult to maintain. For these kinds of problems the Template Method pattern can be used.

The GoF definition:
"Defines the skeleton of an algorithm in a method, deferring some steps to sub-classes. Template Method lets sub-classes redefine certain steps of an algorithm without changing the algorithm's structure."


"Don't call us, we'll call you. - allow low-level components to hook themselves into a system, but the high-level components determine when they are need and how."

Template Method has two easy steps.
  • Define the algorithm's steps in the superclass.
  • Implement the specific steps in subclasses.

Working Example

Let's say we want to generate documents: PDF, Excel, etc. Each file type generates its content differently, but loading data from the database, converting the document into a stream, etc. are common operations, and the overall steps are the same. We add those common steps to the superclass and let the subclasses define what varies.

We can start by creating the superclass. In our case it is DocumentTemplate. Note that the template method is final.

package com.slmanju.patterns;

public abstract class DocumentTemplate {

    public final void getDocument() {
        System.out.println("Load information...");

        generate();

        System.out.println("Finalize document creation...");
    }

    protected abstract void generate();

}


Now we can extend it to create our concrete implementation.

package com.slmanju.patterns;

public class PdfDocumentTemplate extends DocumentTemplate {

    @Override
    protected void generate() {
        System.out.println("Generating pdf document.");
    }

}

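The App below also uses an ExcelDocumentTemplate, which isn't listed here; it follows the same shape as PdfDocumentTemplate (the message text is my guess, the original is in the repository).

package com.slmanju.patterns;

public class ExcelDocumentTemplate extends DocumentTemplate {

    @Override
    protected void generate() {
        System.out.println("Generating excel document.");
    }

}
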

Let's run our application.

package com.slmanju.patterns;

public class App {

    public static void main(String[] args) {
        DocumentTemplate documentTemplate = new PdfDocumentTemplate();
        documentTemplate.getDocument();

        System.out.println("----------");

        documentTemplate = new ExcelDocumentTemplate();
        documentTemplate.getDocument();
    }

}


Things to note;

  • The algorithm's steps are fixed, but some implementations may vary. Subclasses are responsible for overriding the necessary steps.
  • The subclass doesn't call the superclass; it is the superclass that controls the flow. This is called the 'Hollywood principle'.
  • What if a step depends on a condition? We can guard it with a method that subclasses may override. This is called a hook, and the superclass may provide the default behaviour (see the sketch after this list).
  • The Template Method and Factory Method patterns look similar, but the intent is different. Factory Method is for creating objects; Template Method defines behaviour.
  • The Strategy pattern uses composition while Template Method uses inheritance.
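
For the hook mentioned above, here is a minimal sketch of a variant of DocumentTemplate; the needsWatermark step is purely illustrative and not from the repository.

public abstract class DocumentTemplate {

    public final void getDocument() {
        System.out.println("Load information...");

        generate();

        if (needsWatermark()) { // hook controls an optional step
            System.out.println("Adding watermark...");
        }

        System.out.println("Finalize document creation...");
    }

    protected abstract void generate();

    // Hook: the superclass supplies a default; subclasses may override it.
    protected boolean needsWatermark() {
        return false;
    }

}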

Happy coding :)

Strategy Design Pattern

Greetings!

Strategy pattern is a behavioral pattern. It is one of the easiest patterns to learn and one of the most used.

Source code for this blog post can be found in my github account. [source-code]

Let's see the GoF definition;
"Define a family of algorithms, encapsulate each one, and make them interchangeable. Strategy lets the algorithm vary independently from clients that use it."


"Algorithm" in this definition is not necessarily a mathematical algorithm. It is something that do a thing. Basically algorithm is just another name for a strategy.
This leads us to;

  • A class should be configured with an algorithm instead of implementing an algorithm directly.
  • An algorithm should be selectable and exchangeable at runtime.

When we start with OOP, we are fond of inheritance. But when it comes to separating code in a more manageable way, we choose composition over inheritance.
"Consider 'has-a' instead of 'is-a'"

We achieve this by extracting the volatile parts of our code and encapsulating them as objects.
"Separate the parts of the code that will change the most."

Working Example

Let's take a practical example. We need a file storage service which can handle save and retrieve operations. We could have a single class and simply put the saving logic into it. Maybe, at the beginning, the requirements are not clear and we choose to save files on local storage. But what if we later want to save files in AWS? It is clear that we should manage each storage option in a separate class. We can start with local file storage and easily switch to AWS.

We start by creating our FileService

package com.slmanju.patterns;

public interface FileService {

    void save();

    void retrieve();

}


Let's create the LocalFileService

package com.slmanju.patterns;

public class LocalFileService implements FileService {

    @Override
    public void save() {
        System.out.println("Save file in local disk.");
    }

    @Override
    public void retrieve() {
        System.out.println("Retrieving file from local disk.");
    }

}

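The App below also swaps in an AwsFileService, which isn't listed here; it is just another strategy implementing the same interface (the messages are my guess, the original is in the repository).

package com.slmanju.patterns;

public class AwsFileService implements FileService {

    @Override
    public void save() {
        System.out.println("Save file in AWS S3.");
    }

    @Override
    public void retrieve() {
        System.out.println("Retrieving file from AWS S3.");
    }

}
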

We are going to use this in our StorageService.

package com.slmanju.patterns;

public class StorageService {

    private FileService fileService;

    public StorageService(FileService fileService) {
        this.fileService = fileService;
    }

    public void setFileService(FileService fileService) {
        this.fileService = fileService;
    }
    
    public void save() {
        fileService.save();
    }

    public void retrieve() {
        fileService.retrieve();
    }

}


All set. Let's run our application.

package com.slmanju.patterns;

public class App {

    public static void main(String[] args) {
        FileService localFileService = new LocalFileService();

        StorageService storageService = new StorageService(localFileService);
        storageService.save();
        storageService.retrieve();

        System.out.println("------");

        FileService awsFileService = new AwsFileService();
        storageService.setFileService(awsFileService);
        storageService.save();
        storageService.retrieve();
    }

}



Remember;
  • Encapsulate what varies.
  • Favor composition over inheritance.
  • Program to interface, not implementation.

Happy coding!


Factory Method Design Pattern

Greetings!

Factory Method design pattern is a creational design pattern which deals with creating objects.
Many confuse this pattern with a simple factory helper class.

Source code for this post can be found in my github account. [source-code]

This is the definition given by GoF.
"Define an interface for creating an object, but let subclasses decide which class to instantiate. Factory method lets a class defer instantiation to subclasses."

As in the original GoF design pattern, we define an interface (or abstract class) for creating objects and let the subclasses decide how to create them.


Let's see the actors in this design pattern.
  • Product - The final object we need to create. We deal only with its abstraction here because we can have multiple types of products.
  • Factory - A class to create the Product. It doesn't create the object directly; instead it declares an abstract method to create the Product. Subclasses of the Factory need to implement that creation method.
  • ConcreteFactory - The class responsible for creating the actual Product.
  • Client - The class that needs a Product. It uses the correct Factory to construct the Product.

Working Example

To explain this, I'm going to create a hypothetical shooting game with trial and purchased versions. A player can select a weapon. How do you create a TrialWeapon or a PurchasedWeapon? Are you going to scatter if-else blocks everywhere? We do not need to do that. We can create separate factories for the two versions.
This example needs multiple classes. Here I'll add only the necessary classes. You can get the full code from my github repository.

First, we create our Product interface which is Weapon.

package com.slmanju.patterns;

public interface Weapon {

    void fire();

    void load();

}


We can implement this to create a concrete product. Here it is Rifle.

package com.slmanju.patterns;

public class Rifle implements Weapon {

    @Override
    public void fire() {
        System.out.println("Firing with rifle.");
    }

    @Override
    public void load() {
        System.out.println("Loading bullets into rifle.");
    }

}


Then we can create our Factory, WeaponStore. Here createWeapon is our factory method.

package com.slmanju.patterns;

// factory
public abstract class WeaponStore {

    public final Weapon purchase(WeaponType weaponType) {
        Weapon weapon = createWeapon(weaponType);
        weapon.load();
        return weapon;
    }

    // factory method
    protected abstract Weapon createWeapon(WeaponType weaponType);

}


And, this is a jungle shooter game, hence our concrete factory is JungleWeaponStore.

package com.slmanju.patterns;

public class JungleWeaponStore extends WeaponStore {

    @Override
    protected Weapon createWeapon(WeaponType weaponType) {
        switch (weaponType) {
            case RIFLE:
                return new Rifle();
            case SHOTGUN:
                return new Shotgun();
            default:
                return new NullWeapon();
        }
    }

}

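The WeaponType enum and NullWeapon used above aren't listed in the post (Shotgun mirrors Rifle); minimal sketches could look like this — the messages are my own, the originals are in the repository.

// WeaponType.java
package com.slmanju.patterns;

public enum WeaponType {
    RIFLE, SHOTGUN
}

// NullWeapon.java - a Null Object so the factory never returns null.
package com.slmanju.patterns;

public class NullWeapon implements Weapon {

    @Override
    public void fire() {
        System.out.println("Nothing to fire.");
    }

    @Override
    public void load() {
        System.out.println("Nothing to load.");
    }

}
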

All set. Now we can have a Warrior.

package com.slmanju.patterns;

public class Warrior {

    private Weapon weapon;

    public Warrior(Weapon weapon) {
        this.weapon = weapon;
    }

    public void fight() {
        weapon.fire();
    }

    public void setWeapon(Weapon weapon) {
        this.weapon = weapon;
    }

}


It is time to let our Warrior go into the jungle to defeat enemies.

package com.slmanju.patterns;

public class ShooterGame {

    public static void main(String[] args) {
        WeaponStore weaponStore = new JungleWeaponStore();
        Weapon weapon = weaponStore.purchase(WeaponType.RIFLE);
        Warrior warrior = new Warrior(weapon);
        warrior.fight();

        weapon = weaponStore.purchase(WeaponType.SHOTGUN);
        warrior.setWeapon(weapon);
        warrior.fight();
    }

}


I hope you get the idea of the Factory Method design pattern.

Happy coding.


Sunday, September 29, 2019

Builder Design Pattern In Java

Greetings!

Builder design pattern is a creational design pattern which helps us create objects.

According to GoF, Builder design pattern;
"Separate the construction of a complex object from its representation so that the same construction processes can create different representations."

Even though the original pattern description talks about complex object creation, the most common use is to help construct objects fluently.
The Builder design pattern helps us assemble objects part by part while hiding the inner state.

In this post I'm going to talk only about plain object creation using this pattern.



As I said earlier, this is a convenient way to construct objects with many parameters. Let's say you have a class which needs 10 parameters. Are you going to create a constructor with 10 arguments? That is very difficult to maintain and use. This is where I'm going to use the Builder design pattern.
  • Helps create objects with multiple parameters in a simple way.
  • Creates immutable objects.
  • Encapsulates the code for construction and representation.

Disadvantages

Just like any other design pattern, this also has a few disadvantages.

  • Requires creating a separate Builder class with the same set of fields.
  • Each class that needs to be built requires its own builder.

Main Class

This is the real object we need to construct. We do not expose a constructor for it, nor do we provide any setter methods.

Builder

Holds the properties of the main class. Most of the time we use this as an inner class. We provide setter-style methods for all the parameters (with fluent names) that return the Builder itself for further chaining. Once the client calls the build method, it creates the object of the main class.

Working Example

Let's assume that we are creating a game and need to initialize our player object. A Player may have many properties, too many for a single constructor (though I'm not going to use many for this demo).


package com.slmanju.patterns;

public class Player {

    private final String alias;
    private final String weapon;
    private final String hair;

    private Player(Builder builder) {
        this.alias = builder.alias;
        this.weapon = builder.weapon;
        this.hair = builder.hair;
    }

    public String getAlias() {
        return alias;
    }

    public String getWeapon() {
        return weapon;
    }

    public String getHair() {
        return hair;
    }

    @Override
    public String toString() {
        return "Player{" +
                "alias='" + alias + '\'' +
                ", weapon='" + weapon + '\'' +
                ", hair='" + hair + '\'' +
                '}';
    }

    public static class Builder {

        private final String alias;
        private String weapon;
        private String hair;

        public Builder(String alias) {
            this.alias = alias;
        }

        public Builder withWeapon(String weapon) {
            this.weapon = weapon;
            return this;
        }

        public Builder withHair(String hair) {
            this.hair = hair;
            return this;
        }

        public Player build() {
            return new Player(this);
        }
    }

}


package com.slmanju.patterns;

public class App {

    public static void main(String[] args) {
        Player player = new Player.Builder("Nero").withWeapon("Sniper").withHair("curly").build();

        System.out.println(player);
    }

}


This is not the end of the Builder design pattern. I didn't cover complex scenarios like inheritance, which need a separate blog post. Still, I hope this helps you in your daily development tasks.

Happy coding :)


Saturday, September 28, 2019

Observer Design Pattern In Java

Greetings!

Observer design pattern falls into behavioral category.

Observer design pattern is useful when we have multiple objects that react to another object's changes.
The main object is called the Subject and the depending objects are called Observers.

  • Subject and Observers are loosely coupled; the Subject knows its Observers only through the Observer interface.
Subject (one) ----> Observer (many)

Swing event listeners and Spring application listeners are examples of the Observer design pattern.

Observer design pattern according to GoF,
"Define a one-to-many dependency between objects so that when one object changes state, all its dependencies are notified and updated automatically."



Subject

Maintains a list of Observers who are interested in being notified when something useful happens.
It provides methods to register an Observer, unregister an Observer, and notify all Observers at once.

Observer

The object which is interested in the Subject's changes. It has a method to get updates from the Subject.

Why not use built in Observer?

The Java util library provides a built-in Observer pattern which we could use. Unfortunately it has a few drawbacks, hence it is not widely used.
  • Observable is a class we have to extend, which prevents us from extending another class.
  • It uses a Vector to maintain Observers, which is synchronized and outdated.
  • Its methods are synchronized.

This is a very easy pattern to implement, so we do not need to depend on the built-in one.

Working Example

For our example I'm going to use an ordering system in which a user places an order. After completing the order, the user gets an email and a printout of the order.
Here, the order is the Subject; the email and the printer are observing the order.

Let's start by defining our Observable interface.

package com.slmanju.patterns;

public interface Observable {

    void attachObserver(Observer observer);

    void removeObserver(Observer observer);

    void sendNotification();

}


Now we need Observer interface.

package com.slmanju.patterns;

public interface Observer {

    void update(Order order);

}


Then we need a concrete Subject which is our Order class.

package com.slmanju.patterns;

import java.util.ArrayList;
import java.util.List;

public class Order implements Observable {

    private List<Observer> observers;

    public Order() {
        this.observers = new ArrayList<>();
    }

    public void complete() {
        System.out.println("Completing the order");
        sendNotification();
    }

    @Override
    public void attachObserver(Observer observer) {
        observers.add(observer);
    }

    @Override
    public void removeObserver(Observer observer) {
        observers.remove(observer);
    }

    @Override
    public void sendNotification() {
        observers.forEach(observer -> observer.update(this));
    }

}


Now we can implement our Observers.

package com.slmanju.patterns;

public class OrderEmail implements Observer {

    @Override
    public void update(Order order) {
        System.out.println("Sending order email");
    }

}

package com.slmanju.patterns;

public class OrderPrinter implements Observer {

    @Override
    public void update(Order order) {
        System.out.println("Printing order");
    }

}


All set. Let's try our application by creating an order.

package com.slmanju.patterns;

public class App {

    public static void main(String[] args) {
        Order order = new Order();
        order.attachObserver(new OrderPrinter());
        order.attachObserver(new OrderEmail());

        order.complete();
    }

}


That is all for Observer design pattern. Remember to;
  • Strive for loosely coupled designs between objects that interact.

Tuesday, July 23, 2019

How to dockerize a spring boot application

Greetings!

Spring boot helps us to create application very quickly. Docker provides us a way to "build, ship and run" our applications. In a world of Microservices, combining these two gives us powerful ways to create and distribute our Java applications.

I assume you have enough Spring Boot and Docker knowledge and want to study further how to dockerize Spring Boot applications.

https://github.com/slmanju/springtime/tree/master/spring-boot-docker

Let's create a simple rest service and dockerize it.

Step 1: Create Spring boot application

Go to https://start.spring.io and fill it in as you want. Then add Spring Web Starter as a dependency. This is enough for us to create a simple rest service. Download and extract the application. Then add the below controller (or anything you like).
package com.slmanju.springbootdocker;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class HomeController {

    @GetMapping(value = { "", "/"})
    public String index() {
        return "Welcome to spring boot docker integration";
    }

    @GetMapping(value = "/hello")
    public String hello() {
        return "Hello world";
    }

}

Update the application.properties with port number.
server.port = 8081
Create another property file named application-container.properties and update it as below. This demonstrates using a Spring profile inside the docker container.
server.port = 8080

You can test the application by running;
mvn clean spring-boot:run
curl http://localhost:8081/

Similarly we can run the jar file separately. This is what we need in our dockerfile.
mvn clean package
java -jar target/spring-boot-docker-0.0.1-SNAPSHOT.jar

Step 2: Create Dockerfile

I have used Maven as my build tool. Maven uses target/ path as the build location. For Gradle it is build/libs.
FROM openjdk:8-jdk-alpine
ADD target/spring-boot-docker-0.0.1-SNAPSHOT.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "-Dspring.profiles.active=container", "app.jar"]

What is this file doing?

  • Get the openjdk alpine base image.
  • Add our spring boot application jar as app.jar.
  • Expose port 8080 (this is the port we used in application-container.properties).
  • Set the entrypoint for the application. I'm passing the spring profile to match the exposed port.

Step 3: Create a docker image using our Dockerfile

To create the docker image we need to create jar file first.
mvn clean package
Now we can create the docker image using that jar file.
docker build -f Dockerfile -t hello-docker .


Step 4: Verify image

Verify whether your image is created.
docker image ls

Step 5: Create a container

Our docker image is created. We can create a container using it now.
docker container run -p 8082:8080 hello-docker

Step 6: Test it

Open a web browser and go to localhost, or use cURL.
curl http://localhost:8082/
curl http://localhost:8082/hello

Great! We have containerized our spring boot application.

There is more. We can improve this.

Unpack the generated fat jar into target/extracted (or your desired location).
mkdir target/extracted
cd target/extracted
jar -xf ../*.jar

As you can see, the Spring Boot fat jar is packaged in layers that separate external dependencies from our classes. As we know, docker images are also layered. We can use this to improve our containers.
BOOT-INF/lib
BOOT-INF/classes
META-INF
org

External dependencies will not change often, so we can add them as the first layer. We can add META-INF as another layer. We add our classes as the last layer since they will change over time. Why is this important?

Docker will cache layers. Since our lib (and META-INF) layers do not change very often, our builds will be faster. Also, when we use Docker Hub to push and pull our image, that will be faster too since we only need to pull the classes layer.
Also, in the Dockerfile I have hard-coded the main class to boost the startup.
FROM openjdk:8-jdk-alpine
VOLUME /tmp
ARG APP=target/extracted
COPY ${APP}/BOOT-INF/lib /app/lib
COPY ${APP}/META-INF /app/META-INF
COPY ${APP}/BOOT-INF/classes /app
ENTRYPOINT ["java", "-cp", "app:app/lib/*", "-Dspring.profiles.active=container", "com.slmanju.springbootdocker.SpringBootDockerApplication"]

Now you can create a container (docker container run -p 8082:8080 hello-docker) and test it.

Conclusion

In this article I have discussed how to containerize a spring boot application. There are Maven and Gradle plugins to help you with this, but creating containers this way is better for learning. If you are like me and want to explore docker (and spring boot), this will help you get started. So what are you waiting for? Go and create your container!

References

https://spring.io/guides/gs/spring-boot-docker/
https://spring.io/guides/topicals/spring-boot-docker

Monday, June 24, 2019

How to use Spring Boot with MySQL database

Greetings!

Spring Framework simplifies working with databases by auto-configuring connections, handling transactions, using an ORM tool like Hibernate, and abstracting SQL with Spring Data repositories. We are going to focus on how to connect a MySQL database with a Spring Boot application.

Spring Boot has many defaults. For databases, the H2 in-memory database is the default. It auto-configures an in-memory database even without a connection url. That is good for simple testing, but for production use we need a database like MySQL.

Spring Boot selects the HikariCP DataSource due to its performance. When the spring-boot-starter-data-jpa dependency is on the classpath, it automatically picks HikariCP.

The complete source code for this blog post is here.

How to configure a database

Obviously, to use a database in our application we need;
  • Database driver to connect to database
  • Connection url
  • Database username and password
In a Spring Boot application we need to provide at least the connection url, otherwise it will try to configure an in-memory database. Using the connection url it can deduce the database driver to be used, so we do not need to configure the database driver.

To configure the above properties, Spring Boot externalizes configuration under spring.datasource.*.
spring.datasource.url = jdbc:mysql://localhost/test
spring.datasource.username = dbuser
spring.datasource.password = dbpassword
spring.datasource.driver-class-name = com.mysql.jdbc.Driver # not needed

If we need more fine tuning we can use other configuration properties like spring.datasource.hikari.*.

How to auto-create a database

If we would like the application to create the schema for us, we can use the spring.jpa.hibernate.ddl-auto property. Its default value is none for MySQL and create-drop for embedded databases.
spring.jpa.hibernate.ddl-auto = create

Additionally, if schema.sql (DDL) and data.sql (DML) files are in the resources folder, Spring Boot can pick them up and populate the database. We can change the default locations by using the schema and data properties.
spring.datasource.initialization-mode = always
spring.datasource.schema = classpath:/database/schema.sql # Schema (DDL) script resource references.
spring.datasource.data = classpath:/database/data.sql # Data (DML) script resource references.

Using above knowledge let's create a simple application which connects to MySQL database.

Create our database

I like to create the database separately. Connect to MySQL and create our database.
> mysql -uroot -proot
> create database book_store;
> use book_store;
// use schema.sql and data.sql to populate database

Create the project

Go to https://start.spring.io and select spring-boot-starter-data-jpa, spring-boot-starter-web, mysql-connector-java and lombok dependencies.
The pom.xml will look like this.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>

<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>

<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <optional>true</optional>
</dependency>

Configure the database

Create application.yml and add the below configuration properties.
spring:
  jpa:
    hibernate:
      ddl-auto: validate
  datasource:
    url: jdbc:mysql://localhost:3306/book_store
    username: root
    password: root

This is all we need to connect to MySQL database. Let's create our domain object and repository.
@Data
@Entity
public class Book implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Integer id;
    private String title;
    private String author;

}

public interface BookRepository extends JpaRepository<Book, Integer> {
}

Service layer

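BookServiceImpl below implements a BookService interface that isn't listed in the post; a minimal sketch matching the methods it uses could look like this (the repository may differ).

import java.util.List;

public interface BookService {

    List<Book> findAll();

    Book findById(Integer id);

    Book save(Book book);

    Book update(Book book);

    void delete(Integer id);

}
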
@Service
@Transactional
public class BookServiceImpl implements BookService {

    private final BookRepository bookRepository;

    @Autowired
    public BookServiceImpl(BookRepository bookRepository) {
        this.bookRepository = bookRepository;
    }

    @Override
    public List<Book> findAll() {
        return bookRepository.findAll();
    }

    @Override
    public Book findById(Integer id) {
        return bookRepository.findById(id).orElse(null);
    }

    @Override
    public Book save(Book book) {
        return bookRepository.save(book);
    }

    @Override
    public void delete(Integer id) {
        bookRepository.deleteById(id);
    }

    @Override
    public Book update(Book book) {
        return bookRepository.save(book);
    }

}

Rest controller layer

@RestController
public class BookController {

    private final BookService bookService;

    @Autowired
    public BookController(BookService bookService) {
        this.bookService = bookService;
    }

    @GetMapping(value = "")
    public List<Book> findAll() {
        return bookService.findAll();
    }

    @GetMapping(value = "/{id}")
    public Book findById(@PathVariable Integer id) {
        return bookService.findById(id);
    }

    @PostMapping
    public Book save(@RequestBody Book book) {
        return bookService.save(book);
    }

    @DeleteMapping(value = "/{id}")
    public void delete(@PathVariable Integer id) {
        bookService.delete(id);
    }

    @PutMapping
    public Book update(@RequestBody Book book) {
        return bookService.save(book);
    }

}

Now start the application and try below cURL commands.
curl -X GET http://localhost:7070/

curl -X POST \
  http://localhost:7070/ \
  -H 'Content-Type: application/json' \
  -d '{
    "title": "Java Persistence with Hibernate",
    "author": "Gavin King"
}'

curl -X GET http://localhost:7070/1

Those are the basics you need to know when working with a relational database in Spring Boot. You can use the references below for further study.

References

https://spring.io/guides/gs/accessing-data-mysql/
https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-sql
org.springframework.boot.autoconfigure.orm.jpa.HibernateProperties

Sunday, June 23, 2019

Microservices - Distributed tracing with Spring Cloud Sleuth and Zipkin

Greetings!

Microservices are very flexible. We can have multiple microservices for each domain, interacting as necessary. But it comes with a price: it becomes very complex when the number of microservices grows.
Imagine a situation where you found a bug or slowness in the system. How do you find the root cause by examining logs?

  • Collect all the logs from related microservices.
  • Pick the starting microservice and find a clue there using some id (userid, businessid, etc).
  • Pick the next microservice and check whether the previous information is there.
  • Keep going until you find which microservice has the bug.

I have followed that practice in one of my previous projects. It is very difficult and takes a lot of time to track down an issue.
This is why we need distributed tracing in microservices: one place where we can go and see the entire trace.

It helps us by;
  • Assigning a unique id (correlation id) to every request.
  • Passing the unique id across all the microservices automatically.
  • Recording timing information.
  • Logging the service name, unique id, and span id.
  • Aggregating log data from multiple microservices into a single source.

Spring Cloud Sleuth

Spring Cloud Sleuth implements a distributed tracing solution for Spring Cloud. We can capture data simply using logs or send it to a collector service like Zipkin.
Just by adding the library to our project, Spring Cloud Sleuth can;
  • Add a correlation id to every request if one doesn't exist.
  • Pass the id with outbound calls.
  • Add correlation information to Spring's Mapped Diagnostic Context (MDC), which internally uses the SLF4J and Logback implementations.
  • If a collector service is configured, pass the log information to it.

Adding Spring Cloud Sleuth

This is very simple. We need to update our pom.xml files to include the Sleuth dependency. Let's update api-gateway, service-a and service-b pom files with this.
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-sleuth</artifactId>
</dependency>

Now re-start applications and visit http://localhost:7060/api/service-a/

Look at the logs. In service-a you will be able to see;
2019-06-23 14:45:41.024  INFO [service-a,50937d2183890546,fc6079712896add8,false] 15445 --- [io-7000-exec-10] c.s.s.controller.MessageController       : get message

In service-b
2019-06-23 14:45:41.033  INFO [service-b,50937d2183890546,260506a7161eca33,false] 15654 --- [io-7005-exec-10] c.s.s.controller.MessageController       : serving message from b

You can see the logs have the [service_name, traceId, spanId, exportable] format. Both logs have the same correlation id printed. Exportable is false because we haven't added our log tracing server yet.
Let's add it.

Zipkin Server

Zipkin is a distributed tracing system. It helps gather timing data needed to troubleshoot latency problems in microservice architectures.

We used Sleuth to add tracing information in our logs. Now we are going to use Zipkin to visualize it.

Zipkin server as a Docker container

We can find the docker command from the official site to run Zipkin server as a docker container.
docker container run -d -p 9411:9411 openzipkin/zipkin

Now our Zipkin server is available at http://localhost:9411/zipkin/

Let's add Zipkin dependency to api-gateway, service-a and service-b.

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-zipkin</artifactId>
</dependency>

Then we need to specify where to send our tracing data. Update application.yml in api-gateway, service-a and service-b as below.
spring:
  application:
    name: service-a
  zipkin:
    baseUrl: http://localhost:9411/
  sleuth:
    sampler:
      probability: 1.0

Re-start applications and visit http://localhost:7060/api/service-a/
You will be able to see exportable as true this time.
[service-a,58036b999b226786,6c9d3cb3ff99aa11,true]

Now visit http://localhost:9411/zipkin/ and click on 'Find Traces'. You will be able to see tracing information; click on a trace to expand it.



Now if you click on a service name it will give you more information like below.


That is it for now. You have your base tracing module to play with.

References

https://spring.io/projects/spring-cloud-sleuth
https://zipkin.io/
https://microservices.io/patterns/observability/distributed-tracing.html

Saturday, June 22, 2019

Cracking Java8 Stream Interview Question

Greetings!

I have faced many interviews (too many to be frank) in my career as a software developer. In most of those interviews I have been asked to write pseudo code for a given problem and implement it in Java. These days, that implementation mostly should be in Java 8.
When I look back, all those questions can be simplified as below (the difficulty may vary though).

  • Iterate over a given collection (stream)
  • Filter the given data (filter)
  • Transform into another format (map, reduce)
  • Collect data into a collection (collect)
  • or End the stream (forEach, min, etc)

Is this familiar to you? It should be. This is what we do in our daily work. But if you are using it blindly, you will find it a difficult question to answer.
You need to have a good understanding of intermediate and terminal operations, and you need to really practise them in your daily work. It is meaningless to pass an interview without knowing these.
java-8-streams-intermediate-operations
java-8-streams-terminal-operations

Let's dive into some real questions.

Find the youngest male student in a given list

Let's divide this into smaller parts.
- youngest -> min
- male -> filter
- list -> iterate
Student youngestMale = students.stream()
        .filter(student -> student.gender.equals("male"))
        .min((s1, s2) -> s1.age - s2.age) //.min(Comparator.comparingInt(s -> s.age))
        .get();
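
This assumes a simple Student class with fields accessible as in the snippet; a minimal shape (names other than gender and age are my guess) could be:

class Student {

    String name;
    String gender;
    int age;

}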

Find all numbers divisible by 3

This is an easy question, but you may need to remember IntStream. Also, if you are asked how to collect the result into a list, you need to remember to convert int to Integer; for this, IntStream has the boxed operation. Let's break it down.
- divisible -> filter
IntStream.rangeClosed(0, 25)
        .filter(number -> number % 3 == 0)
        .forEach(System.out::println);

// collect the result
List<Integer> result = IntStream.rangeClosed(0, 25)
        .filter(number -> number % 3 == 0)
        .boxed()
        .collect(Collectors.toList());


Find the sum of the even numbers' powers of two

This has multiple answers. It gets a little tricky when you are asked not to use the sum operation, but we still have the same structure.
- even numbers -> filter
- power of two -> map
- sum -> sum (doesn't exist in Stream but in IntStream)
or
- power of two and sum -> reduce
// method 1
int sum = IntStream.rangeClosed(0, 5)
        .filter(number -> number % 2 == 0)
        .map(number -> number * number)
        .sum();
System.out.println(sum);

List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);

// method 2
int sum2 = numbers.stream()
        .filter(number -> number % 2 == 0)
        .mapToInt(number -> number * number) // convert to IntStream
        .sum();
System.out.println(sum2);

// method 3
int sum3 = numbers.stream()
        .filter(i -> i % 2 == 0)
        .reduce(0, (result, number) -> result + number * number);
System.out.println(sum3);

Find the average marks of a student

This looks difficult at first but is very simple, because IntStream has an average operation. We need to convert the Stream into an IntStream because;
- autoboxing has a performance impact.
- sum and average operations are not available on the normal Stream.
OptionalDouble average = subjects.stream()
        .mapToInt(Subject::getMarks)
        .average();
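
This assumes a Subject class exposing the marks; a minimal shape matching the snippet would be:

class Subject {

    private final int marks;

    Subject(int marks) {
        this.marks = marks;
    }

    public int getMarks() {
        return marks;
    }

}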

What do you think? Do you have any interesting interview questions?



Friday, June 21, 2019

Angular - Let's create a starter project

Greetings!

Angular has many things to learn, but we can skip those for now and create a simple project directly; it would be harder to study all the features first. So here, I'm going to create a simple starter project right away.

This is the end result of this tutorial.



Install Angular CLI

npm install -g @angular/cli
ng version

Create a project

ng new angular-store --routing --style=scss
cd angular-store
npm install

This will create a basic app structure.

Run the project

ng serve --open

# short form
ng s -o

With zero code, we have a running template. That's some power.

Angular Material

Let's add Material UI design to our project. For more information visit https://material.angular.io/
The command below will add Angular Material to our project and update the necessary files.
ng add @angular/material

We want to use Material design components in our project. To keep the Material component imports in a single place, let's create a separate module named material and update it with the necessary Material modules.
ng generate module material --flat

Now, we need to update app.module.ts to import our module.
import { MaterialModule } from './material.module';

  imports: [
    BrowserModule,
    AppRoutingModule,
    BrowserAnimationsModule,
    MaterialModule
  ]

Making responsive UI

I'm going to use the Bootstrap CSS grid to create a responsive user interface.
npm install --save bootstrap

Update the styles section of the angular.json file to include the Bootstrap grid.
"styles": [
  "./node_modules/@angular/material/prebuilt-themes/indigo-pink.css",
  "./node_modules/bootstrap/dist/css/bootstrap-grid.css",
  "src/styles.scss"
]

Create an app component

ng generate component home --module=app --spec=false

# short form
ng g c home --module=app --spec=false
ng g c about --module=app --spec=false

Adding a menu

Update app.component.html with below code. You can visit https://material.angular.io/components/toolbar/overview for more information.
<mat-toolbar color="primary">
  <span>Angular Store</span>
  <span class="spacer"></span>
  <button mat-button>Home</button>
  <button mat-button>About</button>
</mat-toolbar>

<router-outlet></router-outlet>

We need to import related modules in our material module.
import { MatButtonModule } from '@angular/material/button';
import { MatToolbarModule } from '@angular/material/toolbar';

Adding routes

When we created the project with the --routing option, app-routing.module.ts was already created. What we need to do is specify our routes.
import { HomeComponent } from './home/home.component';
import { AboutComponent } from './about/about.component';

const routes: Routes = [
  {
    path: '',
    component: HomeComponent
  },
  {
    path: 'about',
    component: AboutComponent
  }
];

Then, we need to update our menu bar to use these routes.
  <button mat-button [routerLink]="['/']">Home</button>
  <button mat-button [routerLink]="['/about']">About</button>

Showing products

This will be just a hard-coded for loop. We are going to use a Material card to show our content. https://material.angular.io/components/card/overview
Update home.component.ts with the below variable.
items = new Array(10);

Update home.component.html with below content.
<div class="row">
  <div class="col-xs-12 col-sm-6 col-md-4 col-lg-3 mt-10" *ngFor="let item of items">
    <mat-card>
        <img mat-card-image src="/assets/icon.png" alt="" />
      <mat-card-header>
        <mat-card-title>Lorem, ipsum dolor.</mat-card-title>
      </mat-card-header>
      <mat-card-content>
        <p>Lorem ipsum, dolor sit amet consectetur adipisicing elit. Aliquid, sapiente?</p>
      </mat-card-content>
      <mat-card-actions>
        <button mat-button color="primary">Add To Cart</button>
        <button mat-button color="primary">Read More</button>
      </mat-card-actions>
    </mat-card>
  </div>
</div>

Update styles.scss with mt-10 class.
.mt-10 {
    margin-top: 10px;
}

That's the end of it. Even though we didn't dig deeper into Angular, this will give you something to play with.


Wednesday, June 12, 2019

Microservices - Api Gateway

Greetings!

I was too busy the past couple of days and lost my momentum in continuing the Microservices series. So I decided to add two simple dummy services, service-a and service-b. Other than that, there is no change except the github repo.

https://github.com/slmanju/simple-microservice

In a Microservices architecture we deal with many apis which work alone or together with other services. Imagine that we are going to create a mobile client using these apis; it will be very difficult for that client to manage all the services. Or imagine we expose our apis so that anyone can create their own client.
This is a good place to introduce another service which acts as a gateway to all the other services. Third-party clients will know only about this api.
Not only that, we can solve some other problems using an Api Gateway.

  • Single entry point to all the services.
  • A common place to log requests and responses.
  • Authenticating users in a single place.
  • Rate limiting.
  • Adding common filters.
  • Hiding internal services.


To create our api gateway, we are going to use Netflix Zuul.
Just like with other Spring Boot libraries, this is just a matter of adding the dependency and the related configuration.

First of all let's generate the project by adding necessary dependencies.
<<IMAGE>>

You can add this block to the Maven pom.
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-zuul</artifactId>
</dependency>

Then we need to mark this service as our Api gateway by adding @EnableZuulProxy.
@EnableEurekaClient
@EnableZuulProxy
@SpringBootApplication
public class ApiGatewayApplication {

    public static void main(String[] args) {
        SpringApplication.run(ApiGatewayApplication.class, args);
    }

}

Now our gateway is ready. Let's add the port and the service name in the application yaml file.
server:
  port: 7060

spring:
  application:
    name: api-gateway

Then we need to tell it where the discovery service is by adding the Eureka properties.
eureka:
  client:
    registerWithEureka: true
    fetchRegistry: true
    serviceUrl:
      defaultZone: http://localhost:7050/eureka/

If we start our services now, the other services are available through the gateway by their service names.
http://localhost:7060/service-a/

Let's override this and define our own route.
zuul:
  ignored-services: "*"
  routes:
    service-a:
      path: /api/service-a/*
      serviceId: service-a

Now our service is available at http://localhost:7060/api/service-a/
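
Earlier I listed a common place to log requests and responses as one of the gateway's benefits. With Zuul that is done by declaring ZuulFilter beans, which the proxy picks up automatically. Here is a minimal pre filter sketch; the package, class name and log format are my own, not from the repository.

package com.slmanju.apigateway;

import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;
import org.springframework.stereotype.Component;

import javax.servlet.http.HttpServletRequest;

// Any ZuulFilter bean is registered with the proxy automatically.
@Component
public class RequestLoggingFilter extends ZuulFilter {

    @Override
    public String filterType() {
        return "pre"; // run before the request is routed to a service
    }

    @Override
    public int filterOrder() {
        return 1;
    }

    @Override
    public boolean shouldFilter() {
        return true;
    }

    @Override
    public Object run() {
        HttpServletRequest request = RequestContext.getCurrentContext().getRequest();
        System.out.println(String.format("%s request to %s", request.getMethod(), request.getRequestURL()));
        return null;
    }

}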


Wednesday, May 1, 2019

Kafka - Spring Boot

Greetings!

In this blog post I'm going to explain how to integrate Kafka with Spring Boot. We use Spring Boot configuration to send a Kafka message in String format and consume it. Let's begin.
(The complete example can be found here.)

Starting up Kafka

First of all we need to run a Kafka cluster. For this I'm using the Landoop docker image.
Here is the docker command to run the Landoop docker container.
docker container run --rm -it \
-p 2181:2181 -p 3030:3030 -p 8081:8081 \
-p 8082:8082 -p 8083:8083 -p 9092:9092 \
-e ADV_HOST=127.0.0.1 \
landoop/fast-data-dev

Generate the application

I'm using the IntelliJ IDEA IDE to generate the Spring Boot application. I have selected the web, lombok and kafka dependencies.
Let's rename application.properties to application.yml to use the yaml format.
Here are my application.yml configuration values.
server:
  port: 9000

spring:
  kafka:
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    consumer:
      bootstrap-servers: localhost:9092
      group-id: test-id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
It is pretty simple. I have specified the running Kafka instance url in bootstrap-servers. For the key and value serializers I use the built-in StringSerializer, and to deserialize the messages I use the StringDeserializer provided by Kafka.
  • bootstrap-servers - the Kafka server instance
  • kafka.consumer.group-id - the consumer group id which will be used by consumers.
  • kafka.consumer.auto-offset-reset - consumers will start reading messages from the earliest one available when there is no existing offset for that consumer.

Kafka Configuration

We have already configured the basic properties. In addition to that, we are going to create our topic.
@Configuration
public class KafkaConfiguration {

    public static final String TOPIC_NAME = "kafka-spring";

    @Bean
    public NewTopic topic() {
        return new NewTopic(TOPIC_NAME, 3, (short) 1);
    }

}
Kafka's AdminClient bean is already in the context. It will create a topic from the NewTopic instance, to which we have given kafka-spring as the topic name, 3 partitions and a replication factor of 1.

Produce Messages

Spring provides the easy-to-use KafkaTemplate to send messages to Kafka. We need to provide the topic name and our message.

@Component
public class KafkaMessageProducer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaMessageProducer.class);

    private final KafkaTemplate<String, String> kafkaTemplate;

    @Autowired
    public KafkaMessageProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String message) {
        LOGGER.info(String.format(":: Produce Message :: %s", message));
        kafkaTemplate.send(TOPIC_NAME, message);
    }

}


Consume Messages

With Spring's KafkaListener, we can easily consume messages by specifying topic name and group id.

@Component
public class KafkaMessageConsumer {

    private final Logger LOGGER = LoggerFactory.getLogger(KafkaMessageConsumer.class);

    @KafkaListener(topics = TOPIC_NAME, groupId = "test-id")
    public void consume(String message) {
        LOGGER.info(String.format(":: Consume Message :: %s", message));
    }

}


Test it

Now all is set. Let's create a simple endpoint to send a few messages.

@RestController
public class MessageController {

    private final MessageService messageService;

    @Autowired
    public MessageController(MessageService messageService) {
        this.messageService = messageService;
    }

    @PostMapping("/send")
    public void sendMessage(@RequestBody Message message) {
        messageService.sendMessage(message.getText());
    }

}

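The MessageService and Message classes used by the controller aren't listed in the post (they are in the repository); a minimal sketch that matches the calls above, assuming the service simply delegates to the producer, could look like this.

// MessageService.java
@Service
public class MessageService {

    private final KafkaMessageProducer kafkaMessageProducer;

    @Autowired
    public MessageService(KafkaMessageProducer kafkaMessageProducer) {
        this.kafkaMessageProducer = kafkaMessageProducer;
    }

    public void sendMessage(String text) {
        kafkaMessageProducer.send(text);
    }

}

// Message.java - request body for the /send endpoint.
public class Message {

    private String text;

    public String getText() {
        return text;
    }

    public void setText(String text) {
        this.text = text;
    }

}
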

curl -X POST \
  http://localhost:9000/send \
  -H 'Content-Type: application/json' \
  -d '{ "text": "Hello Kafka" }'


See the console. You should be able to see something like this.

2019-05-01 22:09:28.062  INFO 6045 --- [nio-9000-exec-4] c.s.k.message.KafkaMessageProducer       : :: Produce Message :: Hello World
2019-05-01 22:09:28.069  INFO 6045 --- [ntainer#0-0-C-1] c.s.k.message.KafkaMessageConsumer       : :: Consume Message :: Hello World


Navigate to http://127.0.0.1:3030 and select topics where you can see our topic.


References

https://spring.io/projects/spring-kafka

Monday, April 22, 2019

Docker - Dockerfile

Greetings!

We have used Docker images to create containers multiple times, pulling those images from Docker Hub. Ever wondered how to create a Docker image? Docker can build images automatically by reading the instructions from a Dockerfile.

Dockerfile

A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image. Think of it as a shell script: it gathers multiple commands into a single document to fulfill a single task.
The build command is used to create an image from the Dockerfile.

$ docker build .
You can name your image as well.
$ docker build -t my-image .

Let's first look at a Dockerfile and discuss what those commands are.
This one is extracted from the official MySQL Dockerfile.

FROM debian:stretch-slim

# add our user and group first to make sure their IDs get assigned consistently, regardless of whatever dependencies get added
RUN groupadd -r mysql && useradd -r -g mysql mysql

RUN apt-get update && apt-get install -y --no-install-recommends gnupg dirmngr && rm -rf /var/lib/apt/lists/*

RUN mkdir /docker-entrypoint-initdb.d

ENV MYSQL_MAJOR 8.0
ENV MYSQL_VERSION 8.0.15-1debian9

VOLUME /var/lib/mysql
# Config files
COPY config/ /etc/mysql/
COPY docker-entrypoint.sh /usr/local/bin/
RUN ln -s usr/local/bin/docker-entrypoint.sh /entrypoint.sh # backwards compat
ENTRYPOINT ["docker-entrypoint.sh"]

EXPOSE 3306 33060
CMD ["mysqld"]

As you can see, this is how you would install MySQL on a Linux machine. First we select our OS and install the necessary software, then configure the environment. All those instructions are added to the Dockerfile using Docker-specific commands.

Dockerfile Commands

  • FROM - specifies the base (parent) image.
  • RUN - runs a Linux command. Used to install packages into the image, create folders, etc.
  • ENV - sets an environment variable.
  • COPY - copies files and directories into the image.
  • EXPOSE - exposes ports.
  • ENTRYPOINT - provides the command and arguments for an executing container.
  • CMD - provides a default command and arguments for an executing container. There can be only one CMD.
  • VOLUME - creates a directory mount point to access and store persistent data.
  • WORKDIR - sets the working directory for the instructions that follow.
  • LABEL - provides metadata like the maintainer.
  • ADD - copies files and directories into the image. Can unpack compressed files.
  • ARG - defines a build-time variable.

COPY vs ADD

Both commands serve a similar purpose: copying files into the image.
COPY lets you copy files and directories from the host.
ADD does the same; additionally, it lets you use a URL as the source and unpack compressed files into the image.
The Docker documentation recommends using COPY.

ENTRYPOINT vs CMD

CMD - allows you to set a default command which will be executed only when you run a container without specifying a command. If the container runs with a command, the default command will be ignored.
ENTRYPOINT - allows you to configure a container that will run as an executable. The ENTRYPOINT command and parameters are not ignored when the container runs with command line parameters.
what-is-the-difference-between-cmd-and-entrypoint-in-a-dockerfile

VOLUME

You declare a VOLUME in your Dockerfile to denote where your container will write application data. When you run your container, you can specify its mount point using -v.
difference-between-volume-declaration-in-dockerfile-and-v-as-docker-run-paramet