Category: Architecture & Development

InHash – Visual Studio Code Extension to Compute Hashes and Checksums

I have just developed the Visual Studio Code version of InHash, a plugin already available for IntelliJ and Eclipse.

This extension computes hashes and checksums for files and for selected text.

It started as a necessity in a project I was involved in, and I thought: “why not create a plugin from it?”

Find it here: Visual Studio Code Extension to Compute Hashes and Checksums

Embedding Kotlin Playground

JetBrains provides plugins and tools to embed Kotlin in any website
– you can edit the code and run it by clicking the play button!

Just try it!

Error: java.lang.NoClassDefFoundError: com/fasterxml/jackson/annotation/JsonInclude$Value

java.lang.NoClassDefFoundError: com/fasterxml/jackson/annotation/JsonInclude$Value

If this error occurs, try adding the following dependencies to your pom.xml.
Usually Jackson needs these 3 dependencies:

  • databind
  • core
  • annotations

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.9.0</version>
</dependency>

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.9.0</version>
</dependency>

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.9.0</version>
</dependency>

JUnit 5 and Maven Dependencies

Straight to the point:

You need at least these dependencies:


<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-api</artifactId>
    <version>5.1.0</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-engine</artifactId>
    <version>5.1.0</version>
    <scope>test</scope>
</dependency>

There are other dependencies for IDEs and for older versions of JUnit, if you need them:


<dependency>
    <groupId>org.junit.vintage</groupId>
    <artifactId>junit-vintage-engine</artifactId>
    <version>5.1.0</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.junit.platform</groupId>
    <artifactId>junit-platform-launcher</artifactId>
    <version>1.1.0</version>
    <scope>test</scope>
</dependency>

<dependency>
    <groupId>org.junit.platform</groupId>
    <artifactId>junit-platform-runner</artifactId>
    <version>1.1.0</version>
    <scope>test</scope>
</dependency>

Jenkins Pipelines and the Blue Ocean plugin

With the big wave of DevOps, Pipelines are one of the most used words. Why? Because continuous integration and continuous delivery are about pipelines.

Jenkins is one of the most used components in a Continuous Integration environment, and its community decided to ride this wave too. How?

Well, by redefining how pipelines are configured and developed in Jenkins.

The new pipelines are developed using the Groovy language, which opens a very wide spectrum of configuration possibilities for the integration process. (so… who said the Groovy language was dying?)

There are two ways of developing pipelines in Jenkins: the Declarative and the Scripted way.

In the Scripted approach you use the full power of the Groovy language; however, the Blue Ocean plugin will not work as expected and you can’t visualize the pipeline. More on this later.

node {
    stage('checkout from git') {
        
        checkout([$class: 'GitSCM', 
                branches: [[name: '*/master']], 
                doGenerateSubmoduleConfigurations: false, 
                extensions: [], 
                submoduleCfg: [], 
                userRemoteConfigs: [[]]
        ])
        try {
            sh 'mvn test'
        } catch (e) {
            currentBuild.result = 'FAILURE'
            throw e
        } finally {
            cleanWs cleanWhenFailure: false
        }
    }

    stage('build the project') {
        // Scripted Pipelines call steps directly; the steps {} block is Declarative-only
        sh 'mvn -B -DskipTests clean package'
    }
}

But it’s the Declarative way that draws the pipeline flow as we expect.

This approach brings, first, an elegant code structure and, second, the use of the Blue Ocean plugin, with which we can visualize the pipeline flow, literally!

An example of a declarative pipeline can be something like this:

pipeline {
    agent none 
    stages {
        stage('checkout from git') {
            steps {
                checkout([$class: 'GitSCM', 
                    branches: [[name: '*/master']], 
                    doGenerateSubmoduleConfigurations: false, 
                    extensions: [], 
                    submoduleCfg: [], 
                    userRemoteConfigs: [[]]
                ])
            }
        }
        stage('build the project') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}

If you install the Blue Ocean plugin and run a declarative pipeline you will see something like this:
(this example was taken from the Jenkins website)

Jenkins Pipeline Blue Ocean Plugin

So, should I use Declarative or Scripted Pipelines?
– Scripted Pipelines give you the full power of Groovy, but you will not have the enhanced visualization.
– Declarative Pipelines are a little more restricted in code structure, but you can always use the “script { }” step to extend the use of Groovy. And you get the Blue Ocean plugin, something that has real value in a workplace, for instance.
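To illustrate that escape hatch, here is a minimal sketch of a Declarative Pipeline using a “script { }” step (the stage name, variable and echoed values are illustrative, not taken from a real Jenkinsfile of this post):

```groovy
pipeline {
    agent any
    stages {
        stage('example') {
            steps {
                // Inside script {} you get full Groovy: variables, loops, conditionals
                script {
                    def browsers = ['chrome', 'firefox']
                    for (b in browsers) {
                        echo "Testing on ${b}"
                    }
                }
            }
        }
    }
}
```

Note that whatever runs inside script {} is shown by Blue Ocean as a single step, so heavy use of it reduces the visualization benefit.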

The Jenkins website has a good tutorial to follow; please visit it to move forward with this technology.

I hope this post was enough to capture your attention and show how Jenkins can produce a really beautiful and useful visualization of a pipeline.

Tip: Java 8 Streams flatmap map lists

A little example, which I had to develop as a teaching use case, of how to take a map with Integer keys and List values and find a specific value in any of the lists of the whole map, using Java Streams.

boolean b1 returns true because the element exists
boolean b2 is the response to an element that doesn’t exist
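The original snippet is not reproduced here, so the following is a minimal sketch of that lookup (the map contents and the searched values are illustrative):

```java
import java.util.List;
import java.util.Map;

public class FlatMapSearch {
    public static void main(String[] args) {
        // A map with Integer keys and List values
        Map<Integer, List<String>> map = Map.of(
                1, List.of("apple", "banana"),
                2, List.of("cherry", "date"));

        // flatMap flattens every list into one stream, so anyMatch
        // can search across all the lists of the whole map at once
        boolean b1 = map.values().stream()
                .flatMap(List::stream)
                .anyMatch("cherry"::equals); // true: the element exists

        boolean b2 = map.values().stream()
                .flatMap(List::stream)
                .anyMatch("mango"::equals);  // false: no such element

        System.out.println(b1 + " " + b2);
    }
}
```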

Go Language, First Steps

Go, first experience.

What

We are living in a very rich IT ecosystem, where developers aren’t afraid to explore new languages and tools and to use out-of-the-box thinking. It’s a great time to be living in this IT era! Following this trend, and due to a need I had, I started to learn
a new language: Go.

Go, also called golang, was developed at Google. The motivation was to have a language less complex than C++.

Go has great advantages, such as:

  • A concurrency system far superior to Python’s, for instance
  • Compiled language, which makes it faster
  • The executable is statically linked, which allows creating an executable for the target platform. It means it doesn’t need an interpreter
  • Explicit error handling
  • Low memory footprint
  • Very easy to learn
  • Docker is written in Go

Starting

Installation

Just visit the Golang website.
The binaries are here.

Baby steps

First things first: IDEs

Learning

I started with the “official” tutorial A Tour of Go

The big points:

  • Every Go program is made up of packages
  • Programs start running in package main
  • By convention, the package name is the same as the last element of the import path
  • In Go, a name is exported if it begins with a capital letter
  • In functions, the type comes after the variable name
  • When two or more consecutive named function parameters share a type, you can omit the type from all but the last
  • A function can return any number of results
  • Go’s return values may be named. Also called naked return
  • The var statement declares a list of variables (package or function level)
  • Inside a function, the := short assignment statement can be used in place of a var declaration with implicit type
  • The expression T(v) converts the value v to the type T (casting)
  • Constants are declared with the const keyword
  • loop: for – like C, Java, C++ but without the parentheses
  • The while in Go is spelled for. for {} = infinite loop
  • if doesn’t need the parentheses
  • the if allows a statement before the condition: if v := math.Pow(x, n); condition {
  • switch doesn’t have the “break”, it is implicit
  • switch without a condition is the same as switch true. Good for long sequences of “if”
  • A defer statement defers the execution of a function until the surrounding function returns. defer add(2,3) PS: the call’s arguments are evaluated immediately
  • Go has pointers!
  • The type *T is a pointer to a T value. Its zero value is nil.
  • The & operator generates a pointer to its operand.
  • Unlike C, Go has no pointer arithmetic.
  • A struct is a collection of fields “type Complex struct”
  • arrays: var a [10]int
  • slices: var s []int = primes[1:4] – A slice does not store any data, it just describes a section of an underlying array
  • A slice has both a length (number of elements it contains) and a capacity (the number of elements in the underlying array)
  • Slices of slices
  • A map maps keys to values: var m map[string]int
  • Go functions may be closures. return func(x int) int { return x }
  • Go does not have classes. However, you can define methods on types. func (v Vertex) Abs() float64 {}
  • An interface type is defined as a set of method signatures. type Abser interface {Abs() float64}
  • Interfaces are implemented implicitly
  • The empty interface – may hold values of any type
  • A type assertion provides access to an interface value’s underlying concrete value. t := i.(T)
  • Type switches: switch v := i.(type) { case T:…
  • Go programs express error state with error values. type error interface {…
  • The io package specifies the io.Reader interface.
  • A goroutine is a lightweight thread managed by the Go runtime. go f(x, y, z)
  • Channels are a typed conduit through which you can send and receive values with the channel operator, <-.
  • Like maps and slices, channels must be created before use: ch := make(chan int)
  • Channels can be buffered.
  • The select statement lets a goroutine wait on multiple communication operations.

Conclusion

As a first impression, I would say it’s a very good language for scripting. No complications around object-oriented concepts. It just takes the best of native languages and of modern languages. Reading some articles, everyone seems to share the same opinion: Go is very
well suited for concurrent applications, since it’s very easy to use channels, select and goroutines.
For sure, a very good challenger to my usual preference, Python.

dynamic and ExpandoObject

The dynamic and ExpandoObject have a relation of love and hate with .NET developers. One side strongly defends typed objects to handle data, so it’s possible to catch a lot of issues at compile time. On the other side are developers used to writing scripts, in Python and other similar languages. They defend the dynamic nature of current projects, with a lot of XML, JSON and unstructured data, so they prefer to use the more dynamic and functional structures of C#. Both are right! Some like to live strongly typed and predictable; others like to live on the edge!

Personally, I am a hybrid developer: I like strongly typed languages, but I love Python and the dynamic freedom that C# provides. So it’s usual for me to use strongly typed objects in my projects, but every time I have to deal with unstructured data, I use the dynamic keyword and other constructs like the Tuple class.

Because sometimes there are too many DTOs on the project!

ExpandoObject Basic Usage

ExpandoObject with methods

Reflection on the ExpandoObject

Why non-requirements are “more” important than requirements

When I was studying software architectures more deeply, I learned about the most important thing to consider in an architecture (after stakeholders): the non-functional requirements. Yes! Not the functional requirements, but the non-functional requirements.
Before I explain this more carefully, let me say what “I” call requirements and non-requirements. (I will use requirements from now on, meaning functional requirements, and the same applies to non-requirements.)
Requirements are the features the final customer wants. Something like: I want an intranet portal where all the enterprise information is presented, where the users can consult news about the company and where the users can request material, etc. Preferably very detailed; the project manager and all the developers will thank you for that.
Non-requirements are the “features”/attributes not related to the business, but very important, like performance, reliability, modifiability, security, auditing, transactionality, availability, interoperability, testability, usability and others.
Sometimes people mix requirements with non-requirements, and sometimes they are right. If I want to build a race car, performance is probably a requirement.
The non-requirements are also called quality attributes because they give quality to the architecture. You can build an ugly/raw intranet portal with all the requirements in there. You will have all the information and all the requirements you requested to be implemented, but as soon as you present the final product to the end users you will find out why the non-requirements are so important. Probably you will soon call a designer to bring some quality to the web pages: Usability. Then you will start to hear complaints about the performance: a user opens a page and all that information takes a long time to appear. It is time for the non-requirement Performance. The users need important information available on the portal to do their work along the day, so you will want Availability. The portal should access information available in other systems, and then again, Interoperability, and we can go on and on about this.
Why is this important, and really important? These non-requirements or quality attributes change the way something is built. They should be considered before the development, for many reasons: technical, time, resources and financial, to say the least. Technical, because the architect should consider them in the design and development process. Time, because it will take more time to implement them than to just code the raw functionality. Resources, because you will need to consider people expert in some areas, like designers, integration, etc. Financial, as a consequence of the others.
For instance, for usability you should consider experts in that area, like designers. For security, one of the most intrusive quality attributes, you probably need an expert in that difficult area, or you must consider which blocks of code or modules should be protected and how. It can be a demanding task. For interoperability, you will have to consider what information to consume and provide, protocols, security, etc. If you work on financial software, you will surely want to test the software deeply, so the code and the architecture should be testable, the code will be organized for testability, and a lot of extra code will be implemented for testing purposes.
When you buy a house, the attributes you will take into your decision are more about quality than about the bare living ones. If you followed only the living ones, any space with some blocks adjacent to each other would be enough for living! But you will consider the access (security), how easy it is to move around the house (usability), how strong the house is: walls, materials, etc. (reliability). And if the house is custom built for you, most of the time you will be speaking to the architect about considering this and that quality attribute; you will not be telling him that the house is just for living there, sleeping and eating. You will want more, you will want quality, you will want a house based, above all, on non-requirements, responsible for making your expected requirements richer.
Curiously, the acceptance of any project will be based on how the requirements were implemented, served by the richness brought by the non-requirements. At least if you want a happy customer.
Without the non-requirements, the requirements will, paradoxically, not be “acceptable” to the final customer.

Architecture – It’s All About Stakeholders – Part III


Part I of this series introduced the importance of the stakeholders in the architecture. In Part II we learned how to communicate with them, using views and viewpoints. We will now focus on the concept of perspectives: the non-functional requirements in our architecture, usually the most important ones, like security, audit and logging.

Perspectives

In the previous parts, we talked about architecture elements and how to show them to stakeholders in a way understandable by all the parties. By now we have an initial structure of the whole system, but it misses the most important things in the architecture, and usually the most challenging to include: the quality attributes, sometimes called cross-cutting concerns, like security, performance, logging, audit, scalability, etc.
These attributes of the architecture are always very challenging to include because they are orthogonal to the architecture. For instance, logging is something we want all over the architecture, probably in all modules, tiers, layers, etc., which implies changing all the views already developed.
The name given to these attributes in the context of an architecture is a little bit controversial, because there are entities that consider these attributes additional views of the architecture, and other entities that consider them something that changes the views, orthogonal to the views, something that complements and can change an entire viewpoint. So it is natural to find these cross-cutting concerns named either views or perspectives. Personally, I like the term perspective, because these attributes are not another kind of view of the architecture, but the application of “quality” to the architecture, the insertion of elements that enrich the system. Like in a house: besides the construction, we can improve the whole house with a better light system, internet all over the house, better walls, toilets with material that doesn’t rust, etc.
Why is this important for stakeholders, and why do we need them? Because they are usually the most important concerns! For instance, in a bank, security is probably the most important attribute, and I bet almost every requirement includes security, explicitly or implicitly. The people who give support surely want logging and auditing to help them find the origin of issues, or to find someone who tried to access a resource illegally. In a website like Amazon, the end users surely want a very performant website, and so Amazon wants a scalable system, so that it can grow big and in a smooth way.
The quality requirements are the core of the requirements. They give color and form to the functional requirements. You can create a web page to register a user, but you will want to include audit, logging, security (like HTTPS) and manage load balancing and scalability, not forgetting about sticky sessions or another architecture tactic (different from patterns).
With all these views, viewpoints and perspectives, you may think the architect will have a fragmented view of the whole architecture, which is true, but someone who wants to be an architect must be prepared for this challenge: find the best way to build a system with all these pieces. That’s why he/she will be an architect. It is not only about the technical skills, but about choices and stakeholders.
