Wednesday, June 18, 2014

Why Should We Dump the Java EE Standard?

Prologue

I never thought I would have to write about this topic again, but a couple of months ago I had to justify the decision to use Spring Framework in an enterprise environment. The decision to be made was whether to go with Java EE or with Spring Framework. The constraints were very clear:
  • We need to support both batch and web apps.
  • For web apps we can only use an old OAS (Oracle Application Server) with Java EE 1.4.
  • Commercial support has to be available for the framework and the runtime environment.
  • We are going to update the app server to Java EE 6 or 7 soon, so a smooth migration should be possible.
  • We would like to minimize the use of our homegrown frameworks.
Still, there was quite a quarrel in the team, because some Java EE web apps already run in OAS upgraded with JBoss Weld CDI 1.0. Normally JBoss Weld 1.0 won't run out of the box in OAS, but with some patches we got it running. The problem with all those web apps is that we had to write a lot of our own CDI extensions to provide enterprise features (declarative transaction management for services, caching, security) which have been part of Spring Framework for a long time.

So it was time to google the topic "Java EE vs. Spring Framework" to see whether the dispute between them still exists.

Spring Framework Bashing 2012 and 2013

Googling turned up a lot of Spring Framework bashing from the years 2012 and 2013. Some blogs and articles say that with the existence of Java EE - especially Java EE 7 - we don't need, or should even avoid, Spring Framework. One of the most controversial articles is this one: Why is Java EE 6 better than Spring? Luckily there are also some counterparts, like Java EE a tough sell for Spring framework users and Java EE vs Spring. Or: What is a standard?

So the topic is still hot!

Based on these discussions I needed to form my own picture, so let's look at the two important aspects of enterprise application development: programming models and runtime environments.

Programming Models

The following enterprise application types should be supported by both stacks, Java EE and Spring Framework:
  • Web apps
  • Batch apps
At last, Java EE 7 supports both application types. Although its batch support is still very young, you should be able to define centralized business logic and use it in both web and batch apps. The biggest problems I see with Java EE 7 so far are:
  • Since Java EE 7 and JTA 1.2, a business logic component with transaction demarcation no longer needs to be an EJB component. But what about asynchronous method execution? For this purpose you still need Message Driven Beans (MDB) in an EJB container, so EJB is still alive. This is where Spring Framework has its advantage: everything - business logic components with asynchronous method execution, utilities - is always a POJO, no difference at all, and it is available today. The counterpart of MDB in Spring Framework is the MDP (Message Driven POJO).
  • Security mechanisms like authentication and authorization in Java EE 7 (JAAS) are still inflexible. If you use JAAS, you depend on the chosen Java EE container. In contrast, Spring Security is flexible and can be used in every runtime environment available. The good news is that Spring Security can be integrated easily into Java EE 7 apps.
  • Spring Batch 3.x supports the Java EE 7 batch standard, and since it is the longest-standing batch framework available on the market, you will probably use Spring Batch as your Java EE batch implementation. It becomes additionally complex if I use Spring Batch 3.x and need to reuse business logic components written as EJBs: do I have to run my Spring Batch app within an EJB or Java EE container? Using pure Spring Batch makes everything simpler and easier.
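Spring's Message Driven POJOs normally sit on top of JMS, but the POJO principle itself needs nothing beyond the JDK. Below is a hedged, self-contained sketch (not Spring's API; the BlockingQueue stands in for a JMS destination, the consumer loop for the listener container) of a plain class whose onMessage method is invoked asynchronously - no MDB interface, no EJB container:

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

// A "message-driven POJO" in miniature: a plain class, no MDB interface,
// whose onMessage method is called asynchronously by a consumer thread.
public class OrderListener {
    final List<String> processed = new CopyOnWriteArrayList<>();

    public void onMessage(String message) {    // just a method on a POJO
        processed.add("processed:" + message);
    }

    public static List<String> run() {
        BlockingQueue<String> destination = new LinkedBlockingQueue<>();
        OrderListener listener = new OrderListener();
        ExecutorService consumer = Executors.newSingleThreadExecutor();
        consumer.submit(() -> {                // stands in for the JMS listener container
            while (!Thread.currentThread().isInterrupted()) {
                listener.onMessage(destination.take());
            }
            return null;
        });
        try {
            destination.put("order-1");
            destination.put("order-2");
            Thread.sleep(200);                 // give the consumer time to drain the queue
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        consumer.shutdownNow();
        return listener.processed;
    }

    public static void main(String[] args) {
        System.out.println(run()); // [processed:order-1, processed:order-2]
    }
}
```

In real Spring code the consumer loop disappears: the listener container calls your POJO's method for each incoming JMS message, configured declaratively.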

Runtime Environments

An application server is the platform or runtime environment where your Java EE applications are deployed and executed. Looking at the Java EE application server market and what happened last year, you will notice that the number of platforms where you can run Java EE applications has become very small:
  • Open Source application servers with commercial support at a reasonable price can only be found with JBoss / Wildfly and TomEE. Also, a Java EE batch app needs a Java EE container.
  • Apps based on Spring Framework can run everywhere (in pure Servlet containers or Java EE containers like Jetty, Tomcat, VMware vFabric tc Server - now Pivotal tc Server - JBoss, Wildfly, TomEE), as long as your apps can access implementations of services like JMS, transactions and caching. Spring Batch apps can even run on plain Java SE.

Your Options

So if you need to start an enterprise project in 2014, what kind of enterprise platform will you use? If you want to write simple, secure enterprise apps (web and batch) with a single Java programming model which can be executed in many runtime environments, Spring Framework is still the one and only choice you have.

The Problem with Java EE and My Solution: "One Runtime Environment with Standardized APIs and Simple Implementation Dependencies"

The idea of standardizing some mechanisms through Java APIs is fine. It is good to standardize the persistence mechanism with JPA, dependency injection with CDI, messaging with JMS, the batch programming model with JSR-352, and others. What we don't need is the umbrella standard Java EE, which bundles a lot of those APIs together. Take a look at this page to see the content of Java EE. So what do we actually need?
  • We only need one runtime environment standard, and that is what we know today as the Servlet container (Tomcat, Jetty and others). This is the one and only application server or operating system we need for our enterprise applications.
  • We already have all those standardized APIs like JPA, JMS, CDI, Batch and others. The specifications of those APIs should be completely loosely coupled. In the end, as a user you want to mix and match those APIs as you need them. The implementations of those APIs can be provided, as today, by normal frameworks like Hibernate (JPA), ActiveMQ (JMS), JBoss Weld (CDI), Spring Batch (Batch) and others.
That's it! There is no need for Java EE runtime environments like JBoss, WebLogic or WebSphere. Of course they can still bundle all those frameworks together just as they do today, but the most important point is that you can mix and match implementations with different specification versions. Today that is impossible: if you have a Java EE application server which supports Java EE 1.4, it is almost impossible to use JPA 2.1.

Also, if you have a lot of web and batch apps in production, you cannot update all of them at once; updates have to be planned carefully. If you have a Java EE 1.4 container in production, you will stick to it for a long time, since your operations team won't accept running different kinds of containers - for example Java EE 1.4 and Java EE 7 - side by side in production. If you want to move to a new Java EE 7 container, you need to migrate all of your web apps at once, and in a normal enterprise situation this is almost impossible. You can only update a web app within a project, and you have limited resources to execute that project. So using just a simple container and putting all the implementation dependencies into the web app is the way to go. This way you can use up-to-date APIs in some new web apps without having to update all the other web apps first.

To conclude: the umbrella Java EE specification, which contains all those APIs, makes everything more complex and makes updating to newer API versions very slow.

Spring Framework has supported the idea of one runtime environment and mix-and-match APIs since the beginning:
  • You can use any runtime environment or container which supports the Servlet specification, like Tomcat, Jetty, JBoss or others.
  • You can mix and match all those standardized APIs like JPA, JMS, Batch (JSR-352) and CDI (not completely, but some of the specs like JSR-330 and JSR-250).
  • To use the APIs, you include the implementations of the API specifications yourself, using the standard dependency mechanism for Java.
  • You get a lot of nice helpers to "glue" the APIs together and build a nice programming model on top.
So the ideal situation would look like the picture below:
  • Web app: the runtime environment (Web Container) does not include the API implementations. The web app includes the dependencies itself (WEB-INF/lib directory).
  • Batch app: no special runtime environment, just a standard JVM. The batch app includes the dependencies itself (-classpath in the Java execution parameters).
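Concretely, "including the dependencies itself" just means declaring the API implementations in the app's own build. A sketch of what such a web app's pom.xml might contain (the artifact versions here are illustrative, not a recommendation):

```xml
<!-- Illustrative only: the app declares its own API implementations, -->
<!-- so they end up in WEB-INF/lib instead of coming from the container. -->
<dependencies>
  <!-- JPA implementation chosen by the app, not by the app server -->
  <dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-entitymanager</artifactId>
    <version>4.3.5.Final</version>
  </dependency>
  <!-- JMS implementation -->
  <dependency>
    <groupId>org.apache.activemq</groupId>
    <artifactId>activemq-client</artifactId>
    <version>5.9.0</version>
  </dependency>
  <!-- Only the Servlet API is expected from the container -->
  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.0.1</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

The Servlet API is the only dependency marked provided, i.e. supplied by the container; everything else travels with the app and can be versioned per app.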

Epilogue

In my opinion, enterprise development in Java has to go in the direction described above:
  • Drop the umbrella standard Java EE.
  • Concentrate on one and only one runtime environment specification: the Servlet specification.
  • Concentrate on standardizing APIs like JPA, JTA, JMS, CDI, Batch and others, and make them loosely coupled. Let the versions of the APIs be mixed and matched.
  • Use the standard dependency mechanism of Java to include the implementations within the web and batch apps.
  • Don't forget the Security APIs - just copy them from Spring Security, analogous to the Batch APIs being derived from Spring Batch.
So in the end, as a user, we have one and only one runtime environment (container). The rest is just standardized APIs (JPA, JMS, CDI, Batch, Security) with their implementations (Hibernate, ActiveMQ, Spring DI, JBoss Weld, Spring Batch, Spring Security), which can be mixed and matched as you need them. With this style you can update the version of a particular specification without having to update the whole runtime environment and all the APIs in one step. A newly developed app can use up-to-date APIs and their implementations, and updates of older web and batch apps can be planned carefully without the need to update all apps at once.

In the end we chose Spring Framework. We will use all available standardized APIs with the best implementations we can get. To be able to mix and match the API versions in each web and batch app, we will manage the dependencies within the apps themselves using the standard Java mechanism.


Updates: 
  • 27.06.14: The term should be Java EE instead of JEE. Please see: http://goo.gl/8xfMsJ and http://goo.gl/NnCHy4
  • 27.06.14: In the meantime you can follow the discussions about this topic at JavaLobby (http://goo.gl/KmqeWm) and TheServerSide (http://goo.gl/oQmWRG)

Sunday, April 27, 2014

Smart Home Sweet Home with Gigaset Elements?

Introduction
Last week I had a chance to try the Smart Home solution from Gigaset Elements. I had read some articles about this product saying how easy it is to install, even for novice users. Those articles piqued my interest, and I began to google Smart Home products.

In this article (in German) you will find a nice overview of some Smart Home products which can be bought in Germany. The Nest product from Google is still not available in Germany, although it seems that RWE will offer Nest products in Germany in a couple of months.

The installation of Gigaset Elements was really easy. The only problem I encountered was adding the siren sensor: I had to press the button on the siren sensor hard at the same time as the button on the base, so that they could communicate with each other. After about one hour I managed to install everything properly.

Points to mention
Generally the idea is very nice, and Gigaset Elements tries to follow the KISS (Keep It Simple, Stupid) principle. However, I found some points that Gigaset Elements needs to optimize:
  • Reliability: The cloud has already been down twice, for almost a whole day each time. See this Facebook discussion and the Gigaset Elements blog. Again: twice, and during that time you are simply offline. A no-go for a cloud solution.
  • Security: Gigaset Elements web access only offers username and password login, no two-step verification. Gigaset Elements would not even have to offer two-step verification via SMS; it would be enough to offer this authentication method with an app like Google Authenticator. It is pretty easy to integrate Google Authenticator into your own web apps, and in this article you can find a lot of web apps which already support it.
  • Openness: Gigaset Elements should offer open APIs (Application Programming Interfaces) for all users - just take the Google+ APIs as an example. In the Gigaset Elements context they should offer both types of APIs: Inside-Out and Outside-In integration services and extensions. It would be pretty useful to be able to extend the capabilities of Gigaset Elements with something like "turn on the holiday mode in 30 minutes" or "holiday mode ends on February 3rd, 2014 at 4pm". All this would be possible if there were open API access to their services.
  • Freedom of Choice: A cloud solution and a stand-alone solution should both be easily possible. In the end, the web services, the web app and the mobile apps (Android and iOS) are just normal applications. I assume that Gigaset Elements uses the following application architecture (see Gigaset Elements System Architecture below). There would be no problem offering all the web services as a stand-alone product. Gigaset Elements should also open source all of the apps (web services, web, Android, iOS), so that a bigger community can make them better. At the same time, every user would have the choice of hosting the apps and services themselves or using the cloud installation from Gigaset Elements. Not everyone would host their own apps and services!
Gigaset Elements System Architecture
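Integrating an authenticator app really does boil down to implementing the HOTP/TOTP algorithms (RFC 4226 / RFC 6238) that Google Authenticator uses, which fit in a few lines of JDK-only Java. A hedged, illustrative sketch of the algorithm itself, not Gigaset's code:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

// Minimal HOTP (RFC 4226) sketch; TOTP as used by Google Authenticator
// (RFC 6238) is the same computation with a time-derived counter.
public class Hotp {
    static int hotp(byte[] secret, long counter) {
        try {
            // HMAC-SHA1 over the 8-byte big-endian counter
            Mac mac = Mac.getInstance("HmacSHA1");
            mac.init(new SecretKeySpec(secret, "HmacSHA1"));
            byte[] hash = mac.doFinal(ByteBuffer.allocate(8).putLong(counter).array());
            int offset = hash[hash.length - 1] & 0x0F;      // dynamic truncation
            int binary = ((hash[offset] & 0x7F) << 24)
                    | ((hash[offset + 1] & 0xFF) << 16)
                    | ((hash[offset + 2] & 0xFF) << 8)
                    | (hash[offset + 3] & 0xFF);
            return binary % 1_000_000;                      // six-digit code
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    static int totp(byte[] secret, long epochSeconds) {
        return hotp(secret, epochSeconds / 30);             // 30-second time step
    }

    public static void main(String[] args) {
        byte[] secret = "12345678901234567890".getBytes(StandardCharsets.US_ASCII);
        System.out.println(hotp(secret, 0)); // RFC 4226 test vector: 755224
    }
}
```

Calling totp(secret, System.currentTimeMillis() / 1000) yields the code the authenticator app currently displays for the same shared secret, so the server-side check is a simple comparison.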

Facts, APIs
Looking into the web app of Gigaset Elements, you will find that it uses a modern architecture with RESTful web services and JavaScript (AngularJS) for the front end. The following API calls (service calls) from the JavaScript to the RESTful services can be seen:

https://api.gigaset-elements.de/api/v1/me/events?limit=25
https://im.gigaset-elements.de/identity/api/v1/user/info

https://im.gigaset-elements.de/identity/api/v1/openid/provider/?openid.return_to=...
https://api.gigaset-elements.de/api/v1/auth/openid/checkauth?gigaset.return_to=...

It is very good that Gigaset Elements uses OpenID and OAuth to identify the user and to control access to the APIs. This also means that they could easily open their APIs to the public, because the infrastructure is already available.

It is also interesting to see how the base accesses their web services. After plugging the network cable from my laptop into my switch and using the man-in-the-middle software for Windows called Cain and Abel together with Wireshark, you can see the following calls just after you make a move in front of your motion sensor:

https://api-bs.gigaset-elements.de/...

Wireshark Analysis from base to router

Smart Home Sweet Home?
This is just the beginning of Smart Home solutions. Gigaset Elements with its sensors is a great KISS idea. The more sensors (video cam, smoke detector, water sensor, programmable power outlet, programmable thermostat, etc.) become available on the smart home market, the more users will be attracted to try such solutions in their houses. Can you imagine what you can do with a Gigaset Elements HD video cam, which will be available in summer 2014?

One important thing for all Smart Home vendors, including Gigaset Elements: the one which offers an "open system" will always win! So Gigaset Elements, please hear my voice, open your system and make your applications Open Source. Your chances of selling millions of sensors worldwide will be better for it!

Monday, February 03, 2014

Nobody Can Save Microsoft's Mobile and Tablet Devices, Not Even Sundar Pichai

As we heard from the news, it seems that Sundar Pichai is a hot candidate for Microsoft CEO. In my opinion it doesn't matter who becomes Microsoft CEO: nobody can save Microsoft with all those Windows mobile technologies. Why?
  • Windows phones and tablets with their operating system are not a bad thing, but who needs yet another mobile or tablet device? We have enough offerings from Apple with iOS and from the Android camp. I know that in the beginning everybody said "who needs another browser", and in the end Google Chrome became very successful thanks to Sundar Pichai. But one thing makes a big difference here: Google Chrome is an Open Source product. If Sundar could build an Open Source Windows operating system, then we would see a different story. Open Source and Microsoft is just a tough story, although they offer CodePlex for hosting Open Source projects. But wait a second: isn't Microsoft Xbox very successful, hitting Sony PlayStation and Nintendo? Yes, and this is analogous to Google Chrome: Xbox competes with PlayStation and Nintendo on the same level, just like Google Chrome competes with Firefox. But Windows Mobile is completely closed and controlled by Microsoft, whereas Android is open and can be used by everyone. This makes a huge difference.
  • Windows on the desktop is still the one and only product Microsoft can be proud of. Apple has done a great job getting developers to use MacBooks, but normal users still prefer Windows, so not much will change in this area. What is changing is that normal users work with tablets instead of notebooks or desktop PCs, and this is the challenge Microsoft will face in the coming years. What can Sundar Pichai do about this? Absolutely nothing.
  • The environment (people, culture) where you work plays a huge role in the success of your product. This is a fact. Marissa Mayer was successful at Google; look at how she is doing at Yahoo now - not really wow. Why? Because it is simply a different working environment! One person can never save a company. So how could Sundar Pichai rescue Microsoft and its mobile strategy? He cannot; he would just end up looking bad, as bad as his ex-colleague Marissa Mayer.
So what should Microsoft do about their mobile strategy? Forget it, close it down. You cannot be the leader in this area; it is just too late, the same as for BlackBerry and Nokia. Sometimes it is wise to know and accept your weak points. Concentrate on other fields like PC operating systems, office, cloud and game consoles, although in some of these areas you will also get pressure from Google, Amazon, Apple and co. One thing you could try is to join Android development and build the best Android apps for your products.

For us developers, history has already shown many times that nobody wants to use Microsoft development tools and languages for serious enterprise development. Remember the story of copying Java with Microsoft's own Java (Visual J++ and later J#)? Or would you want to use C# or Visual Basic for your web app?

In this sense, Google will lose Sundar as an important person who has done a lot of great stuff with Chrome and other Google products. But Google will surely find another intelligent person who will love to work there and make all those products better and better. Don't forget: all the talented engineers are still there, and they are the most important part of Google.

Update: 
Now we all know that Satya Nadella is the new CEO of Microsoft. If I were in his position, I would do the following things:
  • Follow Oracle's way. Oracle has supported Java since its beginning, although Oracle itself is not the creator of Java. The result of simply following and embracing Java is amazing: Oracle has become the best implementor and integrator in the Java world (JRockit, JDeveloper and Java in Oracle Database, to name some products). Oracle also managed to consolidate its technologies around Java as the main path; the takeover of Sun was just the logical way to continue supporting Java. Microsoft could follow this way: support Java as the main language and runtime environment in all Windows operating systems. Forget .NET, integrate Java directly into Windows and become the best operating system for Java. Also expose all the Windows APIs directly to the Java world.
  • Follow Samsung's way. Samsung has supported Android with Java since its beginning, and today Samsung is the best mobile and tablet device vendor supporting Android. Microsoft could build Nokia devices with the Android operating system and become the best Android mobile and tablet device vendor.
Doing both would open up the world of Microsoft, its Windows operating system and all those Nokia and Xbox devices. Everything would be based on Java. Write Once Run Everywhere - or, more importantly, Learn Once Use Everywhere - would become reality, and Microsoft would win countless developers and supporters for Microsoft's new world.

In this sense, good luck to the new CEO of Microsoft, Satya Nadella, and don't forget that open systems will always win.

New Article at heise.de Developer: (Bi)Temporal Data Management in Java with Open Source Frameworks

If you ever need to handle (bi)temporal data in your Java app, check out my new article (in German) at heise.de: (Bi)Temporal Data, Implementation in Java with Open Source Frameworks: http://goo.gl/g8Lec4

Actually, you almost always need to handle this topic in your Java business apps, so it is a must-read. All the examples can be found at GitHub: http://goo.gl/30Xy15

Have fun!

Wednesday, October 09, 2013

TigerTeam TRIMM - Model Driven Generator just went Open Source

TigerTeam TRIMM – Model Driven Generator was open sourced in July 2013. I had never heard of this product before and found it by coincidence while trying to find a standalone JPA 2 cartridge / generator on the Internet. I know that AndroMDA has some cartridges to persist Java objects, but as far as I know AndroMDA only supports persistence in the context of EJB 3, not standalone JPA.

After I found this product I began to analyze the code at Bitbucket.org and found that the idea of Events and Listeners as extensions is a very good one. TRIMM also uses Maven throughout and has no dependencies on Eclipse plugins, so you are Eclipse-independent. A specialty of this framework is that it uses its own metamodel rather than the UML2 metamodel for the generators. TRIMM works with MagicDraw and Enterprise Architect and can read the files directly from those UML tools. It also uses its own Java code generator, which is basically based on String output. A very good thing is that TRIMM offers some ready-to-use cartridges like Java, JPA 2 and Webservice.

It seems that TRIMM does not suffer from the problems I mentioned in my older Java Lobby article and uses almost the same approach as KissMDA. The following are the differences I can see at a glance:

(1) TRIMM uses its own Java generator, which is basically based on String output, and this is not the best way to generate Java code. Using Eclipse JDT is a lot easier, and you always get syntactically correct Java code which can be easily unit tested, as in this KissMDA example.

(2) It is interesting to compare the transformers for Java code generation. In TRIMM you have the class JavaGenerator.java; in KissMDA you use the class SimpleJavaTransformer.java. Both are the main entry points for Java code generation, and both classes look quite similar in how they do their work. Using a DI framework like Guice in KissMDA makes the code more readable and easier to understand.

(3) Unit tests are crucial in KissMDA. The simple Java cartridge meanwhile has about 35 test cases; this is possible because we use Mockito extensively. The TRIMM Java cartridge also has some unit tests, but not many in comparison with the available implementation classes.

I really like the idea of using Events and Listeners to plug in extensions. In this context I will try to write the next KissMDA cartridge - a JPA 2 cartridge - using Events, an Event Bus and Listeners. As a foundation I will use this easy and simple Event Binder framework: GWT Event Binder. In the meantime I found some Open Source Event Bus implementations like Guava and MBassador; this is a good blog comparing those Event Bus frameworks: Java Event Bus Library Comparison. My choice is MBassador, because it supports weak references in the Event Bus.
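To make the Events-and-Listeners extension idea concrete, here is a deliberately tiny, hand-rolled event bus in plain Java. This is not TRIMM's, KissMDA's or MBassador's actual API (those offer far more, e.g. MBassador's weak references); it just shows why the pattern makes generators extensible:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// A toy event bus: cartridge extensions subscribe to event types which
// the generator publishes, so new behavior can be plugged in without
// touching the generator itself.
public class TinyEventBus {
    private final Map<Class<?>, List<Consumer<Object>>> listeners = new HashMap<>();

    @SuppressWarnings("unchecked")
    public <E> void subscribe(Class<E> eventType, Consumer<E> listener) {
        listeners.computeIfAbsent(eventType, k -> new ArrayList<>())
                 .add((Consumer<Object>) listener);
    }

    public void publish(Object event) {
        List<Consumer<Object>> subs = listeners.get(event.getClass());
        if (subs != null) {
            subs.forEach(l -> l.accept(event));
        }
    }

    // Hypothetical generator event
    public static class ClassGenerated {
        public final String className;
        public ClassGenerated(String className) { this.className = className; }
    }

    public static void main(String[] args) {
        TinyEventBus bus = new TinyEventBus();
        List<String> log = new ArrayList<>();
        // An "extension" listening for generated classes
        bus.subscribe(ClassGenerated.class, e -> log.add("generated " + e.className));
        bus.publish(new ClassGenerated("Customer"));
        System.out.println(log); // [generated Customer]
    }
}
```

The generator never needs to know which extensions exist; it only publishes events, which is exactly the loose coupling that makes cartridges pluggable.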

Congratulations to the TRIMM team - I think it is a very good move to open source TRIMM. I wish all the best for the future of TRIMM!

Monday, July 01, 2013

CDI vs. Spring Framework Core

Another question I got in my Spring Framework training was: is it worthwhile to switch from Spring Framework 3.x to CDI with the JBoss Weld or Apache OpenWebBeans implementation?

After googling around I found some good articles and blogs about this topic, two of which stood out in my opinion. Luckily I also had the chance to take an intensive two-day CDI course to see what we can actually do with CDI. After those two days I can sum up the topic as follows:
  • CDI takes a lot of good ideas from Spring Framework, that's for sure.
  • Anything you can do with CDI (I used JBoss Weld in the course I followed) is not new, and you can have all of it in Spring Framework as well.
  • In CDI 1.0 (Java EE 6) there are still a lot of things which are not supported natively, such as "scanning of classes and packages" to turn on dependency injection. Sure, you can use CDI portable extensions to add such functionality, but the problem with those extensions is just like Eclipse plugins: you never know whether all the plugins will work smoothly together if they are not written by one person or one vendor.
In addition to the articles and blogs above, I found the following extensions for Spring Framework which support these CDI mechanisms:
(1) Events with @Observes: take a look at this Open Source project which extends Spring capability: spring-event-annotations.

(2) Decorator with @Decorator and @Delegate: take a look at this blog and Open Source project: JSR-299 CDI Decorators for Spring beans.

(3) Interceptor with @Interceptor, @InterceptorBinding, @AroundInvoke and @Nonbinding: take a look at this article: JSR-299 CDI Interceptors for Spring Beans.

(4) Custom bean scope: take a look at this article: Custom scopes in CDI 1.0 and Spring 3.1.

(5) Type-safe injection: take a look at this blog: Type-safe dependency injection in Spring 3 IoC-Container.

In the end I have to admit that you won't miss anything if you are using Spring Framework. It is definitely more mature than any other context and dependency injection container available on the market. So if you ask me whether it is worthwhile to switch from Spring Framework Core to a CDI implementation like JBoss Weld, I would say: no. There are also no economic reasons to switch to a CDI implementation if you already use Spring Framework. Try explaining to your management that you want to switch to CDI just because it is a standard, with no other advantages, less functionality and less maturity. You should surely use the standard annotations within Spring Framework, though: for example @Inject instead of @Autowired. In some ways I see Spring Framework Core as "the" reference implementation of the CDI specification.

Cheers,
Lofi.

Tuesday, June 18, 2013

Creating a Spring Bean Dynamically at Runtime

In my training someone asked me whether it is possible to create an object (a Spring bean) dynamically, so that you can choose at runtime which implementation you want. At compile time you don't yet know which object should be created; the application should decide, based on a property file.

1. We create an annotation to mark the methods which should create their objects dynamically:

...
package your.package;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface InjectDynamicObject {
}
...

2. Use the newly created annotation on the method which should create its object dynamically:

...
@Named("customerBo")
public class CustomerBoImpl implements CustomerBo {
    ...
    @Override
    @InjectDynamicObject
    public Customer getDynamicCustomer() {
        return this.dynamicCustomer;
    }
    ...

3. Write an aspect with a pointcut and an advice which replaces the object returned by the method from step 2:

...
@Named
@Aspect
public class DynamicObjectAspect {

    // This comes from the property file
    @Value("#{objects.object}")
    private String object;

    @Inject
    private ApplicationContext applicationContext;

    // Matches all methods annotated with @InjectDynamicObject
    @Pointcut("execution(@your.package.InjectDynamicObject * *(..))")
    public void beanAnnotatedWithInjectDynamicObject() {
    }

    @Around("beanAnnotatedWithInjectDynamicObject()")
    public Object adviceBeanAnnotatedWithInjectDynamicObject(
            ProceedingJoinPoint pjp) throws Throwable {
        // Execute the method, then replace its result
        pjp.proceed();
        // Create the bean or object depending on the property file
        Object createdObject = applicationContext.getBean(object);
        return createdObject;
    }
}
...

4. Write a unit test for the method:

...
@Test
public void testCustomerOnlineOrOffline() {
    // Dynamic object creation
    System.out.println("DYNAMIC CUSTOMER: "
            + customerBo.getDynamicCustomer().getName());
}
...

OK, there is another, easier way to do this ;-) without aspects and AspectJ, just pure Spring:

Just inject all your component implementations into a Map and get the correct implementation out of it, just like what we have done in the eXTra Client application. Please take a look at our implementation of PluginsLocatorManager as an example: http://goo.gl/itpcb. Spring injects the Map with the bean name as the String key and the bean itself as the value, automagically: "... Even typed Maps can be autowired as long as the expected key type is String. The Map values will contain all beans of the expected type, and the keys will contain the corresponding bean names" (see the Spring documentation for details).

...
@Named("customerBo")
public class CustomerBoImpl implements CustomerBo {
    ...
    // We inject the customer implementations into a Map,
    // keyed by their bean names
    @Inject
    private Map<String, Customer> customerDynamicMap;

    // This comes from the property file as a key for the Map
    @Value("#{objects.object}")
    private String object;

    @Override
    public Customer getDynamicCustomer() {
        return this.customerDynamicMap.get(object);
    }
    ...
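Stripped of Spring, the lookup itself is nothing more than a map keyed by a configuration value. A self-contained sketch of the same pattern (class and key names are illustrative, not the eXTra Client code):

```java
import java.util.Map;

public class DynamicLookup {
    interface Customer { String getName(); }

    // In Spring this Map<String, Customer> would be injected automatically,
    // keyed by bean name; here we wire it by hand to show the lookup itself.
    static final Map<String, Customer> CUSTOMERS = Map.of(
            "onlineCustomer", () -> "online",
            "offlineCustomer", () -> "offline");

    // The key would come from a property file, e.g. via @Value("#{objects.object}")
    static String dynamicCustomerName(String key) {
        return CUSTOMERS.get(key).getName();
    }

    public static void main(String[] args) {
        System.out.println(dynamicCustomerName("onlineCustomer")); // online
    }
}
```

Changing the property file then switches the implementation without touching any code, which is exactly what the original question asked for.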

Have fun!
Lofi

Thursday, April 25, 2013

Software Development Macro and Micro Process

If you think that in the year 2012 every company which produces software and every IT division in the world already had an optimized software development process, you are wrong. It seems that we - software architects, software developers or whatever your title is - still need to optimize the software development process in many software companies and IT divisions.

So what do you do if you enter a software company or IT division and you see the following things:

1. There is a perfect project management process to handle all the software development, but it is pure project management with no connection to software development itself: basically you only take care of cost, time, budget and quality factors. In software development you still use the old-fashioned waterfall process.

2. From the tooling point of view, you have a project management planning and controlling tool, but you are still at the beginning with a wiki (almost no collaboration tools), and you don't use an issue tracking system to handle the issues for the development of your software components and applications. You use Word and Excel to define your requirements, and you cannot transform them into your software products since you don't have any issue tracking system. There is no chance of traceability from your requirements down to the issues to be done in your software components and applications.

3. Maven is already used, but with a lot of customization and not intuitively. The idea of depending on concrete, already released versions is not implemented; instead you always open all the dependent projects in Eclipse. You can imagine how slowly Eclipse works when you need to open a lot of projects at once, although you are only working on one of them. Versioning in Maven is also not used correctly, e.g. no SNAPSHOT for development versions.

4. Since you work on webapps, you always need to redeploy to the application server; there is no way to hot deploy. Pressing Ctrl-S, seeing your changes and continuing to work without a new deployment is just a dream.

Luckily, as experienced software architects and developers we know that we can optimize the two main software development processes:

1. Software Development Macro Process (SDMaP): the overall software development lifecycle. In this process model we define our requirements; we execute analysis, design, implementation and test; and we deploy the software into production. The waterfall model and iterative or agile models like RUP and Scrum are examples of an SDMaP.

2. Software Development Micro Process (SDMiP): the daily work of a software developer, i.e. how a developer actually builds the software. A developer codes, refactors, compiles, tests, runs, debugs, packages and deploys the software.

The picture below shows the SDMaP and SDMiP in combination. The macro (SDMaP) and micro (SDMiP) processes meet at the implementation phase and activity, so changing and optimizing one definitely has side effects on the other, and vice versa.


In the organization from the example above it is important to optimize both processes, since they work hand in hand. So what can the optimization of the macro and micro processes look like?

1. SDMaP:
  • Introduce a wiki for the IT division or software company. You can use WikIT42 to structure your wiki and Confluence as your wiki platform.
  • Introduce an issue tracking system like JIRA, integrate it with the wiki, and use the combination to track your requirements.
  • Refine the requirements into issues (features, tasks, bugs, etc.) down to the level of the software components and applications, because in the end you will implement all the requirements in your software components and applications.
  • Introduce an iterative software development lifecycle instead of the waterfall process. This is a long way to go, since you need to change the culture of the company and you need full support from your management.
2. SDMiP
  • Update the Maven projects to use the standard Maven mechanisms and best practices, with no exceptions. Transform the old Maven structure into the standard one using frameworks like MoveToMaven.
  • Use the Maven release plugin to standardize the release mechanism of all Maven projects.
  • Use the m2e Eclipse plugin to optimize your daily work as a software developer with Eclipse and Maven.
  • Use Mylyn to integrate an issue tracking system like JIRA into your Eclipse IDE.
  • Introduce JRebel to be able to hot deploy your webapps quickly into the application server.
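To sketch the second SDMiP point: the Maven release plugin standardizes a release in two steps. The commands below use the real `release:prepare` and `release:perform` goals; the version numbers are only illustrative.

```
# Tag the release, change 1.3.0-SNAPSHOT to 1.3.0,
# and set the next development version
mvn release:prepare -DreleaseVersion=1.3.0 -DdevelopmentVersion=1.4.0-SNAPSHOT

# Check out the tag, then build and deploy the released artifacts
mvn release:perform
```

Because every project releases the same way, nobody has to remember a project-specific release procedure anymore.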
Optimizing the macro and micro processes for software development is not an easy task. In the macro process you need to handle the relationships with other divisions such as Business Requirements, Quality Assurance and Project Management. You need to convince them that your SDMaP optimization is the best way to go. This is more of an organizational challenge, and involves more change, than the micro process optimization.

The micro process is also not easy to optimize, since you need to convince all developers that they can be more productive with the new way of working than before. You need to show them that it is a lot faster if you don't open lots of Java projects in your Eclipse workspace, and that using JRebel to deploy your webapp to the application server is the best way to go. Developers are normally technically oriented, so if you can show them the cool things, they will join you.