The time has come to move on from Blogger. As times change, we change along with them. Hopefully the journey ahead will be as fulfilling as it has been with Blogger.
You have been good to me, dear Blogger, but it's time to try something new. Who knows, I might be back here if I ever break up with my newfound blog...
Until then good friend. KIT @ https://medium.com/@dinuka.roshan
My Journey Through IT
"In today's environment, hoarding knowledge ultimately erodes your power."
Tuesday, October 2, 2018
Moving through the spectrum
Oftentimes, introspection is the best way to assess how you are progressing in your career. It is even more important when you are in an industry as volatile and ever-changing as information technology.
The way I see it, we move between different stages along a career spectrum. I remember when I started off, it was all about learning the latest and greatest technology and how I could integrate it into the project I was working on.
As time went by and the grey hairs started to appear, it became more about finding a balance between using proven, stable technology and still assessing and phasing in the latest and greatest.
I tend to question things as I move along in my career, and one question was whether, at the end of the day, it is all about the technology. My answer is that it is not. Learning and incorporating new tech is just one part of the job.
Developing your people and soft skills is as important as keeping up with technology trends. "Jack of all trades" is a nice adage that explains it succinctly. This does not mean that experts in specific fields are not important; it is all about finding the balance and moving out of your comfort zone if you wish to.
People skills usually entail being empathetic and supportive of your team members and helping out where possible. No one likes heroes unless they come with a cape and cool superpowers. You either win as a team or lose as a team, and those are valuable lessons to be learned. My father once taught me a nice phrase which goes, "Be careful whose toes you step on on your way up, because you do not know whom you will meet on your way down". For me, that just says that everyone should be respected equally, from the janitor to the CEO. You may be at the highest point of your life right now, but as we know, the only constant in life is change, and everything can change in a heartbeat.
Soft skills become even more important as you progress in your career, where it's not about talking technology lingo but about getting your ideas across to business people in a language they understand. This is not something that is inculcated in our industry in general. Let's be honest, most of us used to, or still do, hate talking to business people. At the end of the day, even the most cutting-edge technology does not amount to much if no one is buying it. Soft skills are not only about presentations and speeches, but also about knowing how a business operates. No, we do not need an MBA, yet it's good to know what a balance sheet is, how budgets are made, and what cost centres and profit centres are, so that you understand the business lingo when you need to convey your ideas.
So keep moving along the spectrum of your career and make adjustments as you see fit, according to what feels right for you.
Monday, August 13, 2018
Journey through the Amazon (Not the rainforest)
So I started my AWS journey two weeks back. After procrastinating for a few weeks, I decided to bite the bullet and book an exam so that I would get myself to study for it. Looking at the certifications, I thought the AWS Certified Solutions Architect - Associate was a good choice to start off with. At the time of writing, the latest exam was the one updated in February 2018.
I passed the exam last Saturday (11th August 2018) with a score of 839 out of a possible 1000. The exam was 130 minutes in duration, during which you had to answer 65 questions.
A big shout-out to acloudguru, whose comprehensive course I purchased on Udemy for a very economical price as there was a sale on Udemy that day. Their online course was updated for the 2018 exam, which helped a lot.
The online course was well paced and structured in a logical manner, which helped as I went along. One point to mention: when Ryan (the instructor) tells you to read the FAQs in certain areas, take it very seriously. I cannot stress the importance of this enough. There were so many questions that came up which I could answer only because I had read the FAQs on certain areas (S3, EC2, DynamoDB etc.). I made a few flash cards as I went along, which helped me summarise things to study close to the exam. Topics like S3, EC2 and VPC are of utmost importance, as a majority of the questions come from these categories. Also, never skip the labs; do them along with the instructor because, as the old adage goes:
I read and I forgot, I saw and I believed, I did and I understood
One thing a friend of mine pointed out to me (thank you Udo) was that if you purchase the acloudguru course from Udemy, you can import it into the acloudguru platform by verifying your purchase. I would suggest you do this too, since some of the updated lectures appear on their own platform and it had better streaming than Udemy, which sometimes would not even load for me. That Lambda they use really works wonders :D
I prepared for approximately two weeks for the exam, although I have to mention that I had worked with AWS before this as a developer, mostly related to EC2. Four days before the exam, I purchased a mock exam from AWS just to gauge whether I was ready. This had 30 exam questions, and I must say it was a well-spent purchase, since some questions on the day of the exam were quite similar. One problem with the mock exam, though, is that they do not tell you which questions you got wrong; the results are just broken down into categories with a percentage so you know which areas to focus on more.
In the two days before the exam, I went through the flash cards I had prepared every night and worked on the labs related to EC2, S3 and VPC. VPC especially is very important, since the exam sometimes tries to confuse you with things like security groups and network access control lists.
As for the exam itself, although I cannot discuss the questions that came up as everyone signs an NDA before starting the exam, one thing I can say is that some questions are tricky, so you need to make sure to read the main keywords mentioned in the question. Words like "availability" and "cost-effective" are important, since the answer you select will change depending on whether you need cost effectiveness, high availability, or both in some instances. 130 minutes is more than enough for the exam; I was done with 60 minutes to spare.
As a final note, when you answer questions, I would suggest not overthinking them. Sometimes the most obvious answer is the right answer, so don't look for a more complicated solution.
If anyone needs any help, I would be glad to provide any assistance required. For anyone taking the exam, all the very best; I am sure you will knock it out of the park.
Next up for me is the Developer Associate 2018 exam before heading on to the Professional exam.
Thanks for reading and have a pleasant day everyone.
Monday, July 31, 2017
Using Quartz for scheduling with MongoDB
I am sure most of us have used the Quartz library to handle scheduled activity within our projects. Although I have interacted with the library quite often in the past, this was the first time I had to use Quartz with MongoDB.
By default, Quartz only provides support for traditional relational databases. Browsing around, I stumbled upon this GitHub repository by Michael Klishin, which provides a MongoDB implementation of the Quartz job store that works in a clustered environment.
We will be using a Spring Boot application to show how we can integrate the Quartz library for scheduling in a clustered environment using MongoDB.
The GitHub repository with the code shown in this article can be found here.
All Quartz-related configuration is stored in a properties file. The attributes we will be using are as follows;
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Quartz Job Scheduling
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~

# Use the MongoDB store
org.quartz.jobStore.class=com.quartz.mongo.intro.quartzintro.scheduler.CustomMongoQuartzSchedulerJobStore

# --- Note that all the mongo db configuration is set in the CustomMongoQuartzSchedulerJobStore.java class ---

# MongoDB URI (optional if 'org.quartz.jobStore.addresses' is set)
#org.quartz.jobStore.mongoUri=mongodb://localhost:27017

# Comma separated list of mongodb hosts/replica set seeds (optional if 'org.quartz.jobStore.mongoUri' is set)
#org.quartz.jobStore.addresses=localhost

# Will be used to create collections like quartz_jobs, quartz_triggers, quartz_calendars, quartz_locks
org.quartz.jobStore.collectionPrefix=quartz_

# Thread count setting is ignored by the MongoDB store but Quartz requires it
org.quartz.threadPool.threadCount=1

# Skip running a web request to determine if there is an updated version of Quartz available for download
org.quartz.scheduler.skipUpdateCheck=true

org.quartz.jobStore.isClustered=true

# The instance ID will be auto generated by Quartz for all nodes running in a cluster.
org.quartz.scheduler.instanceId=AUTO
org.quartz.scheduler.instanceName=quartzMongoInstance
Let us look at some of these properties. Others are self-explanatory with the comments provided.
- org.quartz.jobStore.class : This defines the job store class which will handle storing the job related details in the database. By default, with the GitHub project mentioned before, we are provided with the MongoDBJobStore. For the purposes of this article however, we will extend the functionality provided by this class with our own implementation which will handle the MongoDB configuration based on Spring profiles.
- org.quartz.jobStore.mongoUri : You would define the comma-separated MongoDB URIs here if you wanted to use the default MongoDBJobStore class. In this implementation, however, since we are defining a custom job store, we will not be using this property. An example of how you would define this would be mongodb://<ip1>:<port>,<ip2>:<port>
- org.quartz.jobStore.collectionPrefix : This property defines the prefix for the collections created for the purposes of storing quartz specific details.
Let us first see what our JobStore configuration class looks like;
package com.quartz.mongo.intro.quartzintro.scheduler;

import org.apache.commons.lang3.StringUtils;
import org.quartz.impl.StdSchedulerFactory;
import org.springframework.beans.factory.config.YamlPropertiesFactoryBean;
import org.springframework.core.io.ClassPathResource;

import com.novemberain.quartz.mongodb.MongoDBJobStore;
import com.quartz.mongo.intro.quartzintro.constants.SchedulerConstants;
import com.quartz.mongo.intro.quartzintro.constants.SystemProperties;

/**
 * <p>
 * We extend the {@link MongoDBJobStore} because we need to set the custom
 * mongo db parameters. Some of the configuration comes from system properties
 * set via docker and the others come via the application.yml files we have for
 * each environment.
 * </p>
 *
 * <p>
 * These are set as part of initialization. This class is initialized by
 * {@link StdSchedulerFactory} and defined in the quartz.properties file.
 * </p>
 *
 * @author dinuka
 */
public class CustomMongoQuartzSchedulerJobStore extends MongoDBJobStore {

    private static String mongoAddresses;
    private static String userName;
    private static String password;
    private static String dbName;
    private static boolean isSSLEnabled;
    private static boolean isSSLInvalidHostnameAllowed;

    public CustomMongoQuartzSchedulerJobStore() {
        super();
        initializeMongo();
        setMongoUri("mongodb://" + mongoAddresses);
        setUsername(userName);
        setPassword(password);
        setDbName(dbName);
        setMongoOptionEnableSSL(isSSLEnabled);
        setMongoOptionSslInvalidHostNameAllowed(isSSLInvalidHostnameAllowed);
    }

    /**
     * <p>
     * This method will initialize the mongo instance required by the Quartz
     * scheduler. The use case here is that we have two profiles;
     * </p>
     *
     * <ul>
     * <li>Development</li>
     * <li>Production</li>
     * </ul>
     *
     * <p>
     * So when constructing the mongo instance to be used for the Quartz
     * scheduler, we need to read the various properties set within the system
     * to determine which would be appropriate depending on which spring
     * profile is active.
     * </p>
     */
    private static void initializeMongo() {
        /**
         * When we run our application, the property spring.profiles.active is
         * set as a system property in production, but it will not be set in a
         * development environment.
         */
        String env = System.getProperty(SystemProperties.ENVIRONMENT);
        env = StringUtils.isNotBlank(env) ? env : "dev";

        YamlPropertiesFactoryBean commonProperties = new YamlPropertiesFactoryBean();
        commonProperties.setResources(new ClassPathResource("application.yml"));

        /**
         * The mongo DB user name and password are only passed as command line
         * parameters in the production environment; in the development
         * environment they will be null, which is why we use
         * StringUtils#trimToEmpty so we can pass empty strings for the user
         * name and password, since we do not have authentication in the
         * development environment.
         */
        userName = StringUtils.trimToEmpty(commonProperties.getObject().getProperty(SystemProperties.SERVER_NAME));
        password = StringUtils.trimToEmpty(System.getProperty(SystemProperties.MONGO_PASSWORD));
        dbName = commonProperties.getObject().getProperty(SchedulerConstants.QUARTZ_SCHEDULER_DB_NAME);

        YamlPropertiesFactoryBean environmentSpecificProperties = new YamlPropertiesFactoryBean();
        switch (env) {
        case "prod":
            environmentSpecificProperties.setResources(new ClassPathResource("application-prod.yml"));
            /**
             * By default, in the production mongo instance, SSL is enabled and
             * the SSL invalid host name allowed property is set.
             */
            isSSLEnabled = true;
            isSSLInvalidHostnameAllowed = true;
            mongoAddresses = environmentSpecificProperties.getObject().getProperty(SystemProperties.MONGO_URI);
            break;
        case "dev":
            /**
             * For the development profile, we just read the mongo URI that is
             * set.
             */
            environmentSpecificProperties.setResources(new ClassPathResource("application-dev.yml"));
            mongoAddresses = environmentSpecificProperties.getObject().getProperty(SystemProperties.MONGO_URI);
            break;
        }
    }
}
In the above implementation, we retrieve the MongoDB details pertaining to the active profile. If no profile is defined, it defaults to the development profile. We use the YamlPropertiesFactoryBean here to read the application properties pertaining to the different environments.
Moving on, we then need to let Spring manage the creation of the Quartz configuration using the SchedulerFactoryBean.
package com.quartz.mongo.intro.quartzintro.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;

/**
 * This class will configure and set up quartz using the
 * {@link SchedulerFactoryBean}
 *
 * @author dinuka
 */
@Configuration
public class QuartzConfiguration {

    /**
     * Here we integrate quartz with Spring and let Spring manage initializing
     * quartz as a spring bean.
     *
     * @return an instance of {@link SchedulerFactoryBean} which will be
     *         managed by spring.
     */
    @Bean
    public SchedulerFactoryBean schedulerFactoryBean() {
        SchedulerFactoryBean scheduler = new SchedulerFactoryBean();
        scheduler.setApplicationContextSchedulerContextKey("applicationContext");
        scheduler.setConfigLocation(new ClassPathResource("quartz.properties"));
        scheduler.setWaitForJobsToCompleteOnShutdown(true);
        return scheduler;
    }
}
We define this as a Configuration class so that it will be picked up when we run the Spring boot application.
The call to the setApplicationContextSchedulerContextKey method here is what allows us to get a reference to the Spring application context within our job class, which is as follows;
package com.quartz.mongo.intro.quartzintro.scheduler.jobs; import org.quartz.DisallowConcurrentExecution; import org.quartz.JobExecutionContext; import org.quartz.JobExecutionException; import org.quartz.PersistJobDataAfterExecution; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.context.ApplicationContext; import org.springframework.core.env.Environment; import org.springframework.scheduling.quartz.QuartzJobBean; import org.springframework.scheduling.quartz.SchedulerFactoryBean; import com.quartz.mongo.intro.quartzintro.config.JobConfiguration; import com.quartz.mongo.intro.quartzintro.config.QuartzConfiguration; /** * * This is the job class that will be triggered based on the job configuration * defined in {@link JobConfiguration} * * @author dinuka * */ @PersistJobDataAfterExecution @DisallowConcurrentExecution public class SampleJob extends QuartzJobBean { private static Logger log = LoggerFactory.getLogger(SampleJob.class); private ApplicationContext applicationContext; /** * This method is called by Spring since we set the * {@link SchedulerFactoryBean#setApplicationContextSchedulerContextKey(String)} * in {@link QuartzConfiguration} * * @param applicationContext */ public void setApplicationContext(ApplicationContext applicationContext) { this.applicationContext = applicationContext; } /** * This is the method that will be executed each time the trigger is fired. */ @Override protected void executeInternal(JobExecutionContext context) throws JobExecutionException { log.info("This is the sample job, executed by {}", applicationContext.getBean(Environment.class)); } }
As you can see, we get a reference to the application context when the SchedulerFactoryBean is initialised. The part of the Spring documentation I would like to draw your attention to is as follows;
In case of a QuartzJobBean, the reference will be applied to the Job instance as bean property. An "applicationContext" attribute will correspond to a "setApplicationContext" method in that scenario.
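On a related note, since the SchedulerFactoryBean also places the application context into the Quartz SchedulerContext under the key we configured ("applicationContext"), a job that does not extend QuartzJobBean could look it up directly instead. The class below is purely my own illustrative sketch and is not part of the repository;

import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.SchedulerException;
import org.springframework.context.ApplicationContext;

// Illustrative sketch (not from the original project): pulls the Spring
// application context straight out of the Quartz SchedulerContext.
public class PlainQuartzJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            // The key must match setApplicationContextSchedulerContextKey("applicationContext")
            ApplicationContext applicationContext =
                    (ApplicationContext) context.getScheduler().getContext().get("applicationContext");
            // Any Spring bean can now be fetched, e.g. applicationContext.getBean(SomeService.class)
        } catch (SchedulerException e) {
            throw new JobExecutionException("Could not obtain the Spring application context", e);
        }
    }
}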
Next up, we configure the job to be run and the frequency with which the scheduled activity should run.
package com.quartz.mongo.intro.quartzintro.config; import static org.quartz.TriggerBuilder.newTrigger; import java.time.LocalDateTime; import java.time.ZoneId; import java.util.Date; import javax.annotation.PostConstruct; import org.quartz.JobDetail; import org.quartz.JobKey; import org.quartz.SimpleScheduleBuilder; import org.quartz.Trigger; import org.quartz.TriggerKey; import org.quartz.impl.JobDetailImpl; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.context.annotation.Configuration; import org.springframework.scheduling.quartz.SchedulerFactoryBean; import com.quartz.mongo.intro.quartzintro.constants.SchedulerConstants; import com.quartz.mongo.intro.quartzintro.scheduler.jobs.SampleJob; /** * * This will configure the job to run within quartz. * * @author dinuka * */ @Configuration public class JobConfiguration { @Autowired private SchedulerFactoryBean schedulerFactoryBean; @PostConstruct private void initialize() throws Exception { schedulerFactoryBean.getScheduler().addJob(sampleJobDetail(), true, true); if (!schedulerFactoryBean.getScheduler().checkExists(new TriggerKey( SchedulerConstants.SAMPLE_JOB_POLLING_TRIGGER_KEY, SchedulerConstants.SAMPLE_JOB_POLLING_GROUP))) { schedulerFactoryBean.getScheduler().scheduleJob(sampleJobTrigger()); } } /** * <p> * The job is configured here where we provide the job class to be run on * each invocation. We give the job a name and a value so that we can * provide the trigger to it on our method {@link #sampleJobTrigger()} * </p> * * @return an instance of {@link JobDetail} */ private static JobDetail sampleJobDetail() { JobDetailImpl jobDetail = new JobDetailImpl(); jobDetail.setKey( new JobKey(SchedulerConstants.SAMPLE_JOB_POLLING_JOB_KEY, SchedulerConstants.SAMPLE_JOB_POLLING_GROUP)); jobDetail.setJobClass(SampleJob.class); jobDetail.setDurability(true); return jobDetail; } /** * <p> * This method will define the frequency with which we will be running the * scheduled job which in this instance is every minute three seconds after * the start up. * </p> * * @return an instance of {@link Trigger} */ private static Trigger sampleJobTrigger() { return newTrigger().forJob(sampleJobDetail()) .withIdentity(SchedulerConstants.SAMPLE_JOB_POLLING_TRIGGER_KEY, SchedulerConstants.SAMPLE_JOB_POLLING_GROUP) .withPriority(50).withSchedule(SimpleScheduleBuilder.repeatMinutelyForever()) .startAt(Date.from(LocalDateTime.now().plusSeconds(3).atZone(ZoneId.systemDefault()).toInstant())) .build(); } }
There are many ways you can configure your scheduler, including cron configuration. For the purposes of this article, we will define a simple trigger that runs every minute, starting three seconds after start-up. We define this as a Configuration class so that it will be picked up when we run the Spring Boot application.
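If you would rather drive the schedule with a cron expression, a trigger along the following lines could be returned in place of sampleJobTrigger(). This is only a sketch of the idea; the class name, key names and cron expression are my own assumptions rather than code from the repository;

import static org.quartz.TriggerBuilder.newTrigger;

import org.quartz.CronScheduleBuilder;
import org.quartz.Trigger;

// Sketch of a cron-based alternative: fires at second 0 of every minute
// instead of "every minute, three seconds after start-up".
public class CronTriggerSketch {

    static Trigger sampleJobCronTrigger() {
        return newTrigger()
                .forJob("sampleJobPollingJob", "sampleJobPollingGroup")        // assumed job key/group names
                .withIdentity("sampleJobPollingCronTrigger", "sampleJobPollingGroup")
                .withSchedule(CronScheduleBuilder.cronSchedule("0 * * * * ?")) // Quartz cron: sec min hr dom mon dow
                .build();
    }
}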
That is about it. When you now run the Spring Boot application class found in the GitHub repository with a running MongoDB instance, you will see the following collections created;
- quartz_calendars
- quartz_jobs
- quartz_locks
- quartz_schedulers
- quartz_triggers
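If you would like to verify those collections programmatically instead of eyeballing them in a Mongo client, a quick check along these lines will do. This is my own sketch; it assumes a local, unauthenticated MongoDB instance, and the database name should be replaced with the one configured for the project;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

// Illustrative sketch: list the collections Quartz created with the quartz_ prefix.
public class QuartzCollectionCheck {

    public static void main(String[] args) {
        MongoClient client = new MongoClient("localhost", 27017); // assumes a local, unauthenticated instance
        try {
            MongoDatabase db = client.getDatabase("quartz");      // database name is an assumption
            for (String name : db.listCollectionNames()) {
                if (name.startsWith("quartz_")) {
                    System.out.println(name);
                }
            }
        } finally {
            client.close();
        }
    }
}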
Thank you for reading. If you have any comments, improvements or suggestions, do kindly leave a comment, which is always appreciated.
Friday, July 28, 2017
Spring Boot with the Justice League
Dark times are ahead for the Justice League with the formidable Darkseid coming over to conquer humankind. Batman, with the help of Wonder Woman, is on a quest to get the league together, with one critical aspect missing: a proper Justice League member management system. As time is not on their side, they do not want to go through the cumbersome process of setting up a project from scratch with all the things they need. Batman hands over this daunting task of building a rapid system to his beloved, trusted Alfred (as Robin is so unpredictable), who tells Batman that he recalls coming across something called Spring Boot, which helps set up everything you need so you can get to writing code for your application rather than being bogged down with the minor nuances of setting up configuration for your project. And so he gets into it. Let's get to it with our beloved Alfred, who will utilize Spring Boot to build a Justice League member management system in no time. Well, at least the back-end part for now, since Batman likes dealing directly with the REST APIs.
There are many convenient ways of setting up a Spring Boot application. For this article, we will focus on the traditional way of downloading the package (Spring CLI) and setting it up from scratch on Ubuntu. Spring also supports getting a project packaged online via their tool. You can download the latest stable release from here. For this post, I am using the 1.3.0.M1 release.
After extracting your downloaded archive, first off, set the following parameters on your profile;
SPRING_BOOT_HOME=<extracted path>/spring-1.3.0.M1
PATH=$SPRING_BOOT_HOME/bin:$PATH
Afterwards in your "bashrc" file, include the following;
. <extracted-path>/spring-1.3.0.M1/shell-completion/bash/spring
What that last line does is give you auto-completion on the command line when you are dealing with the spring-cli to create your Spring Boot applications. Please remember to "source" both the profile and the "bashrc" files for the changes to take effect.
The technology stack used in this article is as follows;
- Spring REST
- Spring Data
- MongoDB
To generate the project skeleton with these dependencies, we run the following spring-cli command;
spring init -dweb,data-mongodb,flapdoodle-mongo --groupId com.justiceleague --artifactId justiceleaguemodule --build maven justiceleaguesystem
This will generate a Maven project with Spring MVC and Spring Data with an embedded MongoDB.
By default, the spring-cli creates a project with the name set as "Demo", so we will need to rename the respective application class that is generated. If you check out the source from my GitHub repository mentioned above, this has already been done.
With Spring Boot, running the application is as easy as running the jar file created by the project, which essentially calls the application class annotated with @SpringBootApplication and boots up Spring. Let us see what that looks like;
package com.justiceleague.justiceleaguemodule;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

/**
 * The main spring boot application which will start up a web container and
 * wire up all the required beans.
 *
 * @author dinuka
 */
@SpringBootApplication
public class JusticeLeagueManagementApplication {

    public static void main(String[] args) {
        SpringApplication.run(JusticeLeagueManagementApplication.class, args);
    }
}
We then move on to our domain classes, where we use Spring Data along with MongoDB to define our data layer. The domain class is as follows;
package com.justiceleague.justiceleaguemodule.domain;

import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

/**
 * This class holds the details that will be stored about the justice league
 * members on MongoDB.
 *
 * @author dinuka
 */
@Document(collection = "justiceLeagueMembers")
public class JusticeLeagueMemberDetail {

    @Id
    private ObjectId id;

    @Indexed
    private String name;

    private String superPower;

    private String location;

    public JusticeLeagueMemberDetail(String name, String superPower, String location) {
        this.name = name;
        this.superPower = superPower;
        this.location = location;
    }

    public String getId() {
        return id.toString();
    }

    public void setId(String id) {
        this.id = new ObjectId(id);
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getSuperPower() {
        return superPower;
    }

    public void setSuperPower(String superPower) {
        this.superPower = superPower;
    }

    public String getLocation() {
        return location;
    }

    public void setLocation(String location) {
        this.location = location;
    }
}

As we are using Spring Data, it is fairly intuitive, especially if you are coming from a JPA/Hibernate background. The annotations are very similar. The only new thing would be the @Document annotation, which denotes the name of the collection in our MongoDB database. We also have an index defined on the name of the super hero, since most queries will revolve around searching by name.
Spring Data brought the ability to easily define repositories that support the usual CRUD operations, and some read operations, straight out of the box without you having to write them. So we utilise the power of Spring Data repositories in our application as well, and the repository class is as follows;
package com.justiceleague.justiceleaguemodule.dao;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

import com.justiceleague.justiceleaguemodule.domain.JusticeLeagueMemberDetail;

public interface JusticeLeagueRepository extends MongoRepository<JusticeLeagueMemberDetail, String> {

    /**
     * This method will retrieve the justice league member details pertaining
     * to the name passed in.
     *
     * @param superHeroName
     *            the name of the justice league member to search and retrieve.
     * @return an instance of {@link JusticeLeagueMemberDetail} with the member
     *         details.
     */
    @Query("{ 'name' : {$regex: ?0, $options: 'i' }}")
    JusticeLeagueMemberDetail findBySuperHeroName(final String superHeroName);
}
The usual saving operations are implemented by Spring at runtime through the use of proxies and we just have to define our domain class in our repository.
As you can see, we have only one method defined. With the @Query annotation, we are trying to find a super hero with the use of regular expressions. The option "i" denotes that we should ignore case when trying to find a match in MongoDB.
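To make the repository's behaviour concrete, here is a small usage sketch of my own (not part of the project) that exercises both an inherited MongoRepository method and our case-insensitive finder; the lookup matches the member saved as "Barry Allen" even though the search term is all lower case;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

import com.justiceleague.justiceleaguemodule.dao.JusticeLeagueRepository;
import com.justiceleague.justiceleaguemodule.domain.JusticeLeagueMemberDetail;

// Illustrative sketch: a throwaway runner that exercises the repository once
// the application has started. Not part of the original project.
@Component
public class RepositorySmokeTest implements CommandLineRunner {

    @Autowired
    private JusticeLeagueRepository justiceLeagueRepository;

    @Override
    public void run(String... args) {
        // insert(), findAll(), count() etc. are inherited from MongoRepository.
        justiceLeagueRepository.insert(
                new JusticeLeagueMemberDetail("Barry Allen", "super speed", "Central City"));

        // The custom finder matches despite the different case thanks to the 'i' option.
        JusticeLeagueMemberDetail flash = justiceLeagueRepository.findBySuperHeroName("barry allen");
        System.out.println("Found member: " + flash.getName());
    }
}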
Next up, we move on to implementing the logic for storing new Justice League members in our service layer.
package com.justiceleague.justiceleaguemodule.service.impl; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Service; import com.justiceleague.justiceleaguemodule.constants.MessageConstants.ErrorMessages; import com.justiceleague.justiceleaguemodule.dao.JusticeLeagueRepository; import com.justiceleague.justiceleaguemodule.domain.JusticeLeagueMemberDetail; import com.justiceleague.justiceleaguemodule.exception.JusticeLeagueManagementException; import com.justiceleague.justiceleaguemodule.service.JusticeLeagueMemberService; import com.justiceleague.justiceleaguemodule.web.dto.JusticeLeagueMemberDTO; import com.justiceleague.justiceleaguemodule.web.transformer.DTOToDomainTransformer; /** * This service class implements the {@link JusticeLeagueMemberService} to * provide the functionality required for the justice league system. * * @author dinuka * */ @Service public class JusticeLeagueMemberServiceImpl implements JusticeLeagueMemberService { @Autowired private JusticeLeagueRepository justiceLeagueRepo; /** * {@inheritDoc} */ public void addMember(JusticeLeagueMemberDTO justiceLeagueMember) { JusticeLeagueMemberDetail dbMember = justiceLeagueRepo.findBySuperHeroName(justiceLeagueMember.getName()); if (dbMember != null) { throw new JusticeLeagueManagementException(ErrorMessages.MEMBER_ALREDY_EXISTS); } JusticeLeagueMemberDetail memberToPersist = DTOToDomainTransformer.transform(justiceLeagueMember); justiceLeagueRepo.insert(memberToPersist); } }
Again, quite trivial: if the member already exists, we throw an error; otherwise we add the member. Here you can see we are using the already implemented insert method of the Spring Data repository we defined before.
Finally Alfred is ready to expose the new functionality he just developed via a REST API using Spring REST so that Batman can start sending in the details over HTTP as he is always travelling.
package com.justiceleague.justiceleaguemodule.web.rest.controller; import javax.validation.Valid; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.HttpStatus; import org.springframework.http.MediaType; import org.springframework.web.bind.annotation.RequestBody; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RequestMethod; import org.springframework.web.bind.annotation.ResponseBody; import org.springframework.web.bind.annotation.ResponseStatus; import org.springframework.web.bind.annotation.RestController; import com.justiceleague.justiceleaguemodule.constants.MessageConstants; import com.justiceleague.justiceleaguemodule.service.JusticeLeagueMemberService; import com.justiceleague.justiceleaguemodule.web.dto.JusticeLeagueMemberDTO; import com.justiceleague.justiceleaguemodule.web.dto.ResponseDTO; /** * This class exposes the REST API for the system. * * @author dinuka * */ @RestController @RequestMapping("/justiceleague") public class JusticeLeagueManagementController { @Autowired private JusticeLeagueMemberService memberService; /** * This method will be used to add justice league members to the system. * * @param justiceLeagueMember * the justice league member to add. * @return an instance of {@link ResponseDTO} which will notify whether * adding the member was successful. */ @ResponseBody @ResponseStatus(value = HttpStatus.CREATED) @RequestMapping(method = RequestMethod.POST, path = "/addMember", produces = { MediaType.APPLICATION_JSON_VALUE }, consumes = { MediaType.APPLICATION_JSON_VALUE }) public ResponseDTO addJusticeLeagueMember(@Valid @RequestBody JusticeLeagueMemberDTO justiceLeagueMember) { ResponseDTO responseDTO = new ResponseDTO(ResponseDTO.Status.SUCCESS, MessageConstants.MEMBER_ADDED_SUCCESSFULLY); try { memberService.addMember(justiceLeagueMember); } catch (Exception e) { responseDTO.setStatus(ResponseDTO.Status.FAIL); responseDTO.setMessage(e.getMessage()); } return responseDTO; } }
We expose our functionality as a JSON payload, as Batman just cannot get enough of it, although Alfred is a bit old school and prefers XML at times.
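To make the payload shape concrete, the little sketch below (my own illustration, not part of the project) serialises a member DTO with Jackson; the printed JSON is roughly what Batman would POST to /justiceleague/addMember, assuming the DTO fields mirror the domain class shown earlier;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.justiceleague.justiceleaguemodule.web.dto.JusticeLeagueMemberDTO;

// Illustrative sketch of the request body sent to POST /justiceleague/addMember.
public class PayloadExample {

    public static void main(String[] args) throws Exception {
        JusticeLeagueMemberDTO flash =
                new JusticeLeagueMemberDTO("Barry Allen", "super speed", "Central City");

        // Prints something along the lines of:
        // {"name":"Barry Allen","superPower":"super speed","location":"Central City"}
        // (field names assumed to mirror the domain class shown earlier)
        System.out.println(new ObjectMapper().writeValueAsString(flash));
    }
}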
The old guy Alfred still wants to test out his functionality, as TDD is just his style. So finally we look at the integration tests written up by Alfred to make sure the initial version of the Justice League management system is working as expected. Note that we are only showing the REST API tests here, although Alfred has actually covered more, which you can check out on the GitHub repo.
package com.justiceleague.justiceleaguemodule.test.util; import java.io.IOException; import java.net.UnknownHostException; import org.junit.After; import org.junit.AfterClass; import org.junit.Before; import org.junit.BeforeClass; import org.junit.runner.RunWith; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc; import org.springframework.boot.test.context.SpringBootTest; import org.springframework.data.mongodb.core.MongoTemplate; import org.springframework.test.context.junit4.SpringRunner; import org.springframework.test.web.servlet.MockMvc; import com.fasterxml.jackson.databind.ObjectMapper; import com.justiceleague.justiceleaguemodule.domain.JusticeLeagueMemberDetail; import de.flapdoodle.embed.mongo.MongodExecutable; import de.flapdoodle.embed.mongo.MongodStarter; import de.flapdoodle.embed.mongo.config.IMongodConfig; import de.flapdoodle.embed.mongo.config.MongodConfigBuilder; import de.flapdoodle.embed.mongo.config.Net; import de.flapdoodle.embed.mongo.distribution.Version; /** * This class will have functionality required when running integration tests so * that invidivual classes do not need to implement the same functionality. * * @author dinuka * */ @RunWith(SpringRunner.class) @SpringBootTest @AutoConfigureMockMvc public abstract class BaseIntegrationTest { @Autowired protected MockMvc mockMvc; protected ObjectMapper mapper; private static MongodExecutable mongodExecutable; @Autowired protected MongoTemplate mongoTemplate; @Before public void setUp() { mapper = new ObjectMapper(); } @After public void after() { mongoTemplate.dropCollection(JusticeLeagueMemberDetail.class); } /** * Here we are setting up an embedded mongodb instance to run with our * integration tests. * * @throws UnknownHostException * @throws IOException */ @BeforeClass public static void beforeClass() throws UnknownHostException, IOException { MongodStarter starter = MongodStarter.getDefaultInstance(); IMongodConfig mongoConfig = new MongodConfigBuilder().version(Version.Main.PRODUCTION) .net(new Net(27017, false)).build(); mongodExecutable = starter.prepare(mongoConfig); try { mongodExecutable.start(); } catch (Exception e) { closeMongoExecutable(); } } @AfterClass public static void afterClass() { closeMongoExecutable(); } private static void closeMongoExecutable() { if (mongodExecutable != null) { mongodExecutable.stop(); } } }
package com.justiceleague.justiceleaguemodule.web.rest.controller; import org.hamcrest.beans.SamePropertyValuesAs; import org.junit.Assert; import org.junit.Test; import org.springframework.http.MediaType; import org.springframework.test.web.servlet.request.MockMvcRequestBuilders; import org.springframework.test.web.servlet.result.MockMvcResultMatchers; import com.justiceleague.justiceleaguemodule.constants.MessageConstants; import com.justiceleague.justiceleaguemodule.constants.MessageConstants.ErrorMessages; import com.justiceleague.justiceleaguemodule.domain.JusticeLeagueMemberDetail; import com.justiceleague.justiceleaguemodule.test.util.BaseIntegrationTest; import com.justiceleague.justiceleaguemodule.web.dto.JusticeLeagueMemberDTO; import com.justiceleague.justiceleaguemodule.web.dto.ResponseDTO; import com.justiceleague.justiceleaguemodule.web.dto.ResponseDTO.Status; /** * This class will test out the REST controller layer implemented by * {@link JusticeLeagueManagementController} * * @author dinuka * */ public class JusticeLeagueManagementControllerTest extends BaseIntegrationTest { /** * This method will test if the justice league member is added successfully * when valid details are passed in. * * @throws Exception */ @Test public void testAddJusticeLeagueMember() throws Exception { JusticeLeagueMemberDTO flash = new JusticeLeagueMemberDTO("Barry Allen", "super speed", "Central City"); String jsonContent = mapper.writeValueAsString(flash); String response = mockMvc .perform(MockMvcRequestBuilders.post("/justiceleague/addMember").accept(MediaType.APPLICATION_JSON) .contentType(MediaType.APPLICATION_JSON).content(jsonContent)) .andExpect(MockMvcResultMatchers.status().isCreated()).andReturn().getResponse().getContentAsString(); ResponseDTO expected = new ResponseDTO(Status.SUCCESS, MessageConstants.MEMBER_ADDED_SUCCESSFULLY); ResponseDTO receivedResponse = mapper.readValue(response, ResponseDTO.class); Assert.assertThat(receivedResponse, SamePropertyValuesAs.samePropertyValuesAs(expected)); } /** * This method will test if an appropriate failure response is given when * the member being added already exists within the system. * * @throws Exception */ @Test public void testAddJusticeLeagueMemberWhenMemberAlreadyExists() throws Exception { JusticeLeagueMemberDetail flashDetail = new JusticeLeagueMemberDetail("Barry Allen", "super speed", "Central City"); mongoTemplate.save(flashDetail); JusticeLeagueMemberDTO flash = new JusticeLeagueMemberDTO("Barry Allen", "super speed", "Central City"); String jsonContent = mapper.writeValueAsString(flash); String response = mockMvc .perform(MockMvcRequestBuilders.post("/justiceleague/addMember").accept(MediaType.APPLICATION_JSON) .contentType(MediaType.APPLICATION_JSON).content(jsonContent)) .andExpect(MockMvcResultMatchers.status().isCreated()).andReturn().getResponse().getContentAsString(); ResponseDTO expected = new ResponseDTO(Status.FAIL, ErrorMessages.MEMBER_ALREDY_EXISTS); ResponseDTO receivedResponse = mapper.readValue(response, ResponseDTO.class); Assert.assertThat(receivedResponse, SamePropertyValuesAs.samePropertyValuesAs(expected)); } /** * This method will test if a valid client error is given if the data * required are not passed within the JSON request payload which in this * case is the super hero name. * * @throws Exception */ @Test public void testAddJusticeLeagueMemberWhenNameNotPassedIn() throws Exception { // The super hero name is passed in as null here to see whether the // validation error handling kicks in. 
JusticeLeagueMemberDTO flash = new JusticeLeagueMemberDTO(null, "super speed", "Central City"); String jsonContent = mapper.writeValueAsString(flash); mockMvc.perform(MockMvcRequestBuilders.post("/justiceleague/addMember").accept(MediaType.APPLICATION_JSON) .contentType(MediaType.APPLICATION_JSON).content(jsonContent)) .andExpect(MockMvcResultMatchers.status().is4xxClientError()); } }
And that is about it. With the power of Spring Boot, Alfred was able to get a bare-minimum Justice League management system with a REST API exposed in no time. We will build upon this application and, in the time to come, see how Alfred gets it deployed via Docker to an Amazon AWS instance managed by Kubernetes. Exciting times ahead, so tune in.
Thursday, January 5, 2017
Bidding Adieu To My South African Family
It is 6 pm on the first day of 2017, and I am here on my laptop writing this one final goodbye letter, with a heavy heart, to one of the most amazing teams I have had the privilege of working with. I am going to take this to a personal level and mention each individual person on the team and how they have impacted my life on a personal and a professional level.
First off, a little bit about the journey to South Africa. It was the year 2014 when I first began working on the MTN South Africa project (MTN is the second largest telecommunications provider in South Africa) as a contractor via CSG (Cable Service Group) International, through my company in Sri Lanka, Virtusa Polaris. It was a challenging few months as we worked tirelessly to get through the knowledge transfer sessions successfully on the systems we were taking over. Some valuable lessons were learned here which I will take forward with me for life.
South Africa then became my second home on the 11th of July 2015 when I finally arrived permanently in South Africa to work on the MTN project. I am not going to bore you with the details of the work as this is not about the work but about the people I am going to be saying goodbye to. So let us take this show on the road shall we?
Starting off with the person who was the reason I got the opportunity to travel to South Africa, Peter Hebden (a.k.a. Pete). Pete is the lead architect for the MTN engagement. I first spoke to Pete during the initiation of the project while I was in Sri Lanka. He and I got off to a great start from that very first call. He is a very gregarious person by nature, which made it an absolute pleasure to work with him. When it comes to work, he is 100% committed, and there is no slipping mediocre work past Pete. He expects a level of quality from his team, and nothing less will get you his approval. This was just fabulous, as I now had someone who was as passionate about quality as I was, which meant I had to be on my "A" game at all times if I were to get his approval on the work we carried out as part of the MTN engagement. Pete comes from a civil engineering background, which I must say kept us on our toes at times. One specific moment I remember like it was yesterday: my team and I were working on the architecture and detailed technical design documents for the systems we were taking over. When the time came to review those documents with Pete, I remember him asking, "Why are those boxes in the diagrams not aligned and of different sizes?". Sadly, we did not have a plausible answer to provide, and we got back to working on those diagrams until they were perfect. At that moment, I realized how important even the most minuscule details are to the overall success of the project. Integrity and honesty are two traits that Pete expects from each member of his team, and he is the kind of person who will go out of his way to help anyone on his team even if the consequences are detrimental to his own career. That, I must say, is the kind of leader I admire and respect. He personally stood up for me when I had a few issues along the way, at moments when I was flustered and down. Although you will not see us giving high fives around the office, I consider Pete to be a very good friend rather than a boss I report to on a daily basis. He was always there for me professionally and personally. Showing me around South Africa, helping me out with apartment hunting, and inviting my wife and me to his house for Christmas are just a few moments I would like to mention. It was not something he had to do, though he was thoughtful enough to do all of those things. For me, Pete is the epitome of a great leader.
Moving on, the next person I know I am going to miss dearly is my partner in crime, 006(long story to this name, which I will skip for now) and my sister from another mother, Nkateko Makhuvele(a.k.a Kat). Kat currently serves as a business analyst for the MTN engagement as part of CSGI. Oh my, where do I even start to describe this beautiful soul. I first met Kat when I arrived in South Africa in July 2015. She and I just hit it off from the first day we ever met. Became even better friends with my wife. A very religious and God fearing person who is always lending a helpful hand to the poor and the needy. My dance partner for our year end functions where we would simply bring the house down ;). You will never see her being acrimonious to anyone even if you caught her on her worst day. Always has a smile on her face and was and still is there for me whenever I needed her. Took me and my wife on our very first game drive in South Africa. I see Kat as a very strong minded person who is independent, career driven, kind hearted and just a pure blessing to this world we live in. I will miss you so dearly Kat, though this is surely not a goodbye as I am sure our paths will cross one day.
Next is my geek counterpart, Yeshkal Nanhoo (a.k.a. Yesh). He serves as a Solutions Architect as part of CSGI for the MTN engagement. When you first see him, the thought that comes to mind is “You should not pick a fight with this guy”. But once you actually get to know him, that impression falls away by itself. He is such a kind and good-hearted guy, and you will never see him being hostile to anyone, no matter how much people get on his nerves. A very calm and collected person. Both of us are DC Comics fanboys, and that was the common ground on which we built our friendship. He is the only other person who loved Batman v Superman: Dawn of Justice as much as I did for the artistic value of that masterpiece. Hacking is his passion, and you can see his eyes light up when he is presented with a new challenge, which most times he would have resolved within a few days. Professionally, he is always approachable. Even if he is inundated with work, he will stop what he is doing and help you out. I will surely miss our fruitful conversations about the multiverse :). Thank you, Yesh, for being such an amazing friend and for lending a helping hand whenever I needed it. I am sure you will do even greater things in the time to come.
Rupin Mehta (a.k.a. Rupz) is the equivalent of the “road runner” :). He serves as a Solutions Architect for CSGI as part of the MTN engagement. An architect by day and a professional marathon runner by, well, early morning. Persistence is something I admire about Rupin. If he sets his heart on something, he will work towards it through every obstacle. Can you imagine that this guy once ran almost 160km? And that while being a father of two daughters. The next time you find a reason to back out of achieving your goals, remember this guy. He is a very calm person by nature, never takes anything said to him personally, and is willing to help anyone who needs his assistance, any time you approach him. It was an absolute pleasure working with you, Rupin, and I am sure you will achieve even bigger and better things in the future.
Gareth Hall, the youngest lad and the only Jew in the team. He serves as a business analyst for CSGI as part of the MTN engagement. I must say this guy is quite sharp and is one of the best performers in the CSGI team, the go-to guy when issues arise. His approach to solving problems is impeccable. Another very gregarious person who is always approachable. Being the youngest in the team is no barrier to him, as he effortlessly leads off-shore teams with amazing results. His passion for learning is a very admirable and commendable quality, and you will see him tirelessly work with people who need his help until the issue is resolved. For his age, what he has achieved is simply amazing, and I am sure, Gareth, that you will reach the pinnacle of your career in the time to come with ease. Oh, and congratulations once again on the engagement; I wish you both a blessed wedded life ahead.
Justin Serra, the football fanatic. He served as the test manager as part of CSGI for the MTN engagement before he left us for greener pastures :). Justin and I became friends right after our very first and last heated argument regarding a change my team had just made. What I love about Justin is his relentless tenacity to always achieve the best. He will never compromise the quality of a deliverable even if the Devil himself ordered it. That, in essence, raised the quality of each deliverable, which in the end pleased the client. He was always there to crack a joke during difficult times to boost the morale of the team. Building relationships was more important to him than simply getting the work done, and that naturally earned him the respect and trust of his team members. Thank you, Justin, for everything, and I wish you nothing but the best in the time to come.
A few others I wanted to mention, but have not described in detail just to maintain the brevity of this goodbye, are Simon Dobbin, Renita Govendar, Chris Wakeman, Tony Ballard, Maesi Mpeko, Tristan Hannaford, Keressa Jeevarathanam, Ridwaan Catterall, Itumeleng Ntshoe, Mawabo Nkewana and Hugo Meyer. Thank you, all of you, for the immense support and guidance you provided during my stay here in South Africa.
As I leave this wonderful set of people, I will always cherish the amazing moments we all shared, and if I ever annoyed, irritated or even offended you in any way unintentionally, please accept my sincere apologies. I wish all of you the very best, with God’s blessings showered upon you always.
Although my stint at CSGI has come to an end, I am sure we will remain friends, quoting Buzz Lightyear from Toy Story, “to infinity and beyond”.
The fondest memories I shared with each and every one of you are moments I will cherish for the rest of my life, and until we meet again, this is Dinuka (a.k.a. Dinu) signing off from CSGI.
Saturday, December 3, 2016
Is it just about being the best coder?
Having worked in the software development industry for nearly a decade, I wanted to take a step back and reflect on the journey so far. When I initially began my career, it was all about getting on board with the latest technological trends, learning new things I was interested in, experimenting with them and absorbing everything I could about the programming languages that fascinated me. This was very exciting stuff for a young lad just out of university, and I loved every moment of it. I am still an enthusiastic technical geek and will never stop learning, as it is not just a career but a passion.
Pondering whether it is just about being the best coder you can be, I have to say no. Being good at what you do is just the beginning. One of the most important skill sets to build up as you progress in your career is your soft skills. That encompasses your reading, writing and, very importantly, speaking skills. As you progress in your career, it is of paramount importance that you learn the art of communicating effectively with your peers and clients. Some of the best coders I have met during my career struggle when it comes to expressing their thoughts about what they are working on to the outside world, simply because they never gave much thought to improving their soft skills.
My father always said that if you want to improve your language, make reading a daily habit. This was something inculcated in me and my sister from our younger days. I remember my father handing me a copy of Reader’s Digest one day. To be quite honest, I initially read just the “Laughter is the best medicine” and “All in a day’s work” sections because that was where the humor was. As the days went by, reading became a daily routine in my life. I always made it a point to keep a dictionary with me (a digital one, of course, once the smartphone era began), so when I came across a word I did not know, I stopped and learned it. Then I would find a way of remembering it by using it in an appropriate context.
Financial literacy is another important skill to possess as you progress in your career. Your software development career will make much more sense once you understand how your work contributes to the bottom line of your company. I am no expert in the financial domain, yet there are a few books out there that will help you grasp the fundamentals you need. “The Ten-Day MBA” is a clear and concise book that explains these concepts in a straightforward manner.
Although much of a developer’s work is done in isolation, it is always best to be a bit more gregarious and maintain a personal relationship with your colleagues. Learning more about the people you work with enables you to understand them better, which in turn helps you maintain a better working relationship. I currently have the pleasure of working with an amazingly astute team, and having built a personal relationship with each one of them has enabled me to work better with them, as they trust me on that personal level and we work together as one because of that relationship. There is no blame game played; if we fail, we fail as one. That is a strong bond to hold, and even after you leave a company, these relationships will remain.
In ending this short article, I would like to say that these are just my personal opinions. I am sure people will have their own interpretations, and I would definitely love to hear your views. No matter where you are in your career, always remember the following quote as you progress.