Acquiring new data skills for engineers during the COVID-19 lockdown
A crisis like COVID-19 upends our routines, and in doing so it forces us into new ways of thinking and doing. Sure, you may not always feel like creating the next Ruby on Rails, but you finally have thirty minutes to spare to complete that course, develop that new skill, and level up your knowledge.
Learning online can give your existing job a new lease of life or help you stand out for a new opportunity. This is especially powerful if you also put newly acquired knowledge into practice in your own projects.
But where to begin?
As a fast-growing start-up, we interview and hire many industry professionals who have supplemented their CVs with online learning. Because we're big supporters of a lifelong learning mentality, we constantly encourage our employees to continue investing in their development even after their onboarding training. We’ve even helped those coming from a non-technical background to gain expertise independently by using the right resources.
So what’s the priority on your to-do list? What skills and additions to your CV will make you stand out to recruiters or put you in good stead with your manager?
Here are some key pointers that will make a difference:
1. Gain Certifications
Don’t underestimate the power of proving your knowledge with a certification. But choose carefully: which certifications deserve your time? Which will bring the best results? Official cloud certifications, for example, are certainly worth the effort, as the demand for data engineers with cloud proficiency will only continue to grow as companies digitize and remote work becomes the new normal:
It might be time for you to become a Microsoft certified data engineer! Whether you do it on your own by watching detailed modules or you choose the instructor-led option, this course helps you unlock the potential of the Azure suite to satisfy business needs.
And what about the Google Cloud Platform Certification? This exam assesses your ability to design, build, and operationalize data processing systems with security and scalability in mind.
And if these certifications are a piece of cake for you, then go ahead and schedule your exam for the AWS Certified Big Data – Specialty Certification. This one is aimed at experienced professionals who have used AWS technology for at least two years, and it validates your skills in Big Data practices.
2. Explore more Big Data tools
You might have significant experience with several big data tools, but there are always more to discover. Perhaps Apache Spark sparks something in you, or the Apache Cassandra database offers a new option for managing your data. Don’t forget Apache Hadoop (focusing on HDFS, HBase and Hive), or the powerful Apache Kafka.
How can you make the most of those tools? Thanks to Docker, there are various open-source tools that you can easily download to tinker with in your own time. So, for example, if you are ready to dive into Apache Kafka, you can download Lenses Box, which is essentially Kafka and Lenses bundled inside a single Docker image. Just like that, you have it all in one place! Try out Kafka commands, sample data, explore Kafka Connect, play around with Lenses SQL, start streaming, and more.
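As a minimal sketch, spinning up Lenses Box might look like the following (assuming you have Docker installed and have registered for a free Lenses Box license key; `<your-license-url>` is a placeholder you must replace, and the image name and ports follow the Lenses Box documentation at the time of writing):

```shell
# Run Lenses Box: Kafka plus Lenses bundled in a single container.
# Port 3030 serves the Lenses web UI at http://localhost:3030,
# port 9092 exposes the Kafka broker, and ADV_HOST makes Kafka
# advertise itself on localhost so local clients can connect.
# Replace <your-license-url> with the free key you receive on registration.
docker run --rm -p 3030:3030 -p 9092:9092 \
  -e ADV_HOST=127.0.0.1 \
  -e EULA="<your-license-url>" \
  lensesio/box
```

Once the container is up, the Lenses UI in your browser is a comfortable place to start exploring topics and SQL queries before dropping down to the raw Kafka CLI tools.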
3. Look into the theory, not just into tools
It is a universal truth: you can’t get your hands dirty solving problems and equations without reading some theory first. How would you know whether you’re solving the right problem, and how to frame it? The same applies to data science. Try getting familiar with algorithms, data structures, or SQL. And don’t be scared of the math you will find along the way - just start with simple concepts, like averages or percentiles. Browse through Coursera or Udemy to find courses that focus on the terminology behind the execution.
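To see why even those simple concepts matter, here is a quick sketch in Python (the latency numbers are made up for the demonstration): the mean and the median of the same data can tell very different stories once outliers show up, which is exactly the kind of intuition the theory gives you.

```python
import statistics

# Hypothetical request latencies in milliseconds, with two slow outliers.
latencies_ms = [12, 15, 11, 230, 14, 13, 16, 18, 12, 500]

mean = statistics.mean(latencies_ms)      # pulled upward by the outliers
median = statistics.median(latencies_ms)  # the 50th percentile, robust to them

print(f"mean={mean}, median={median}")  # mean=84.1, median=14.5
```

A service reporting an "average latency" of 84 ms sounds slow, yet half of all requests finished in under 15 ms - knowing which summary statistic to reach for is theory paying off in practice.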
4. Build your own projects
Then you can put your theory into practice. To be considered for a senior position, you would usually prove your skills by building your own projects. And if you are looking for public datasets to make your project reflect real-world scenarios, GitHub’s Awesome Datasets provides a topic-centric list of high-quality open datasets covering all the major industries and sectors to experiment with. Using realistic data enables you to solve real problems in a safe environment and will level up your experience and understanding of real-world cases.
Cool stuff, huh? You are probably already prioritizing your new to-do list, but don’t forget to take it slow. This is a marathon, not a sprint. Your professional development is a lifelong investment and should be enjoyed at a sustainable pace that suits your situation and helps your skills stay with you. Take time to process and gain a deep understanding of everything you do, instead of trying to do everything at once. Multitasking is useful when responding to a changing environment, but if you’re deep-diving into a topic, switching contexts can easily lead to becoming a jack-of-all-trades - and a master of none.
So now that you have the time management factor figured out, go on and start preparing for your next cloud certification, or exploring that Big Data tool you always wanted to master.