2018 Tech Trends

Sheesh, 2018 is right around the corner, where has the time gone?

Ok, maybe we’re still not quite in the age of the Jetsons, with flying cars and regular rocket shuttle buses to the moon and beyond, but that doesn’t mean we’re not living in an age of amazing technology.

And since we’re heading into 2018, I thought it might be appropriate to talk about what kinds of technology trends will become the hot new technology for the new year.

  1. Augmented reality
  2. Big data
  3. Machine learning
  4. Functional programming
  5. Cloud technology
  6. Containerization

1. Augmented Reality

The concept of virtual reality technology embedded into tech devices has been around for many years. I remember my college years, when virtual reality was all the rage and gaming companies released headsets a computer gamer could wear for a much more immersive video gaming experience. When you turned your head, the game world would pivot around you based on your head movement, so it felt like you were actually plopped into the game world you were playing.

The military has used this kind of immersive technology for combat pilots for many years. Head-up displays project information about your airspeed, your position relative to the ground, and all sorts of other real-time data a pilot needs directly onto the inside of the pilot’s helmet, with the goal that the pilot never has to look down from the forward cockpit windshield to see the instrument panel.

In combat situations where split-second decisions could be the difference between life and death, every bit of technology that gives the pilot an edge over their opponent can be the deciding factor for success or failure.

Things like helmets and goggles are still quite cumbersome for most everyday consumer use, yet the quest to embed virtual reality technology into consumer devices continues.

More recently, technology companies like Google, Microsoft, and Apple have created more user-friendly, and less intrusive wearable consumer devices that utilize VR technology.

Google introduced Google Glass, which is a head-mounted display embedded into the shape of regular eyeglasses. It displays real-time digital information about the environment you are looking at, directly into the lenses.

Microsoft introduced the HoloLens device, which goes a little further than Google Glass. Instead of displaying digital information on lenses, it projects holographic images into your view of the real world around you.

Apple recently added augmented reality support to its latest iPhone devices. It embeds and blends real images and video with digital information. For example, you can point your iPhone at a geographic landmark and Apple’s augmented reality technology will embed associated digital information about that landmark into the real-time image you are looking at.

I think we’ve still only scratched the surface of virtual and augmented reality technology. The new year will usher in a whole new category of software apps on your mobile devices that will blend virtual/augmented reality technology directly into our apps to create more immersive end-user experiences.

I can already imagine some practical uses for embedding virtual and augmented reality experiences into mobile device applications.

Students could bring their smartphones and tablet devices to field trips to museums and other educational landmark locations. As they point their devices around at interesting locations and landmarks, information about the historical significance and other data could be presented visually and/or verbally from their devices.

Students living in Oregon could visit portions of the Oregon trail and find information about the kinds of wagons and horses used on the trail, the native plant and animal life indigenous to those areas, and many other educational facts, as they physically travel down those trail locations.

Sports fans could point their smartphones and tablet devices at players during their favorite sporting events and get real-time information and stats about each player, their strengths and weaknesses, their career stats, and all the other information sports junkies crave about their favorite athletes.

E-commerce and retail experiences could undergo a huge transformation with more immersive augmented reality apps.

You could walk into a retail store, point your smartphone at any store item, and get store reviews of those products from past consumers of the item.

Perhaps you want to know how an outfit would look on you, but you don’t want to go through the hassle of going into a fitting room and trying it on. An augmented reality app could scan your body, get your height, weight, and other physical attributes, and based on that information, display a virtual avatar of yourself in those clothes. It could also use machine learning algorithms to determine whether a particular piece of clothing would go well with other apparel you already own.

A shopper could go into a cosmetics store, point their smartphone at a particular mascara, lipstick, or other item, and use augmented reality technology to project how that cosmetic item would look on them without having to actually try it on.

It could also scan the item’s ingredients to verify there is nothing that could cause an unwanted and harmful allergic reaction.

2. Big Data

I’ve built a career around developing software applications for many companies and organizations.

There are many approaches to building software, but I’ve noticed a pattern in the way I build full stack applications.

In the software programming world, you will often hear the phrase “Three-tiered application” or “Model-View-Controller”.

They’re fancy terms for saying that most modern applications are built into layers, much like a cake.

The top or outer layer of the cake which you can visually see corresponds to the UI (user interface) layer of a software application.

The inner layers of a cake correspond to the layers of a software application where all the custom business rules of an application live and where the actual data of an application gets stored.

For example, if you’re building a banking application, you will incorporate a lot of business rules and logic into the business layer of the application. When you withdraw funds from a bank account, for instance, you want to make sure the withdrawal amount cannot exceed the current bank account balance.

Those kinds of rules and logic get embedded into the middle tier or “controller” of a Model-View-Controller layered application.

The data resides in the Model of a Model-View-Controller application. This is where all the important data the application needs to function gets stored.
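To make the layering concrete, here’s a minimal Python sketch of that withdrawal rule living in the middle tier; the class and method names are hypothetical, purely for illustration:

```python
class BankAccount:
    """Model: holds the data the application needs to function."""
    def __init__(self, balance):
        self.balance = balance


class AccountController:
    """Controller: enforces the business rules of the application."""
    def __init__(self, account):
        self.account = account

    def withdraw(self, amount):
        # Business rule: a withdrawal may never exceed the current balance.
        if amount > self.account.balance:
            raise ValueError("Withdrawal exceeds current balance")
        self.account.balance -= amount
        return self.account.balance
```

A view layer would then take the returned balance and render it to the user, keeping display logic completely separate from the rule itself.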

And while each of these layers is important, I’ve come to the conclusion that it’s the data layer of a software application that represents the “crown jewels” of an application.

Think about a company in the business of selling vehicles.

An enterprise application will need to capture very important business segments of that car company.

The sales division of the company will be focused on selling as many vehicles to existing and new customers as possible. So the data which an enterprise application must capture for the sales division of the company will all revolve around the existing customers and new customer prospects, as well as the actual sales order and car inventory information.

The service department of a company will focus on servicing vehicles that need maintenance or repairs and will capture lots of information about repair orders, maintenance schedules, and inventory information about all the individual car parts needed for servicing a vehicle.

How does a company increase their customer base and sales?

They need to analyze previous sales and customer data and look for useful patterns in their sales history and demographic information about their customers.

This kind of valuable information is locked away in the data kept by an organization, and many organizations are realizing how important the collection of that data is to the continued success and growth of their companies.

I’m starting to see new kinds of technology jobs sprout up that address the need to analyze and sift through big collections of data, with titles like “data scientist” or “big data analyst”.

I predict these kinds of jobs will only grow in significance in the future.

3. Machine Learning

Machine learning has always seemed to get a bad rap in Hollywood … the idea that given enough time, if a machine starts thinking like a human, it will begin devising ways to destroy mankind.

In “Colossus: The Forbin Project”, a supercomputer in charge of America’s defense system grows so powerful and learns so much in so short a period of time that it deems mankind too ineffective to take care of itself, so it decides to become the new machine god for mankind.

In “2001: A Space Odyssey”, my favorite supercomputer, the HAL 9000, attempts to kill every crew member of the Discovery spaceship, on its journey through the solar system.

And probably the ultimate computer-gone-amok film series, “The Terminator”, is all about a sentient computer and killer robots, dedicated to destroying mankind at all costs.

Yet there are lots of real-world benevolent tasks that machine learning is already in charge of.

Google’s self-driving car technology utilizes machine learning. When Amazon and Netflix recommend new movies, books, and other purchasable items, it’s machine learning that is tirelessly assessing your history and making new recommendations for you.

Machine learning is essentially teaching a computer to automatically learn new things and improve from experience, without being explicitly programmed to do so.
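As a toy illustration of learning from experience rather than from hand-written rules, here’s a minimal nearest-neighbor classifier in Python; the data points and labels are made up for the example:

```python
def nearest_neighbor(examples, point):
    """Classify a point by copying the label of its closest known example.

    `examples` is a list of ((x, y), label) pairs the machine has "seen".
    Nothing here is explicitly programmed per label -- the behavior comes
    entirely from the training data.
    """
    def squared_distance(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    closest = min(examples, key=lambda ex: squared_distance(ex[0], point))
    return closest[1]


# The "experience": labeled observations, not rules.
training = [((1, 1), "cat"), ((1, 2), "cat"),
            ((8, 9), "dog"), ((9, 8), "dog")]
```

Feed it more labeled examples and its answers improve, which is the whole idea in miniature: behavior driven by data, not by code changes.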

And I believe machine learning will grow even more in importance and establish itself in pretty much every industry and discipline that requires some sort of automation.

4. Functional Programming

Back when I began my professional software development career in the mid-1990s, object-oriented programming was becoming the new standard paradigm for software development.

A programmer broke down project requirements or a problem statement into a set of objects. The goal of OOP was to reduce the complexity of programming large-scale applications by breaking everything down into smaller, bite-sized objects.

If a programmer needed to model a vehicle, you could break down individual vehicle components into human concepts like objects. For example, the vehicle’s engine could be modeled with one type of object. The wheels, another kind of object, and so on.

OOP is still very much in play today, but as previously mentioned, what many companies and organizations are finding is that the true value of a software application isn’t the application itself, but the raw data that the application processes.

And functional programming languages make processing and querying large datasets much easier, hence the rise in popularity of functional programming.
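Here’s a small Python sketch of that functional style: querying a dataset with a filter/map/reduce pipeline instead of hand-written loops and mutable state. The sales records are invented for the example:

```python
from functools import reduce

# Hypothetical sales records for a vehicle dealer.
sales = [
    {"model": "sedan", "price": 24000},
    {"model": "truck", "price": 41000},
    {"model": "sedan", "price": 26000},
    {"model": "suv",   "price": 35000},
]

# A declarative pipeline: keep the sedans, extract their prices,
# then fold the prices into a single total -- no loop variables mutated.
sedan_revenue = reduce(
    lambda total, price: total + price,
    map(lambda sale: sale["price"],
        filter(lambda sale: sale["model"] == "sedan", sales)),
    0,
)
```

The same shape scales up: big-data tools apply exactly this filter/map/reduce pattern across machines, which is why the functional style fits data processing so naturally.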

And I think this trend of big data and functional programming to process this big data will become even more popular as the years go by.

5. Cloud Technology

Back in the old days of software development, companies wouldn’t dream of allowing outside organizations to shepherd their IT applications and data.

Back then, a company would control and maintain all its digital software inventory in-house, but at the price of huge maintenance and support costs.

What Amazon introduced with Amazon Web Services, or AWS for short, was a way for organizations to free themselves from the burden of maintaining and supporting their apps and data, and let Amazon take care of it for them.

And it’s caught on like wildfire with many organizations and given Amazon a huge new revenue stream in the process.

Microsoft is getting on the bandwagon with their own Azure services, and it’s safe to say that cloud computing and hosting is here to stay.

6. Containerization

It’s easy to take for granted just how much work and effort goes into building and running modern day software applications.

For web-based software applications, you need a web server to serve up all the web pages of the application, a database to house all the data needed by the application, perhaps a file server to house external files needed by the application … all these dependencies and components don’t just magically get created out of thin air.

You need dedicated folks who have the know-how and expertise to spin up each of these individual components.

Containerization simplifies a lot of this configuration complexity by encapsulating all these interdependent components and their dependencies into a single, standalone executable bundle.

Your application can run in a nice virtualized and compact environment that is easy to replicate on new machines, thanks to container technology, and Docker seems to be at the forefront of container technology.
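As a sketch of what that bundling looks like in practice, here’s a minimal Dockerfile for a web application; the base image, file names, and entry point are assumptions for illustration, not a prescription:

```dockerfile
# Start from an official Python base image (assumed here for illustration).
FROM python:3.10-slim

# Copy the application and its declared dependencies into the image.
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

# The container bundles the runtime, libraries, and app into one unit,
# so the same image runs identically on any machine with Docker installed.
CMD ["python", "app.py"]
```

Building the image once (`docker build -t myapp .`) and then running it anywhere (`docker run myapp`) is what replaces the manual, machine-by-machine setup described above.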

It takes the hassle and tedium out of getting your software applications and systems to run in virtually any computer environment, and the popularity of container technology will only continue to grow over time.

Conclusion

Ok, so 2018 probably won’t bring us flying cars and regular rocket shuttles to the moon.

But that doesn’t mean we’re not seeing formerly science fiction flights of fancy turn into hard reality.

I, for one, am still looking forward to what’s coming around the corner.

About ProFocus

ProFocus is an IT staffing and consulting company. We strive to connect a select few of the right technology professionals to the right jobs.

We get to know our clients and candidates in detail and only carefully introduce a small number of candidates that fit the role well.