smartShift Technologies Achieves AWS Migration Competency Status

smartShift Technologies has been awarded Amazon Web Services (AWS) Migration Competency Status, a certification that highlights our capabilities in moving enterprise workloads to the cloud. AWS is a leader in the public cloud space, and provides infrastructure as a service (IaaS), platform as a service (PaaS), and packaged software as a service (SaaS) offerings. smartShift is proud to be an AWS Partner with multiple competencies, providing our clients with everything from initial assessments through architecture design and migration, helping them take full advantage of the cloud. By achieving AWS Migration Competency Status, smartShift has proven that it has the skills to help safely and efficiently move our customers to AWS, and that we can successfully execute all phases of complex projects.

“We are happy to add the AWS Migration Competency Status to our list of credentials. This exemplifies our ongoing 10+ year partnership commitment to AWS and our ever-expanding capabilities to bring more customers and more workloads the mutual benefits of smartShift and AWS,” said Vyom Gupta, EVP, Enterprise Cloud Solutions, smartShift Technologies. “With both our philosophies aligned around ‘customer obsession’ and ‘automation,’ we will continue to bring better, faster, and less expensive offerings to our mutual customers.”

As an AWS Advanced Partner, smartShift already holds MSP Partner status, the Oracle Competency, and Solutions Partner status. Adding the AWS Migration Competency Status confirms that smartShift is highly capable and can expertly navigate the move to AWS. This competency status helps differentiate smartShift from the competition, and provides reassurance to customers that we have the experience and know-how to deliver a successful transition.

To achieve the Migration Competency, smartShift had to go through an in-depth audit with AWS. The audit included a rigorous review of actual customer migrations executed by smartShift. Our team had to demonstrate proficiency across the completed projects, processes, and tools used to make each client successful.

Our customers can rest assured that our team has a track record of proven success, and that our familiarity with AWS will not only help deliver their migration quickly and successfully, but also enable us to provide them with services and support beyond migration that will make their business more effective and efficient.

If your company is ready to work with a proven partner to migrate to AWS, contact us now to get started.

Rapidly transform legacy SAP systems with less risk: DXC Technology and smartShift Technologies partnership

smartShift Technologies brings the power of intelligent automation to DXC Technology’s portfolio of SAP solutions and services. The partnership enables our SAP customers to gain an unparalleled, automated, end-to-end solution to safely and quickly migrate, modernize and manage SAP applications and infrastructure — a significant step toward digital transformation.

Customers embracing digital change need an innovative and agile partner that can help build and manage their digital core platform. Together, our two companies help customers migrate to S/4HANA — on premises or in a hyperscale cloud infrastructure — faster, more safely, and more cost-effectively, without disrupting the business.


An S/4HANA Conversion to the Cloud

To date, smartShift has completed more than a dozen conversions of ECC 6 systems to S/4HANA. One recent conversion involved an unusual infrastructure requirement, and we thought it would make for a good story.


Of application understanding and machine learning, or where I am coming from

For the past 25 years, I have been working in the area of automation. But this is not where I started my career. During my master’s program at USC, I was first introduced to artificial intelligence, a field that immediately fascinated me. At that time, jobs related to AI were few and far between, but my first job was indeed relevant: working with rule-based decision systems.

Since then, the field of artificial intelligence has progressed a great deal, for the most part without grabbing headlines. Only in recent years has it captured wide attention once more, particularly the field of machine learning. It reminds me of the hype that the web and the “dot com” era introduced in the 1990s. Bubble, anyone? Only this time around, the advances in the field have provided a solid foundation to sustain machine learning’s expansion.

My second job introduced me to the world of application and language understanding, and this is where I have been active for many years. A lot of fascinating things happened during that time, and as you will see, things are now headed back to where I started. The cycle is closing.

Understanding applications and their code

In the beginning, there was the mainframe: a beast of power that could handle a tremendous number of transactions, but which came at a cost. Next, “the network was the computer,” as Sun Microsystems heralded to the world. Companies now had alternatives to the mainframe monoliths.

There was a problem, however. Successful companies had invested a lot of money in creating mainframe processes that gave them the edge, and of course reams of custom code to go with them. What to do with all that custom code now? The will was strong, but the code was weak, proprietary, and incompatible. And we should not forget that relational databases were not yet taken for granted. Hierarchical and network database management systems were still strong and, if you ask me, based on elegant concepts and very good at what they did. But history is (re)written by the winners, so servers and relational databases it became.

The role of automation

The word of the day was Migration. Migration away from the mainframe monoliths to client-server. But that required splitting applications into layers in a way that fit the client-server model: front-end, business, and database layers. This was a tall order, since the challenge was multi-level:

  • Front-end:  Going from green screen terminals to fancy graphical terminals with WYSIWYG capabilities (yes, WYSIWYG was a feature back then)
  • Network:  Statefulness was the norm, so the correct handling of state was paramount
  • Database: Consistent transaction handling, remaining compatible with transaction managers (not everybody wanted or was willing to switch at the same time)
  • New languages:  COBOL was prevalent for business logic. However, the new server world supported additional languages, such as C, C++, and Java.
  • Operating systems:  Enter Unix and its numerous flavors, Windows and X for front-end interaction.

Clearly, a disciplined approach was mandatory in order to succeed. And this is where application understanding and automation came into the picture. Application understanding may mean different things to different people, but here we will define it as a way of extracting the key attributes and features of an application. The extraction is done programmatically, in a repeatable manner, and the results are stored in a repository as a meta-model of the application and its components. The attributes and features need to be carefully selected so that they enable smart decision-making when it is time to “rewrite” the application. “Rewriting” actually means both refactoring and rearchitecting: refactoring applies to the code, rearchitecting to the application. In our case, rearchitecting was the redistribution of the application components to best match the client-server paradigm. In most cases, breaking up monoliths meant both rearchitecting and refactoring.
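To make the idea of programmatic extraction concrete, here is a minimal, purely illustrative Python sketch — not smartShift’s actual tooling. It uses Python’s own `ast` module as a stand-in source language and records each function’s name, parameters, and outgoing calls as a tiny meta-model that a later rewriting step could query.

```python
import ast

def extract_meta_model(source: str) -> dict:
    """Walk the syntax tree and record key attributes of each function:
    its name, its parameters, and the names it calls. Together these
    form a tiny 'meta-model' of the application's components."""
    tree = ast.parse(source)
    model = {"functions": []}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            calls = [
                n.func.id
                for n in ast.walk(node)
                if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)
            ]
            model["functions"].append({
                "name": node.name,
                "params": [a.arg for a in node.args.args],
                "calls": sorted(set(calls)),
            })
    return model

# A one-function "application" as input; the repository holds its meta-model.
repository = extract_meta_model(
    "def total(prices):\n"
    "    return sum(prices)\n"
)
print(repository)
```

A real system would of course extract far richer attributes (data flow, screen definitions, transaction boundaries), but the principle is the same: repeatable, programmatic extraction into a queryable repository.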

The object-oriented paradigm

Most of the code we had to deal with in monolithic systems was written in procedural or reporting languages such as COBOL. Also common was the use of generators, mostly for code, screen definitions, and database artifacts. Early refactoring was performed using C, which at the time was the lingua franca of Unix for the business and database layers. For front-end coding, C++ was to some degree the best option for taking advantage of Windows GUI capabilities.

At the dawn of the millennium, Java gained a strong foothold among developers. By now it has also become the “favorite” in enterprise development. Java introduced a wider developer audience to some key aspects of programming: object-oriented programming, separation of concerns, virtual machines, software packaging, and interoperability.

Our use of a meta-model and a repository allowed us to build infrastructure and tools that were flexible and platform-neutral. This was all due to our intelligent automation approach, which allowed pluggable support of source and target languages and the ability to assimilate and make the best use of numerous platforms.
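What “pluggable support of source and target languages” can look like is sketched below in Python. Everything here is hypothetical and invented for illustration — the registries, the toy COBOL parser, and the toy Java emitter — but the shape is the point: each language plugs in a parser or an emitter, and a language-neutral pipeline connects any source to any target through the shared meta-model.

```python
from typing import Callable, Dict

# Hypothetical plugin registries. Real parsers and emitters would be full
# components; here a dict stands in for the language-neutral meta-model.
PARSERS: Dict[str, Callable[[str], dict]] = {}
EMITTERS: Dict[str, Callable[[dict], str]] = {}

def register_parser(lang: str):
    def deco(fn):
        PARSERS[lang] = fn
        return fn
    return deco

def register_emitter(lang: str):
    def deco(fn):
        EMITTERS[lang] = fn
        return fn
    return deco

@register_parser("cobol")
def parse_cobol(code: str) -> dict:
    # Toy parser: wraps the source in a neutral meta-model record.
    return {"kind": "program", "body": code.strip()}

@register_emitter("java")
def emit_java(model: dict) -> str:
    # Toy emitter: renders the neutral model as a Java comment stub.
    return f"// migrated from {model['kind']}: {model['body']}"

def transform(code: str, source: str, target: str) -> str:
    """Language-neutral pipeline: parse with the source-language plugin,
    then emit with the target-language plugin."""
    return EMITTERS[target](PARSERS[source](code))

print(transform("DISPLAY 'HELLO'.", "cobol", "java"))
```

Because the pipeline only ever sees the meta-model, adding a new source or target language means registering one more plugin, not rewriting the pipeline.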

ERP and SAP®

Another important trend that gained steam in the mid-1990s and has kept growing ever since is ERP software and platforms. For us as a company, one particularly important piece of ERP software was, and still is, SAP. Our intelligent automation approach has enabled us to follow its progress over the last 10-15 years, and we will continue to do so in the future. The earliest applications of our automated solution addressed mostly code refactoring, with a strong focus on ABAP, the programming language of SAP ERP. We were, and still are, there to support SAP customers with our intelligent automation, particularly as SAP started its journey to digitalization with the support of in-memory databases, the launch of S/4, and the move to the cloud.

Closing words

The beginning of a new year is always a good opportunity to take stock, reflect, and plan ahead. As you probably know by now, history repeats itself. Intelligent automation means staying alert and keeping an eye out for new opportunities. As I hinted earlier, I see the cycle closing after all these years, especially from my perspective. I am happy to say that we are now adopting machine learning in our automation repository as a further tool to drive application understanding and to open new possibilities for our tools and customers. In that sense, I would like to wish all of you a successful start to the new year and to whatever new chapter you are about to open. From our side, our new year’s resolution is to publish more blogs about us, our intelligent automation approach, and the technologies we are using.

Niko Faradouris, Senior Technical Architect, smartShift Technologies, Mannheim

Automated Transformation: Shift Gears

When your ERP software supplier provides an upgrade or enhancement, or the business demands new functionality only available in a new release, how do you organize the migration of your many customizations?

Most companies start a manual migration process with a time-consuming and labour-intensive proof of concept, while facing an unpredictable migration outcome. Uncertainty remains, and the upgrade increases the risk for the business, as it potentially affects daily operations.

Wouldn’t it be nice to test-drive an automated migration and get an exact project plan and budget calculation? With today’s technology that is possible, eliminating risk and presenting a 100% predictable outcome to the business.

Remove uncertainty through automated migrations

Uncertainty in projects, especially in the migration of customized code, is highly undesirable, both in time and in budget. You want the freeze period to be as short as possible, and no budget overruns caused by insights that only emerge mid-project. Even more importantly, you want to have all the insight before the project starts.

Over the last few years, upfront automated analysis has become available and is offered by a variety of vendors, but what you really want is to automate the migration itself.

Automated migration is more than just automated analysis

An exact prediction of the outcome from automated analysis is only valuable if you can follow up on the promise: delivering the full migration through automated transformation in a fixed-price, fixed-time manner. Only then will you dramatically reduce the risk, cost, and cycle times of complex IT transformations.

Experience the added value of new enterprise software versions …right now!

The added value of using the newest versions of enterprise software is clear, but the road to achieving it can be uncertain and unpredictable. With automated transformation of custom code, however, that value is within reach, with drastically reduced costs and business risks, exceptionally high quality, and very short freeze periods.

Companies should focus on the quickest possible way to benefit from new features and functions offered by their enterprise software supplier. Automated transformation is the way forward to quickly materialize the business value of the latest releases of enterprise software.