What comes next after Kubernetes in app-infrastructure?

Jonas Bonér, CTO and co-founder of Lightbend, says there is a huge gap between the infrastructure and building a full application. This means that, in the near future, developers will need more tools in the toolbox and will need to extend the infrastructure's model of isolation into the app itself, creating a powerful yet simple programming model.

But when a technology has reached a certain level of trust, when it is well understood and easily managed, you can call it "boring," thus paying it the best compliment there is. Kubernetes has become just that: standard cloud-enabling plumbing that "works."

Tesla, for example, relies on "digital twin" capabilities that power its electric grid, capabilities made possible by the combination of Akka and Kubernetes. Colin Breck, a Tesla engineer, says, "The majority of our microservices run in Kubernetes, and the pairing of Akka and Kubernetes is really fantastic."

What are the unsolved areas of the cloud-native stack that are evolving above Kubernetes? According to Bonér, there are three: application layer composition, stateful use cases, and data-in-motion use cases.

Stateful use cases

Most of the cloud ecosystem mainly tackles so-called 12-factor-style applications. Unless you have a good model and the tools to support it, the cloud forces you back into a three-layer architecture where all state is pushed down into the database on every request.

"The value is nowadays often in the data, and it's often in the stateful use cases that most of the business value lies — making sure you can access that data fast, while ensuring correctness, consistency, and availability," Bonér says.

Data-in-motion use cases

The Kubernetes ecosystem doesn't yet offer great support for streaming and event-based use cases.

"Serverless gets us closer to addressing the problem of extending the model of Kubernetes into the application itself. That's what it's all about: abstracting away as much as possible, and moving to a declarative model of configuration rather than coding, where you define what you should do and not how you do it," says Bonér.

Application layer composition

"People too often use old tools, habits, and patterns, often originating from the traditional (monolithic three-tier) designs that inhibit and constrain the cloud model delivered by Kubernetes," Bonér says. So what needs to be done is to extend the model of containers, service meshes, and orchestration all the way up to the application/business logic. This would leave the developer with only the essence: the business logic and its workflow.

 

All in all, as the cloud-native stack continues to evolve above the Kubernetes infrastructure level, it will be interesting to see how these concepts play out to serve specific language developers.

Read more on the topic here: https://www.infoworld.com/article/3567648/what-comes-after-kubernetes.html

 

In 2021 it seems we’ve had 2 constants: the world living with the coronavirus pandemic and a steady flow of tech acquisitions.

Global tech merger & acquisitions deals last year totaled $634 billion, a 91.8% year-over-year increase, according to GlobalData. And some of the big deals were the $35 billion acquisition of Xilinx by Advanced Micro Devices and Salesforce’s $27.7 billion acquisition of Slack.

But they're not the only ones worth looking at. We've selected some other tech acquisitions that took place this year which will most likely reshape the tech environment as we know it.

IBM delves into observability for customers

IBM announced the acquisition of Turbonomic at the end of April.

Turbonomic specializes in Application Resource Management (ARM) and Network Performance Management (NPM) software. Turbonomic uses machine learning to spot application performance issues and optimize underlying resources. This applies to containers, VMs, servers, storage, networks, and databases.

This acquisition will help IBM offer a greater range of AIOps and observability options to customers, particularly through its IBM Cloud Pak for Watson AIOps.

Microsoft boosts its upstream open-source contributions

Microsoft made a move to boost its capabilities in the Kubernetes space with the acquisition of German firm Kinvolk. This also took place in late April.

Founded in 2015, Kinvolk has been building enterprise-grade tools to help developers adopt cloud-native technologies. These include Kubernetes, Flatcar Container Linux, and the Lokomotive and Inspektor Gadget projects.

Microsoft expects to integrate the Kinvolk team and technology into the team responsible for its managed Azure Kubernetes Service (AKS). This will boost Microsoft’s upstream open-source contributions.

UiPath takes step forward for its enterprise-ready platform

On March 23rd, RPA vendor UiPath made an addition of its own, picking up the Denver, CO-based firm Cloud Elements.

Cloud Elements specializes in API integration, similar to Mulesoft and Apigee, which are now part of Salesforce and Google, respectively.

For UiPath, a Romanian-born start-up, this capability could allow customers to better link processes that span various enterprise systems to build more effective automations.

SAP digs deeper into cloud-native enterprise intelligence

German software firm SAP announced it’s acquiring fellow German firm Signavio, which specializes in cloud-native enterprise business intelligence for processes and management, in late January.

This acquisition will add Signavio-designed solutions to the bundle of existing SAP software and services aimed at offering customers “business transformation-as-a-service”.

SAP will aim to use Signavio’s expertise around business process intelligence to help more customers optimize these processes as they become more digital.

Qualcomm strengthens semiconductors market position

No doubt about it: 2020 brought a burst of semiconductor consolidation. So, hot on the heels of this phenomenon, Qualcomm announced in early January that it was acquiring Nuvia. This 2021 tech acquisition led the way for the year's upcoming M&A operations.

Nuvia was founded by a team of Apple engineers and makes high-performance CPU chips.

Together, the two companies will be positioned to deliver a new class of products and experiences for the 5G era.

You might also find interesting: RPA usage in SMEs, on the rise in Eastern Europe

A study was recently conducted among small and medium-sized enterprises in Eastern Europe. The subject: the adoption, dissemination and evolution of RPA in business processes.

For these companies, RPA has become increasingly important during the Covid-19 pandemic, not least for a successful remote workflow.

More to the point, almost 1 in 4 small and medium-sized companies in Eastern Europe confirm that they have a need for software robots, according to a survey published on a business portal.

RPA can help boost efficiency and lower costs

Almost half (47.4%) of the companies believe that intelligent software robots are useful to eliminate repetitive actions.

At the same time, almost 30% of companies believe that the use of software robots can lead to an increase in the efficiency of remote work.

Just as important, companies believe that by integrating RPA technologies they will get lower costs (31.6%) and increased sales (18.4%).

In addition, 39% believe that the team would benefit from the support of RPA technology to leave boring and repetitive tasks to intelligent software robots.

You might also find interesting: Coding activates the brain differently from maths

The dangers of RPA

Only 8% of Eastern European SMEs believe that their teams may be reluctant to integrate technology into the company’s operational processes.

In addition, 29% believe that there may be some fears among employees that software robots could replace certain positions in the company.

High costs, yet viable investment

RPA is an industry that has accelerated strongly in recent years. In Eastern European countries such as Romania, there’s been a rise in global providers of solutions based on this technology.

More to the point, 1 in 4 companies in this area are already using automation technologies. Half of them say they are considering integrating RPA technology into their business this year.

The main hurdle to RPA adoption? 1 in 2 companies cite the high cost, while 1 in 3 believe they will face a lack of employee training on the adoption and use of such a technology.

Neuroscientists from MIT have discovered that brain activity while coding differs from processing language or doing mathematics.

Many people liken coding to learning a new foreign language. And, granted, there are certainly many similarities. To the brain itself, however, it seems to be quite different.

Researchers took fMRI brain scans of young adults in a small coding challenge, using both Python and visual programming language ScratchJr. The purpose was to see what parts of their noggins lit up.

Almost no response was seen in the language processing parts of the brain.

Instead, it appears that coding activates the ‘multiple demand network’ of our brains. This area “is also recruited for complex cognitive tasks such as solving math problems or crossword puzzles.”

Yet when solving maths problems directly, slightly different brain activity patterns emerge.

The multiple demand network is spread throughout the frontal and parietal lobes of the brain. Previous studies have found that math and logic problems dominate the multiple demand regions in the left hemisphere. Tasks involving spatial navigation lean on the right hemisphere more than the left.

Coding activates both the left and right sides of the multiple demand network. This counters the belief that it causes the same brain activity as maths. One interesting fact: ScratchJr activated the right side slightly more than the left.

You can find a full copy of the study here: https://www.biorxiv.org/content/10.1101/2020.04.16.045732v2.full.pdf

You might also find interesting: https://www.vonconsulting.net/ai-automated-coding/

Guido van Rossum launched Python on February 20th, 1991. Python is known as an incredibly versatile language. It is used in developing some of the most popular web applications, from Instagram to Dropbox.

At the same time, it is a gateway language for many in the world of software development.

Moreover, it is frequently taught to schoolchildren and people worldwide who lack any prior programming experience.

Read more details here: https://www.vonconsulting.net/study-python-is-the-top-programming-language-of-2020/

One reason for the popularity of this programming language lies in its simplicity. Its users do not need to understand compilers or assemblers, nor the many other low-level details some programming languages require.

Feedback is instant, and Python is improving all the time. In addition to its popularity among entry-level users, Python is rapidly becoming a priority within the business environment. It has also found favor as a "glue language."

Large development projects always involve a trade-off between scale and speed. The typical software stack a large organization uses every day may include code written in several different languages. Moreover, the underlying data may be stored in numerous formats, languages, and locations.

In such environments, Python has taken root as a subtle but powerful bridge between different applications and code libraries.

When Python is used as glue code around compiled languages, development cycles are shortened, results are more interactive and quicker to observe, and delays such as long compile times are eliminated.
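As a small, hedged illustration of the glue role (the data and tool names here are invented for the example), a few lines of standard-library Python can bridge the output of one tool into the input of another, with no compile step in between:

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Glue step: parse CSV emitted by one tool, produce JSON for the next."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)

# Imagine this came from a legacy reporting tool:
raw = "name,language\nInstagram,Python\nDropbox,Python\n"
print(csv_to_json(raw))
```

Feedback is immediate: run it, inspect the output, adjust — exactly the short development cycle described above.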

You might also find interesting: https://www.vonconsulting.ro/tech-trends-in-pandemic-times/

Researchers from MIT and Intel have created AI automated coding. Its name? MISIM, an algorithm that can create algorithms. What does that mean for software developers?

For most of us, writing code is like learning a foreign language. But no more! A team of researchers from MIT and Intel are looking to change all that by building AI automated coding.

The new technology is named MISIM (Machine Inferred code Similarity). MISIM studies snippets of code to understand what a piece of software intends to do. It uses a pre-existing catalogue of code to understand the intent behind new code.

Will this actually help software developers? The Intel-MIT team says yes. MISIM will help developers by suggesting other ways to "attack" a program, and will aid them by offering corrections and options that make the code more efficient.

The principle behind MISIM is not new. Technologies that try to determine whether a piece of code is similar to another already exist and are used by developers, but they focus on how code is written rather than on what it intends to do. MISIM can act like a recommendation system, suggesting different, faster, and more efficient ways to perform the same computation.
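MISIM itself is not public, but the idea of comparing code by structure rather than surface text can be sketched in a few lines. The toy below alpha-renames every identifier in a Python syntax tree, so two snippets that differ only in naming compare as equal; this is merely a stand-in for the far richer intent analysis the article describes:

```python
import ast

class _Renamer(ast.NodeTransformer):
    """Replace every identifier with a positional placeholder (v0, v1, ...)."""
    def __init__(self) -> None:
        self.names: dict[str, str] = {}

    def _placeholder(self, name: str) -> str:
        return self.names.setdefault(name, f"v{len(self.names)}")

    def visit_FunctionDef(self, node):
        node.name = self._placeholder(node.name)
        self.generic_visit(node)
        return node

    def visit_arg(self, node):
        node.arg = self._placeholder(node.arg)
        return node

    def visit_Name(self, node):
        node.id = self._placeholder(node.id)
        return node

def normalize(source: str) -> str:
    """A naming-insensitive fingerprint of the code's structure."""
    return ast.dump(_Renamer().visit(ast.parse(source)))

snippet_a = """
def total(xs):
    s = 0
    for x in xs:
        s += x
    return s
"""
snippet_b = """
def add_all(items):
    acc = 0
    for i in items:
        acc += i
    return acc
"""
print(normalize(snippet_a) == normalize(snippet_b))  # True
```

The two snippets are textually different but structurally identical, so they get the same fingerprint; a snippet with genuinely different logic would not.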

As software development becomes more and more complex, technologies such as MISIM could have a significant impact on productivity, in the opinion of Justin Gottschlich, who leads Intel's machine programming research team.

More details about the MISIM algorithm here: https://www.zdnet.com/article/software-developers-how-plans-to-automate-coding-could-mean-big-changes-ahead/

You might also find interesting: https://www.vonconsulting.net/study-tech-trends-in-pandemic-times-1-in-4-people-learned-to-code-during-lockdown/

Although it was created 30 years ago, Python seems to be holding firm ground in the top programming languages of 2020, according to a study cited by www.developer-tech.com.

Ideally suited for artificial intelligence and web development, still considered easy to learn, and taught in many universities worldwide, Python scored 100 points in a survey meant to identify not only programming-language preferences but also programmers' unique needs and interests.

Java takes second place

Newer – and younger – than Python, Java was created 25 years ago and is known for its versatility, with the language powering mobile, desktop, and web applications and games.
It ranks ahead of Ruby, R, and Arduino – to name but a few – in programmers' preferences and usage, and it is still a popular option for the world's most used mobile operating system, Android, despite intense lobbying for Java to be replaced by Kotlin.
Java scored 95.3 points on the scorecard.

Just 1 point behind Java – C

C is the oldest of the top three languages ranking in this survey. C was created 48 years ago, in 1972, and continues to be the language primarily used for system development work, such as drivers and operating systems, but also applications that require large amounts of calculations.

C and C++ scored 94.6 points and 87 points respectively in the aforementioned study.

The R-ise of R

Universities and research institutes embrace Python and R for their statistical analyses. And now more than ever, lots of statistics and data mining need to be done to find a vaccine for the Covid-19 virus. As a consequence, statistical programming languages that are easy to learn and use have gained noticeable popularity now.

A statistical language, R saw an interesting rise in another ranking, the TIOBE index, where it sits just behind Visual Basic, with the top 5 held by C, Java, Python, C++, and C#.

 

Article brought to you by VON Consulting Tech Division. People. Quality. Tech.
VON Consulting Tech Division is a start-up operating also in Düsseldorf, Germany, which provides hardware design and verification services, IT support and software development for customers in different industries, mainly in IT, telecom, and networking and semiconductors industries. See more on http://www.vonconsulting.net.

Doors shut on many plans during the Covid-19 pandemic. Plans turned to day-by-day approaches, particularly where daily livelihood was concerned and job stability had to navigate mass layoffs and furloughs in some fields.

Yet, despite this rather bleak scenery, a notable phenomenon emerged: employees turned to learning new skills to keep their leverage in the job market, as well as to gain a new sense of personal development. Case in point: technology proficiency.

The golden top 3 podium: Python, Java and C++

According to a study that collected data in August 2020 from more than 1,000 people in the United States, which is cited by www.developer-tech.com, around 1 in 4 people spent time learning coding languages during the lockdown.
The most commonly learned programming languages were Python, followed by Java and C++.

Millennials, most engaged with new tech trends

70% of the study respondents said their technology skills moderately or greatly improved since the Covid-19 outbreak. Breaking it down by generation, millennials were the most likely to have improved their tech skills, at nearly 3 out of 4 respondents, with Generation X not far behind.
Baby boomers were considerably less likely to report any tech improvement; still, over half said they were more skilled now than they were before the Covid-19 pandemic.

Biggest motivation to learn code: career development

The greatest motivations for people setting out to improve their skills were career development (55%), personal development (46%), and improving job search prospects (33%).

Online e-zines, online channels, and mostly freely available content were the top sources of training material for most (66%) people boosting their skills, while 1 in 3 turned to paid resources.

On average, people spent 7.2 hours per week improving their tech skills, devoting the most time to coding and programming languages, while improving telecommunication proficiency required the least study time.

One other interesting aspect to consider: people who had taken advantage of employee-provided training opportunities were much more inclined to pursue development on their own or through paid resources.
Over one-third of respondents (37%) whose employers didn’t offer technology education opportunities reported wishing their employer would do so.

Overall, close to 1 in 2 respondents believe their new or improved tech skills will be very or extremely beneficial to their career.


Graph databases store information as nodes and data specifying their relationships with other nodes. They are proven architectures for storing data with complex relationships.

Graph database usage has grown during the past decade, despite companies considering other NoSQL and big data technologies.

The global graph database market was estimated at $651 million in 2018 and is forecasted to grow to $3.73 billion by 2026.

Competitors include other big data management technologies such as Hadoop and Spark, which have grown in popularity, skills adoption, and production use cases faster than graph databases.

Graph databases and query languages

Developers think in objects and regularly use hierarchical data representations such as XML and JSON.

For graph databases, although the modeling of nodes and relationships may be relatively easy to comprehend, querying them requires learning new practices and skills.

Developers can query Neo4j graph databases using Resource Description Framework (RDF) and Gremlin, but 90% prefer to use Cypher.

The query language is elegant and efficient but has a learning curve for those used to writing SQL queries. Here is one of the first challenges for organizations moving toward graph databases: SQL is a pervasive skill set, while Cypher and other graph query languages are a new skill to learn.
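To make the node-and-relationship model concrete, here is a toy in-memory graph in Python (the people and relationships are invented for illustration); the comment shows roughly how the same question would read in Cypher:

```python
# A toy property graph: nodes with labels and properties,
# plus directed, typed relationships between them.
nodes = {
    "alice": {"label": "Person", "name": "Alice"},
    "bob":   {"label": "Person", "name": "Bob"},
    "carol": {"label": "Person", "name": "Carol"},
}
edges = [
    ("alice", "KNOWS", "bob"),
    ("bob", "KNOWS", "carol"),
]

def out_neighbors(node_id: str, rel: str) -> list[str]:
    """Follow outgoing relationships of one type -- the core graph primitive."""
    return [dst for src, r, dst in edges if src == node_id and r == rel]

# Roughly the Cypher: MATCH (:Person {name: 'Alice'})-[:KNOWS]->(b) RETURN b.name
friends = [nodes[n]["name"] for n in out_neighbors("alice", "KNOWS")]
print(friends)  # ['Bob']
```

The relationship traversal that takes a join (or several) in SQL is a single pattern match in a graph query — which is exactly where the new-skill learning curve lies.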

Graph databases can be used in flexible hierarchy design

Product catalogs, content management systems, project management applications, ERPs, and CRMs all use hierarchies to categorize and tag information. Graph databases enable arbitrary hierarchies and let developers create different views of the hierarchy for different needs.

To take advantage of flexible hierarchies, it helps to design applications from the ground up with a graph database. The entire application is then designed based on querying the graph and leveraging the nodes, relationships, labels, and properties of the graph.
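A sketch of the "different views over the same nodes" idea, with invented product data: each hierarchy is simply a different relationship type, so a view is nothing more than a traversal filtered by that type:

```python
# One set of product nodes...
products = {"p1": "Trail Boots", "p2": "City Sneakers"}

# ...and two independent hierarchies over them, expressed as typed edges.
edges = [
    ("outdoor", "CONTAINS", "p1"),    # merchandising view
    ("casual", "CONTAINS", "p2"),
    ("vendor_a", "SUPPLIES", "p1"),   # supply-chain view
    ("vendor_a", "SUPPLIES", "p2"),
]

def view(root: str, rel: str) -> list[str]:
    """All products under `root` in the hierarchy defined by `rel`."""
    return sorted(products[dst] for src, r, dst in edges
                  if src == root and r == rel)

print(view("outdoor", "CONTAINS"))   # ['Trail Boots']
print(view("vendor_a", "SUPPLIES"))  # ['City Sneakers', 'Trail Boots']
```

Adding a third categorization later means adding edges with a new relationship type — no schema migration, which is the flexibility the paragraph above describes.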

Graph databases and cloud deployment – reduced operational complexity

Deploying data management solutions into a data center requires planning infrastructure and operations, meeting security requirements, reviewing performance considerations to size servers, storage, and networks, and replicating systems for redundancy and disaster recovery.

Organizations experimenting with graph databases now have several cloud options. Engineers can deploy Neo4j to GCP, AWS, Azure, or leverage Neo4j’s Aura, a database as a service.

The public cloud vendors have graph database capabilities, including AWS Neptune, the Gremlin API in Azure's Cosmos DB, the open source JanusGraph on GCP, and the graph features in Oracle's Cloud Database Services.

Year over year, it's interesting to study the evolution of in-demand skills in the software industry, and this year certainly makes for an interesting time frame to study.

According to an article published by www.developer-tech.com, which cites a study by career website Hired, the most notable surge in 2020 – where demand for software engineers in the US is concerned – was for AR/VR talent, with a whopping 1,400% increase compared to 2019.
The explanation is simple: per IDC predictions, the AR/VR market, and the consequent need for skilled software engineers, accounted for about 60% of total spending on software solutions in 2018 and, within three years, by the end of 2021, is expected to hit 85%, with the retail, transportation, manufacturing, and public sectors at the top of the chart of those needing these engineers' services.

AR/VR and why it’s so in demand in the United States

On a geographical basis, North America is the region that invested most heavily in the AR/VR market over the past 12 months and is forecast to see the fastest growth over the next 5 years. Moreover, salaries for AR/VR software engineer jobs range from $135k to $150k in major US tech hubs. Monetary incentives aside, developers are also eager to start toying with the emerging technology, with 46% of software engineers ranking AR/VR among the top 3 technologies they'd like to learn in 2020.

Gaming and computer vision engineers come in 2nd and 3rd

After AR/VR, the second-biggest growth in in-demand talent was seen for 'gaming engineer' and 'computer vision engineer' roles, both witnessing 146% growth over 2019.
Demand for 'search engineers' increased 137%, while demand for 'machine learning engineers' increased 89%. Blockchain talent is still in demand, just shy of 2019 levels, with a 9% increase.

Most in demand programming languages

As per the study, some of the most in-demand programming languages are Go, Scala, Ruby, TypeScript, Kotlin, Objective-C, JavaScript, Swift, PHP, Java, HTML, and Python.
Some of the less in-demand languages are, unsurprisingly, among developers' favourites: Python, JavaScript, and Java top developers' preference lists, yet trail several other languages in demand, including three of developers' least favourites (Ruby, PHP, and Objective-C).