A realistic workload model is the core of a solid performance test. Generating load that does not reflect reality gives you equally unrealistic feedback about your system's behavior. That's why analyzing production traffic and the application itself is the most important task in shaping a performance testing strategy. To help one of my clients build a realistic strategy, I built a program that extracts how its microservices are used in production. The objective was to surface the 20% of calls that represent 80% of the production load. Through this extraction, the program guides the project in building a continuous testing methodology for the client's main microservices. One of the biggest limitations is the lack of information stored in HTTP logs or captured by APM products; too much is missing to automatically generate the load testing scripts. Still, with a tool like my prototype, you have everything you need to build test scripts, test definitions, and test objectives.
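The 80/20 extraction described above can be sketched as a simple Pareto analysis over access logs. This is a minimal illustration, not the author's actual tool: the log layout (a common access-log format with the request in quotes, e.g. `"GET /api/orders HTTP/1.1"`) and the function name are assumptions.

```python
# Hypothetical sketch: find the smallest set of endpoints that together
# account for ~80% of production traffic, given access-log lines.
# Assumes a common access-log layout where the request line is quoted.
from collections import Counter


def top_endpoints(log_lines, coverage=0.8):
    """Return (method, path) pairs covering `coverage` of all calls."""
    counts = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) > 1:
            request = parts[1].split()  # e.g. ['GET', '/api/orders', 'HTTP/1.1']
            if len(request) >= 2:
                counts[(request[0], request[1])] += 1
    total = sum(counts.values())
    selected, covered = [], 0
    # Walk endpoints from most to least frequent until coverage is reached.
    for endpoint, n in counts.most_common():
        selected.append((endpoint, n))
        covered += n
        if covered / total >= coverage:
            break
    return selected
```

The ranked output gives you the candidate calls to script first; everything below the coverage threshold can wait for later test iterations.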
The DevOps culture necessary to effectively use open source, cloud native technologies has fundamentally changed software and team processes. It is expanding how we work and think. For some, this presents an exciting opportunity. Others approach it with more trepidation. Startups, in general, are on board. They don’t have entrenched technology that needs to be maintained and upgraded. They are also able to hire people whose skill sets are a good fit with newer technologies. For enterprises, it’s a bit tougher. They have massive investments in workhorse technologies and platforms such as Java and WebLogic. But they also have IT teams with deep heritage and operational knowledge in building, deploying, running and maintaining applications over decades. Understandably, their developers don’t necessarily want to become experts in infrastructure and in projects such as Kubernetes. They may not see the value in having novices muck around with it. As long as the developer and operations teams remain separate, they each have a measure of power and a measure of comfort.
Machines and devices are everywhere, connected—and multiplying. These are the “things” of the Internet of Things, and today there are nearly three devices attached to the internet for every human on the planet. By 2025 that ratio will soar to 10 to 1. For consumers, that means their thermostats and refrigerators can be connected to real-time, sophisticated analytics engines that automatically adjust them to be more efficient and save more money. But what does that mean for businesses? Well, just as it’s doing for consumers, IoT is helping businesses streamline operations, save money and time with real-time, actionable intelligence, and prevent problems with predictive analytics. But there’s a dark side to IoT. Frankly, it’s the concerning underbelly that exists in all connected technologies: lax security. We already see massive DDoS attacks driven by IoT devices, and experts concede that is just the tip of the iceberg. In all, analysts project the global IoT market to exceed the $1 trillion mark in 2022, and today companies in every industry rely on IoT as part of their business strategy.
“We find ourselves at a critical stage. For me, the crucial, crucial change the law brought was around accountability. Accountability encapsulates everything the GDPR is about.” Denham said the GDPR enshrines in law an onus on companies to understand the risks that they create for others with their data processing, and to mitigate those risks. It also formalises the move away from box ticking to seeing data protection as something that is part of the cultural and business fabric of an organisation, and it reflects that people increasingly demand to be shown how their data is being used, and how it is being looked after, she added. However, she said this change is not yet evident in practice. “I don’t see it in the breaches reported to the ICO. I don’t see it in the cases we investigate, or in the audits we carry out,” she said. Denham said this is both a problem and an opportunity. “It’s a problem because accountability is a legal requirement, it’s not optional. But it is an opportunity because accountability allows data protection professionals to have a real impact on that cultural fabric of your organisation,” she said.
Many companies use design thinking to improve the customer experience. After finding it useful in the CX realm, businesses now try to apply similar approaches to improve employee engagement. Electronic Arts (EA) Inc. found this approach helpful to improve the engagement of call center employees who typically experience the brunt of customer complaints. "No one ever calls us when something good is happening," said Abby Eaton, manager of employee experience at EA. "They are calling because something has gone wrong and they are already frustrated, so the complexity of the advisers' jobs is challenging." Design thinking can help improve the design of a space, physical products and applications and has been a trend since the 1990s. Now, companies are applying this same approach to improve applications in the workplace -- cutting costs and improving worker productivity, said Parminder Jassal, Work and Learn Futures group director at the Institute for the Future, a think tank in Palo Alto, Calif.
Transparency works well for Bitcoin's blockchain, but it might not suit, say, a large company's supply-chain system, where it doesn't want suppliers and contractors to see each other's transactions. Immutability is a double-edged sword: if a fraudulent or erroneous transaction is recorded on the blockchain, there's no easy way to amend or delete it. The only way to fix it is to go back in time on the blockchain and start again at that point to invalidate the transaction, provided everyone in the network agrees to do so. This effectively creates a new version of the software, and thus a new cryptocurrency that's not compatible with the older one. Not being able to delete or amend information could also make blockchain data stores incompatible with tightening global privacy rules that give individuals the right to "be forgotten" and have their details deleted if they so wish. Muir says we don't know the answer to that yet. Likewise, accessing blockchain data requires the use of a digital cryptographic key that has to be kept secure.
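The immutability described above comes from hash chaining: each block records the hash of its predecessor, so editing any block breaks the link to every block after it. The sketch below is illustrative only; real chains add consensus, signatures, and Merkle trees on top of this basic structure.

```python
# Minimal hash-chain sketch showing why recorded transactions are hard
# to amend: changing one block invalidates the rest of the chain.
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor


def block_hash(block):
    # Deterministic hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def make_chain(transactions):
    chain, prev = [], GENESIS
    for tx in transactions:
        block = {"tx": tx, "prev": prev}
        chain.append(block)
        prev = block_hash(block)
    return chain


def is_valid(chain):
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev:
            return False  # link to predecessor is broken
        prev = block_hash(block)
    return True
```

Amending a transaction in the middle of the chain changes that block's hash, so the next block's stored `prev` no longer matches, and validation fails — which is why "fixing" a bad record requires rebuilding the chain from that point with network-wide agreement.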
The delivery pipeline in DevOps consists of feedback loops that allow you to inspect, reflect, and decide if you are still doing the right things in the right way. As you get better and smarter and learn more, you'll see ways to improve, to optimize, to cut out steps that are not providing value. Often those improvements require some investment and extra effort to implement. If you don't take the time to fix the pipeline when you see the ways to improve, you are just investing in a wasteful process. You are doing the process for the sake of the process, not to add the maximum value to what you are delivering. The sooner you improve, the sooner you reap the benefits of that improvement. It isn't just a matter of reviewing the process twice a year or every quarter. Continuous improvement is a cultural shift that says everyone should get better all the time. Every time you go through the process, you get a little better and learn a little more.
What many people failed to realise is that Hadoop itself is more of a framework than a big data solution. Plus, with its broad ecosystem of complementary open source projects, Hadoop was too complicated for most businesses. It needed a level of configuration and programming knowledge that could only be supplied by a dedicated team to fully leverage it. Even with a dedicated internal team, it sometimes needed something extra. For instance, one of Exasol’s clients, King Digital Entertainment, makers of the Candy Crush series of games, couldn’t get the most out of Hadoop. It wasn’t quick enough for the interactive BI queries that the internal data science team demanded. They needed an accelerator on a multi-petabyte Hadoop cluster that allowed their data scientists to interactively query the data. The world of data warehousing has changed in recent years, and Hadoop has had to adapt. The IT infrastructure of 2009-2013, when Hadoop was at the peak of its fame, differs greatly from the IT infrastructure of today.
Quote for the day:
"Leaders need to strike a balance between action and patience." -- Doug Smith