
Daily Tech Digest - December 29, 2021

Hackers Are Getting Better and Better At Defeating Your 2FA Security

Two-factor authentication, or 2FA, has been sold to web users as one of the most important and trustworthy tools for securing your digital life. You probably know how it works: By supplying an account with not just your password but also a secondary piece of information (typically an automated code texted to your phone or device of choice), companies can verify that whoever signs into your account is definitely you and not just some goon who’s managed to get their hands on your personal information. However, according to new research, said goons have unfortunately found a number of effective ways to get around your 2FA protections—and they’re using these methods more and more. The study, put out by academic researchers with Stony Brook University and cybersecurity firm Palo Alto Networks, shows the recent discovery of phishing toolkits that are being used to sneak past authentication protections. Toolkits are malicious software programs that are designed to aid in cyberattacks. 
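The automated codes mentioned above are typically time-based one-time passwords (TOTP, RFC 6238), which is why phishing kits must relay a stolen code within seconds before it expires. As a rough illustration of how such a code is derived, here is a minimal standard-library sketch (the secret and parameters are illustrative, not any vendor's implementation):

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password in the RFC 6238 style."""
    counter = timestamp // step                      # time window index
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Any timestamp within the same 30-second window yields the same code, which is exactly the window a real-time phishing relay exploits.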


5 HR Trends You Will Be Paying Attention To in 2022

Digitisation has been a hot topic in the last few years. We are now seeing rapid adoption of HR technology right from “Hire to Retire”. Automation in HR has become a reality that has touched nearly all aspects of human resource management. As we move forward, the expectation will be to have all HR and employee services in the palm of an employee's hand. Digitisation will enable organisations to overcome HR challenges and improve business efficiency and workplace productivity. While most things will move to digital, many elements in HR (of being humane) cannot be replaced. Teams will become more complex, and employees will therefore need more human attention and empathy. AI will continue to help organisations in a lot of smart ways, but when it comes to physical and mental well-being concerns, HR will have to continue to tackle employee issues empathetically. Retaining talent was always one of the key priorities, but now the focus will be more on keeping employees happy and willing to stick around. The Covid outbreak has forced the workforce to reprioritise their careers and choose employers and jobs that fit their new priorities.


The tech trends of next year: Innovations that will reshape business in the future

Automation has been with us for centuries, but its application in the digital age is something entirely new. We're seeing it creep into almost every aspect of our lives, with far-reaching and never-before-seen results. Take, for instance, a sporting event such as the 2021 Formula 1 Heineken Dutch Grand Prix, where fans could order tickets, procure refreshments, secure purchases and get all their questions answered through a chatbot that runs through WhatsApp. With the help of data reporting and features such as suggested answers, internal collaboration and skill-based routing, the Dutch Grand Prix could interact with fans efficiently and effectively. This was made possible by deploying a customer data platform (CDP) to manage event pre-registration data, and using marketing cloud software to introduce attendee reductions due to the pandemic. The technology helped create a seamless experience for both event organisers and fans. While this kind of technological automation has been focused around improving customer experience, it has the potential to impact employee experience (EX) as well.


Importance of Data Discovery in Data Mesh Architecture

At the heart of data mesh is the recognition that how we model, produce, and consume data is decoupled. With decoupled data, the common concern is: if users need to access data or services that they didn’t create, how are they going to find it and learn to use it? It is this part of the data mesh that affects data discovery the most. Data mesh splits up centralized data into data domains and allows users to apply high-quality data product thinking to how data is shared. Data discovery is essentially a capability enabling the data and control plane on the data mesh, which creates a better environment for discovering and tagging data. Companies that already have a data mesh model initially need a data discovery platform to discover and understand their data, which is where discovery starts with data mesh. Then, as data teams start owning their data by adding tags and ownership, data mesh allows these teams to invite other users through democratized access to data while maintaining full governance and control over a source of truth with distributed ownership. This is the main intersection of discovery and its role in data mesh.
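To make the tagging-and-discovery idea concrete, here is a toy Python sketch of a data-product catalog with domain ownership and tag-based discovery. The `DataProduct` and `Catalog` names are hypothetical illustrations, not a real data mesh API:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str
    domain: str                          # the owning data domain in the mesh
    owner: str                           # the team accountable for the product
    tags: set = field(default_factory=set)

class Catalog:
    """Toy discovery index: find data products by tag across domains."""

    def __init__(self):
        self.products = []

    def register(self, product: DataProduct):
        # Domain teams register and tag their own data products.
        self.products.append(product)

    def discover(self, tag: str):
        # Other teams discover data by tag without knowing who produced it.
        return [p.name for p in self.products if tag in p.tags]
```

Ownership stays with the producing domain, while discovery cuts across domains through shared tags.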


22 Habits to Become a Better Data Scientist in 2022

It doesn't matter how skillful you are at coding or creating models; if you can’t understand the business you’re in, you won’t succeed as a data scientist. No one is going to value your work if you don’t help the company you work for reach its goals, no matter how well your model performs. Do some research to learn more about the company you work for and the industry it’s in, and take some time to think about how a data scientist like you can help reach some of the company’s goals. Asking questions of your boss and colleagues helps a lot too; they can help you better understand your role in the company. Every year new technologies arise, tools become obsolete, and code becomes deprecated. This is why you should always keep an eye on the new stuff in the field by reading blogs, research papers, and books. Staying up to date will help you take advantage of the latest advancements in data science. As a result, you will be one step ahead of the rest and in most cases get the best performance possible in a project. Being open to learning how to use new tools and adapting to change is a habit that will help you in your career (which leads us to the next habit).


Why DevSecOps offers the ‘most transformative’ approach to application security

Clearly, DevSecOps requires a high degree of trust between the developer and security sides of the organization, according to Dooley. In part, that’s because DevSecOps is ultimately best delivered through automating security as much as possible during app development. To get to a true DevSecOps program, security teams must start by providing data to developers in the form in which they operate—which for many DevOps teams is through a Jira ticket, Dooley said. “Show up in the packaging and format that they’re used to, and supply them with all the information that they need to just treat [security issues] like a bug or like a feature,” he said. Thus, the first level on the journey to DevSecOps can involve supplying developers with a secure code sample that fixes a certain issue in the code, Dooley said. But this secure code still needs to be implemented manually. At the next level, companies can enable semi-automated remediation, he said. This can involve automatically disabling issues that are creating a security exposure. With this approach, a human still has to sign off on the final build.
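As a sketch of "showing up in the packaging and format developers are used to", the function below translates a raw scanner finding into a bug-tracker-style payload. All field names here are invented for illustration; they are not Jira's actual API schema:

```python
def finding_to_ticket(finding: dict) -> dict:
    """Convert a hypothetical scanner finding into a developer-facing
    ticket payload, so security issues read like any other bug."""
    severity_to_priority = {"critical": "P1", "high": "P2"}
    return {
        "summary": f"[security] {finding['rule']} in {finding['file']}",
        "priority": severity_to_priority.get(finding["severity"], "P3"),
        "description": finding.get("remediation", "See scanner output."),
    }
```

The point of the translation layer is that developers never have to learn the scanner's native report format.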


5 movies every HR professional must watch in 2022

The 2009 Oscar-winning sports drama starring Sandra Bullock was released at a time when racial bias and discrimination were quite prevalent in the world. This heart-rending story is about a seventeen-year-old African American foster child who is taken into care by a rich Southern family whose matriarch found something precious inside him that no one else could: a striving will for excellence. What HR professionals can learn from the movie: this movie is a classic example of the ‘nature vs nurture’ discussion. Sandra Bullock’s Oscar-winning character practices what she preaches, i.e. equality, care, and sensitivity beyond societal prejudice and bias, utilising them to unleash the potential of the person. Earlier, HR leaders may have tended to overlook the smallest details of human personalities, but in the new world of work these nuances have become integral to managing your workforce and motivating them forward. It is important to understand that every contributor brings something to the table and, just like Leigh Tuohy from the movie, it is really important for HR leaders to recognise the differences in their employees’ experiences and nurture them to their strengths.


When employees leave, is your data walking out the door?

Everything from customer lists to intellectual property is in danger of being exfiltrated and for a variety of reasons. It may be that employees feel entitled to the intellectual property they helped create. It may be that they want to jump start their work with a new company and that taking data, in their minds, is no crime. It may be that they’re disgruntled and looking to do harm by selling the data to a competitor. In the most extreme case, the employees in question may even be cybercriminals-for-hire who infiltrate enterprises with the sole purpose of causing harm. Amid all these possibilities, it is no longer reasonable for organizations to risk leaving the protection of their most valuable data up to chance. Organizations with valuable intangible assets must consider performing digital forensic scans as part of their standard operating procedures when offboarding employees, to identify threat actors and the data they’re exfiltrating. Doing so would enable organizations to better manage the risk of insider activity and give them the means to recover critical data and in some instances even seek legal recourse.


CISOs will prioritise secure automation in 2022

External threats are not the only problem. The disruption of the past 18 months has seen many organisations realise that their strategic security posture was behind the curve when the request for remote work became a requirement. For example, research conducted during the summer of 2020 found that the biggest threat noted by security departments was the inability to implement multi-factor authentication to facilitate secure remote worker access, something that should have been in place already. However, with resources stretched to breaking point by the immediate demands of the almost exclusively remote workforce, many organisations had to leave strategic security projects on the back burner. Now, as they urgently try to catch up, security teams are also facing an industry-wide shortage of talent. The latest data from the 2021 (ISC)2 Cybersecurity Workforce Study estimates that an additional 700,000 professionals have joined the cyber security sector, but that the gap between the number of additional professionals needed to adequately defend organisations and the number currently available stands at 2.7 million. 


Best practices for developing governable AI

Due to the ever-evolving landscape of open source libraries, vendors, and approaches to building ML models as well as the shortage of qualified ML engineers, there is a significant lack of industry best practices for creating deployable, maintainable, and governable ML models. When developing ML models with governance in mind, the most important considerations are reperformance, version control, interpretability, and ease of deployment and maintainability. ... Reperformance is the ability to reperform or reproduce a transaction or a model training and obtain identical results. Much has been said about the “reproducibility crisis” in science, and the AI/ML community is not immune from this criticism. Creating machine learning systems that are reproducible is definitely possible, and putting in the effort up front to do so ultimately yields more robust, dependable deployments, fewer headaches for developers, and fewer questions from auditors and other reviewers.
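A minimal illustration of reperformance, assuming nothing beyond the standard library: pinning the random seed in an isolated RNG makes a toy "training" run reproduce identical results on demand, which is the property auditors ask for.

```python
import random

def train_toy_model(seed: int, n: int = 100) -> list:
    """Deterministic stand-in for model training: the same seed
    always yields the same 'weights'."""
    rng = random.Random(seed)            # isolated, seeded RNG: no hidden global state
    return [rng.gauss(0, 1) for _ in range(n)]
```

Real training pipelines add more sources of nondeterminism (data ordering, GPU kernels, library versions), but the principle is the same: capture every input, including the seed, so the run can be re-performed.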



Quote for the day:

"Be willing to make decisions. That's the most important quality in a good leader." -- General George S. Patton, Jr.

Daily Tech Digest - October 20, 2021

The challenges of cloud data management

IT departments are facing a growing challenge to stay abreast of advancements in cloud technologies, provide day-to-day support for increasingly complex systems, and adhere to ever-changing regulatory requirements. In addition, they must ensure the systems they support are able to scale to meet performance objectives and are secured against unauthorized access. ... Much like data security, adhering to regulatory compliance frameworks is a shared responsibility between the customer and cloud provider. Larger cloud vendors will provide third-party auditor compliance reports and attestations for the regulatory frameworks they support. It will be up to each organization to read the documentation and ensure the contents meet specific compliance needs. Most leading platforms will also provide tools to help clients configure identity and access management, secure and monitor their data, and implement audit trails. But the responsibility for ensuring the tools' configuration and usage meet the framework's control objectives rests solely with the customer. ... We know one of IT's core responsibilities is to transform raw data into actionable insights.


Learning to learn: will machines acquire knowledge as naturally as children do?

We create new-to-the-world machines, with sophisticated specifications, that are hugely capable. But to reach their potential, we have to expose them to hundreds of thousands of training examples for every single task. They just don’t ‘get’ things like humans do. One way to get machines to learn more naturally is to help them learn from limited data. We can use generative adversarial networks (GANs) to create new examples from a small core of training data rather than having to capture every situation in the real world. It is ‘adversarial’ because one neural network is pitted against another to generate new synthetic data. Then there’s synthetic data rendering – using gaming engines or computer graphics to render new scenarios. Finally, there are algorithmic techniques such as domain adaptation, which involves transferable knowledge (using data in summer that you collected in winter, for example), or few-shot learning, which makes predictions from a limited number of samples. Taking a different limited-data route is multi-task learning, where commonalities and differences are exploited to solve multiple tasks simultaneously.
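GAN training is well beyond a short example, but the underlying idea of stretching limited data can be sketched with simple jitter-based augmentation. This is a deliberately crude stand-in for GANs or rendered synthetic data, using only the standard library:

```python
import random

def augment(samples: list, n_new: int, noise: float = 0.05, seed: int = 0) -> list:
    """Create synthetic variants of a small dataset by adding bounded
    jitter to randomly chosen base samples."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(samples)                       # pick a real example
        synthetic.append([x + rng.uniform(-noise, noise) for x in base])
    return synthetic
```

Each synthetic sample stays within the noise bound of some real sample, so the augmented set enlarges the training data without leaving the neighbourhood of what was actually observed.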


IT hiring: 5 signs of a continuous learner

Whatever you call it, it’s an important attribute to consider when hiring or grooming the most capable IT professionals today. A continuous learner can offer more bang for the buck in one of the strongest job markets in recent years. “We have found that many companies, while their job descriptions state they are looking for a certain number of years of experience in a laundry list of technologies, are being more flexible and hiring candidates that may be more junior, or those who lack a few main technologies,” Spathis says, noting that many organizations are willing to take the risk on more junior or less specifically experienced candidates who are eager, trainable, and able to learn new skills. There’s definite agreement on the demand for continuous learners in the IT function today. “To thrive during these changing times, it’s imperative that IT organizations continuously grow and change with changing needs,” says Dr. Sunni Lampasso, executive coach and founder of Shaping Success. “As a result, IT organizations that employ continuous learners are better equipped to navigate the changing work world and meet changing demands.”


Ethical and Productivity Implications of Intelligent Code Creation

AI technology is changing the working process of software engineers and test engineers, promoting productivity, quality, and speed. Businesses use AI algorithms to improve everything from project planning and estimation to quality testing and the user experience. Application development continues to evolve in its sophistication, while the business increasingly expects solutions to be delivered faster than ever. Most of the time, organizations have to deal with challenging problems like errors, defects, and other complexities while developing complex software. Development and testing teams no longer have the luxury of time they enjoyed when monthly product launches were the gold standard. Instead, today’s enterprises demand weekly releases and updates that trickle in even more frequently. This is where self-coded applications come into play. Applications that generate code themselves help programmers accomplish a task in less time and increase their programming ability. Artificial intelligence is the result of coding, but now coding is also the result of artificial intelligence. It is helping almost every sector of the business, and coders, to enhance the software development process.


How To Transition From Data Analyst To Data Scientist

Before even thinking about making the transition, one has to be very clear about what a data scientist does, and introspect on the skills one has now and what has to be done to fill the gaps needed to make the transition. A data scientist not only handles data but provides much deeper insights from it. Other than gaining the right mathematical and statistical know-how, training yourself to look at business problems with the mindset of a data scientist, and not just that of a data analyst, will be of great help. This means that while looking into a problem, developing your critical thinking and analytical skills, getting deep into the problem to be solved at hand, and coming up with the right way to approach the solution will train you for the future. A data analyst might not have great coding skills, but a data scientist surely has to know coding well. Data scientists use tools like R and Python to derive interpretations from the massive data sets they handle. As a data analyst, if you are not great at coding or don’t know the common tools, it would be wise to start taking basic courses on them and then use them in real-world applications.
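To make the analyst-to-scientist shift concrete: an analyst might report an average, while a data scientist fits a model and predicts. A minimal least-squares fit in plain Python, for illustration only:

```python
def mean(xs: list) -> float:
    return sum(xs) / len(xs)

def fit_line(xs: list, ys: list) -> tuple:
    """Ordinary least-squares slope and intercept: one small step from
    reporting averages to making predictions."""
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx
```

With the fitted `slope` and `intercept`, a prediction for a new `x` is simply `slope * x + intercept`, something the descriptive summary alone cannot give you.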


Application Security Manager: Developer or Security Officer?

First, an ASM has to understand what a supervised project is about. This is especially important for agile development, where, unlike the waterfall model, you don’t have two months to perform a pre-release review. An ASM’s job is to make sure that the requirements set at the design stage are correctly interpreted by the team, properly adopted in the architecture, are generally feasible, and will not cause serious technical problems in the future. Typically, the ASM is the main person who reads, interprets, and assesses automated reports and third-party audits. ... Second, an ASM should know about various domains, including development processes and information security principles. Hard skills are also important because it’s very difficult to assess the results provided by narrow specialists and automated tools if you can’t read the code and don’t understand how vulnerabilities can be exploited. When a code analysis or penetration test reveals a critical vulnerability, it’s quite common for developers (who are also committed to creating a secure system) to not accept the results and claim that auditors failed to exploit the vulnerability.


Top Open Source Security Tools

WhiteSource detects all vulnerable open source components, including transitive dependencies, in more than 200 programming languages. It matches reported vulnerabilities to the open source libraries in code, reducing the number of alerts. With more than 270 million open source components and 13 billion files, its vulnerability database continuously monitors multiple resources and a wide range of security advisories and issue trackers. WhiteSource is also a CVE Numbering Authority, which allows it to responsibly disclose new security vulnerabilities found through its own research. ... Black Duck software composition analysis (SCA) by Synopsys helps teams manage the security, quality, and license compliance risks that come from the use of open source and third-party code in applications and containers. It integrates with build tools like Maven and Gradle to track declared and transitive open source dependencies in applications’ built-in languages like Java and C#. It maps string, file, and directory information to the Black Duck KnowledgeBase to identify open source and third-party components in applications built using languages like C and C++. 


Why You Don't Need to Be a Business Insider in Order to Succeed

No matter what anyone tells you, it’s not a zero-sum game. There is abundance out there for everyone. Of course, money becomes concentrated with various people, but wealth-mobility is very real and happening all the time. We hear people talk about the 1% all the time (often in an effort to paint them as a monolithic, evil, controlling class). What they fail to recognize is that people are constantly moving in and out of the 1% all the time. Some of this is down to inherited wealth, and some is down to hard work — but it’s happening all the time. What really lies at the heart of this is fear. We abdicate our power to an imagined ruling class because we’re afraid of the unknown. And before you think this is about blaming you: It is our subconscious being unwilling to take the risk that stops us. You have a built-in stowaway in your mind who wants to maintain a status quo. Therefore, any new growth opportunities — while intellectually exciting and appealing — will be met with emotional resistance at some point. I’m sure you’ve had this happen to you before: You get a new career-changing offer, you do a little dance and head off to celebrate. 


Why a new approach to eDiscovery is needed to decrease corporate risk

For businesses, the combination of these factors has led to a big increase in corporate risk, putting significant pressure on any corporate investigations that need to be conducted and making the eDiscovery process much more difficult. Not only are employees and their devices a lot less accessible than they used to be, but the growing use of personal devices, many of which lack proper security protocols or use unsecured networks, leaves company data much more vulnerable to theft or loss. If that wasn’t enough, heightened privacy concerns and the likelihood that personal data will be unintentionally swept up in any eDiscovery processes can make employees even more reluctant to hand over their devices to investigators if/when needed (if investigators can even get hold of them). As a result, many companies are suddenly finding themselves between a rock and a hard place. How can they operate a more employee friendly hybrid working model while still maintaining the ability to carry out corporate investigations and eDiscovery in the event it’s required?


Three key areas CIOs should focus on to generate value

CIOs and IT executives should focus on three types of partner connections: one-to-one, one-to-many and many-to-many. A one-to-one connection can be taken to the next level and become a generative partnership where the enterprise and technology partner work together to create and build a solution that doesn’t currently exist. The resulting assets are co-owned and produce benefits and revenue for both partners. Generative partnerships are becoming more common. In fact, Gartner forecasts that generative-based IT spending will grow at 31% over the next five years. Beyond one-to-one connections is the formation of ecosystems of multiple partners. One-to-many partnerships work best when a single enterprise needs to focus many players on jointly solving a single problem – such as a city bringing together public and private entities to serve the citizen. Many-to-many partnerships are created when a platform brings many different enterprises’ products and services together, to be offered to many different customers. Often called platform business models, these marketplaces and app/API stores enable the many to help the many at ecosystem scale.



Quote for the day:

"Leaders are people who believe so passionately that they can seduce other people into sharing their dream." -- Warren G. Bennis

Daily Tech Digest - May 08, 2021

Gartner says composable data and analytics key to digital transformation

Gartner said business-facing data initiatives were key drivers of digital transformation in the enterprise. Research showed that 72% of data and analytics leaders are leading, or are heavily involved in, their organizations’ digital transformation efforts. These data leaders now confront emerging trends on various fronts.

XOps: The evolution of DataOps to support AI and machine learning workflows is now XOps. The X could also stand for MLOps, ModelOps, and even FinOps. This promises to bring flexibility and agility in coordinating the infrastructure, data sources, and business needs in new ways.

Engineering decision intelligence: Decision support is not new, but decision-making is now more complex. Engineering decision intelligence frames a wide range of techniques, from conventional analytics to AI, to align and tune decision models and make them more repeatable, understandable, and traceable.

Data and analytics as a core business function: With the chaos of the pandemic and other disruptors, data and analytics are becoming more central to an organization’s success. Companies will have to prioritize data and analytics as core functions rather than as a secondary activity handled by IT.


Everything you need to know to land a job in data science

What does it take to get hired? Organizations are looking for job candidates with a bachelor's or master's degree in computer science, as well as experience with data modeling tools, XML, Python, Java, SQL, AWS and Hadoop. Many data scientist job descriptions also mention the ability to work with a distributed and fast-moving team. Interpreting data for colleagues in business units is increasingly important as well. Ryan Boyd, head of developer relations at Databricks, said that data science will soon be a commonplace skill outside engineering and IT departments as data becomes increasingly fundamental to businesses. "To stay competitive, data scientists need to be equally as obsessed with data storytelling as they are with the minutiae of data software and programs," said Boyd. "Tomorrow's best data scientists will be expected to translate their know-how into actionable insights and compelling stories for different stakeholders across the business, from C-suite executives to product managers." Whether you are looking for your first data science job or figuring out your next career move in the field, the following advice from hiring managers and data science professionals will help you plot a smart and successful course.


Observability and GitOps

The old supervision methods have reached their limits with the new standards of application architecture. Managing highly scalable and portable microservices requires adapting tools to facilitate debugging and diagnosis at all times – in other words, it requires the observability of systems. Monitoring and observability are often confused. Basically, the idea of a monitoring system is to get a state of the system based on a predefined set of metrics to detect a known set of issues. According to the SRE book by Google, a monitoring system needs to answer two simple questions: “What’s broken, and why?” Analyzing an application over the long term makes it possible to profile it in order to better understand its behavior regarding external events and, thus, be proactive in its management. Observability, on the other hand, aims to measure the understanding of a system state based on multiple outputs. This means observability is a system capability, like reliability, scalability, or security, that must be designed and implemented during system design, coding, and testing.
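One concrete form those "multiple outputs" take is structured, context-rich log events that tooling can correlate across services. A minimal sketch, where the field names are arbitrary choices rather than any standard schema:

```python
import json
import time

def emit_event(service: str, level: str, message: str, **context) -> dict:
    """Emit one structured log line; machine-readable context like this
    is what lets observability tooling answer 'what's broken, and why?'."""
    record = {"ts": time.time(), "service": service,
              "level": level, "message": message, **context}
    print(json.dumps(record, sort_keys=True))   # one JSON object per line
    return record
```

Unlike a fixed dashboard metric, an event like `emit_event("checkout", "error", "payment failed", order_id=123)` can be sliced by any attribute after the fact, which is the difference the article draws between monitoring known issues and observing a system's state.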


Defending Against Web Scraping Attacks

Web scraping can easily lead to more significant attacks. At my company, we routinely use Web scraping as one of the initial steps in a red team or phishing engagement. By pulling the metadata from posted documents, we can find employee names, usernames, and deduce username and email formats, which is particularly helpful when the username format would otherwise be difficult to guess. Mix this with scraping a list of current employees from sites like LinkedIn, and an adversary can perform targeted phishing and credential brute-force attacks. ... Scraping document metadata is also useful for detecting internal hostnames and software versions in use at the targeted company. This enables an attacker to customize the attack to exploit vulnerabilities specific to that company, and it is an important part of victim reconnaissance. Adversaries can also use scraping to collect gated information from a website if that information isn't properly protected. Take Facebook's password-reset page: Anyone can find privately listed people through a simple query with a phone number. While a password-reset page may be necessary, does it really need to confirm or, worse, return a user's private information?
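To illustrate how harvested names turn into phishing and brute-force targets, here is a sketch of the username-format guessing described above. The patterns and domain are hypothetical examples, not tied to any real organization:

```python
def guess_usernames(full_name: str, domain: str = "example.com") -> list:
    """Generate candidate email addresses from a scraped employee name,
    the way an adversary might after harvesting document metadata."""
    first, last = full_name.lower().split()
    patterns = [
        first + last,            # janedoe
        f"{first}.{last}",       # jane.doe
        first[0] + last,         # jdoe
        f"{first}_{last}",       # jane_doe
    ]
    return [f"{p}@{domain}" for p in patterns]
```

Once one real address leaks in document metadata, the attacker learns which pattern the company uses and can derive valid addresses for every scraped name.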


From DevOps to MLOPS: Integrate Machine Learning Models using Jenkins and Docker

Continuous integration (CI) and continuous delivery (CD), known together as the CI/CD pipeline, embody a culture with agile operating principles and practices for DevOps teams that allows software development teams to change code more frequently and reliably, and data scientists to continuously test models for accuracy. CI/CD is a way to focus on business requirements such as improved model accuracy, automated deployment steps, or code quality. Continuous integration is a set of practices that drives development teams to continuously implement small changes and check code in to version control repositories. Today, data scientists and IT ops have at their disposal different platforms (on premises, private and public cloud, multi-cloud …) and tools that need to be addressed by an automatic integration and validation mechanism, allowing them to build, package, and test applications with agility. Continuous delivery steps in where continuous integration ends, by automating the delivery of applications to selected platforms.
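In an MLOps pipeline, one common CI check is an accuracy gate: the build fails when a retrained model regresses below a threshold, and only passing builds flow on to continuous delivery. A minimal sketch in Python (the threshold and stage names are illustrative, not Jenkins pipeline syntax):

```python
def ci_gate(accuracy: float, threshold: float = 0.9) -> str:
    """CI step for an ML pipeline: block delivery when a retrained
    model regresses below the accuracy threshold."""
    if accuracy >= threshold:
        return "deploy"        # continuous delivery picks the artifact up from here
    return "fail-build"        # the pipeline stops; no model is shipped
```

In a real Jenkins or Docker-based setup, this check would run inside a pipeline stage after model evaluation, with the threshold kept in versioned configuration alongside the training code.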


Data Discovery for Business Intelligence

Any company that has had a BI tool for more than a year will deal with the dashboard clutter problem. Ad-hoc analyses, quarterly reports, and even core dashboards get outdated or change to a new version over time. The problem is, old dashboards usually don’t get deleted. No one wants to delete a dashboard in the shared folder because someone might be using it. This creates a long tail of clutter and inactive reports that people may poke around in without being sure the data is reliable or relevant. Navigating BI tools becomes its own tribal-knowledge task, and it ends up being best to ask someone to send you a specific link to open. Worse still, someone may be relying on an outdated dashboard for their day-to-day operations. This often happens because dashboard metadata and its freshness aren’t tracked automatically. Connecting dashboard metadata with operational metrics like the last successful report run, last edited time, and top users can give visibility into the health of a dashboard. By comparing usage data with these operational metrics, outdated data models can easily be identified and cleaned out.
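The freshness check described above can be sketched in a few lines: given each dashboard's last successful run and recent usage, flag candidates for cleanup. The field names and thresholds here are hypothetical illustrations:

```python
from datetime import datetime, timedelta

def find_stale(dashboards: list, now: datetime,
               max_age_days: int = 90, min_views: int = 5) -> list:
    """Flag dashboards whose last successful run is old or whose
    recent usage is low, using tracked operational metadata."""
    stale = []
    for d in dashboards:
        too_old = now - d["last_run"] > timedelta(days=max_age_days)
        unused = d["views_30d"] < min_views
        if too_old or unused:
            stale.append(d["name"])
    return stale
```

Fed automatically from the BI tool's metadata, a report like this replaces the "someone might be using it" guesswork with evidence before anything is deleted.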


Big data is the key to everything. Here are four ways to improve how you use it

While most companies want to focus on the exciting bits, it's the infrastructure that matters. "I think it's almost like a bamboo tree; unless your roots are strong, your tree won't shoot up 90 feet. So for me, the focus on roots is super important," he says. When the foundation is right, you can then start to explore some of the interesting elements of data. During the past 12 months, for example, KFC has strengthened its own digital channels in response to the coronavirus pandemic. Traffic to the web app increased significantly through 2020 as click-and-collect and curb-side pick-up became more popular. ... "When the grape is cut from the vineyard, you don't have much time to make the fermentation process because the grape is degrading in the truck. So we have to move fast," he says. With brands such as Casillero del Diablo and Don Melchor, Concha y Toro operates in over 140 countries, making it one of the biggest wine companies in the world. Data is especially important at harvest time, when the company brings trucks with grapes from different parts of Chile to its wineries.


Four Technologies Disrupting Banking

Blockchain, or distributed ledger technology, has the potential to radically change who has control over our personally identifiable information (PII) and make financial institutions — and online transactions — much more trustworthy. Blockchain can help prove a person’s identity, allowing consumers to create a verified, digital identity they can use with any online institution. By leveraging public key cryptography and referencing a person’s verified credentials on a trustworthy, shared log (the distributed ledger), blockchain can help give people control over their digital identity credentials. Consumers could keep their identity credentials safe and use them as cryptographic evidence whenever their bank or another online business needs to verify their identity. They could also revoke access at any time. A blockchain infrastructure across the internet would give consumers a portable identity to use in digital channels and true control over their PII disclosure. This can help stop fraudulent payment transactions. Currently, if a transaction is disputed as fraud, there are few ways for a business to prove it is legitimate, which results in billions of dollars in losses annually due to chargebacks.
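The verification pattern described above can be sketched with a toy shared log. This is a simplified illustration under stated assumptions: a real system would use asymmetric signatures over verified credentials rather than bare hashes, and the function and field names here are hypothetical.

```python
import hashlib
import json

# Toy stand-in for the distributed ledger: a list of credential fingerprints.
ledger = []

def fingerprint(credential: dict) -> str:
    """Deterministic hash of a credential (sort_keys makes it order-independent)."""
    return hashlib.sha256(json.dumps(credential, sort_keys=True).encode()).hexdigest()

def anchor_credential(credential: dict) -> str:
    """An issuer records a tamper-evident fingerprint of a verified credential."""
    digest = fingerprint(credential)
    ledger.append(digest)
    return digest

def verify_credential(credential: dict) -> bool:
    """A bank checks a presented credential against the shared log."""
    return fingerprint(credential) in ledger

cred = {"subject": "alice", "claim": "KYC-verified", "issuer": "BankCo"}
anchor_credential(cred)
print(verify_credential(cred))                            # True
print(verify_credential({**cred, "subject": "mallory"}))  # False
```

Any tampering with the credential changes its hash, so the lookup fails; the consumer controls disclosure by choosing when to present the credential, and revocation would correspond to removing or flagging the entry on the log.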


Email security is a human issue

Humans will inevitably make mistakes when it comes to phishing emails, but it is possible to mitigate these risks by ensuring that cyber defense strategies are front and center in business processes, as well as integrated into company culture. This ensures teams are made aware of potential threats before they run the risk of falling victim to them. IT teams are often expected to take sole responsibility for a company’s cybersecurity strategy, yet it is impossible for these experts to monitor the email activity of every employee. With human error cited as a contributing factor in 95% of breaches, it is important to remember that email security – alongside many other areas of cyber defense – is a human issue, and each member of the team poses a significant risk. While IT professionals should take the lead by distributing relevant information about the latest phishing campaigns targeting their industry, it is also the responsibility of managerial staff to flag IT concerns in their team meetings and integrate cybersecurity issues into regular company updates. These discussions can be started by IT leaders, but cybersecurity must be discussed by each department to ensure phishing emails do not fly under the radar.


Key Metrics to Track and Drive Your Agile DevOps Maturity

Agile software delivery is a complex process that can hide significant inefficiencies and bottlenecks. Fortunately, the process is easily measurable, as the toolsets used across it leave a rich digital footprint – from pre-development through development, integration and deployment, and out into live software management. However, surfacing data from these myriad sources and synthesising meaningful metrics that compare ‘apples with apples’ across complex Agile delivery environments is very tricky. Hence, software delivery metrics were long discussed but little used, until the arrival of Value Stream Management and BI solutions that surface accurate end-to-end software delivery metrics for the first time. ... Cycle Time is an ideal delivery metric for early-stage practitioners. It simply measures the time taken to develop an increment of software. Unlike the more comprehensive Lead Time, Cycle Time is easier to measure because it looks only at the time from picking a ticket off the backlog through coding and testing it, in preparation for integration and deployment to live.
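Computing Cycle Time from ticket history is straightforward once the timestamps are surfaced. A minimal sketch, assuming each ticket records when work started (left the backlog) and when it was coded and tested; the ticket data and field names here are illustrative.

```python
from datetime import datetime
from statistics import median

# Hypothetical ticket history exported from an issue tracker.
tickets = [
    {"id": "T-101", "started": "2021-12-01", "done": "2021-12-04"},
    {"id": "T-102", "started": "2021-12-02", "done": "2021-12-09"},
    {"id": "T-103", "started": "2021-12-06", "done": "2021-12-08"},
]

def cycle_time_days(ticket: dict) -> int:
    """Days from leaving the backlog to coded-and-tested."""
    start = datetime.fromisoformat(ticket["started"])
    done = datetime.fromisoformat(ticket["done"])
    return (done - start).days

times = [cycle_time_days(t) for t in tickets]
print(times)          # [3, 7, 2]
print(median(times))  # 3
```

The median is usually preferred over the mean here, since a handful of long-running tickets would otherwise skew the trend teams are trying to track.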



Quote for the day:

"The litmus test for our success as Leaders is not how many people we are leading, but how many we are transforming into leaders" -- Kayode Fayemi