Daily Tech Digest - January 13, 2023

Poor cloud architecture and operations are killing cloud ROI

If the cloud never had the potential to deliver ROI to the business, nobody would use it. Indeed, some businesses are very successful with cloud, even reshaping their businesses around cloud computing. These companies leverage cloud as a true force multiplier to build innovative solutions, as well as to gain agility and scalability. However, many others cannot find business value in cloud computing. Most disturbing, they fail to find value while spending about the same amount of money as those who succeed. We must therefore conclude that bad decisions are being made. Cloud computing has been relevant for about 15 years, and we understand that what you do and your company culture make you truly successful with cloud computing, not what you spend. So why are we still seeing winners and losers? ... First, bad architectures need to be fixed before they can be operated properly. You can have a disciplined and highly automated operations team and technology stack, but if the solution is poorly designed, the result will be less than stellar, no matter what.


Innovation: Your solution for weathering uncertainty

Innovation has always been essential to long-term value creation and resilience because it creates countercyclical and noncyclical revenue streams. Paradoxically, making big innovation bets may now be safer than investing in incremental changes. Our long-standing research shows that innovation success rests on the mastery of eight essential practices. Five of these practices are particularly important today: resetting the aspiration based on the viability of current businesses, choosing the right portfolio of initiatives, discovering ways to differentiate value propositions and move into adjacencies, evolving business models, and extending efforts to include external partners. ... In times of disruption or deep uncertainty, companies have to carefully balance short-term innovations aimed at cost reductions against potential breakthrough bets. As customers’ demands change, overindexing on small product tweaks that address needs which may be temporary is unlikely to boost long-term performance. However, “renovations” to designs and processes can produce savings that help fund longer-term investments in innovations that may create routes to profitable growth.


The Truth About Cybersecurity Challenges Facing the Healthcare Industry

In general, healthcare IT has accrued technical debt for more than 25 years. Everywhere you look, whether it’s at the doctor’s office, a hospital, or an urgent care facility, you see disparate and often dated IT systems. It’s not as rare as you’d think to see Windows XP–based computers at the check-in desk and throughout the facility. Many of the most common pieces of equipment and attached computer systems run outdated operating systems and unpatched, archaic software, and have little security on them. I promise you it’s not for lack of trying by the IT and cybersecurity teams. So much outdated software exists largely because the vendors that support these systems focus on the healthcare aspect rather than upkeep and security. In other instances, some devices were never intended to be connected to a network, leaving them vulnerable to remote attacks because they were never configured to withstand network-based attackers. Finally, there is certainly some “if it ain’t broke, don’t fix it” mentality. Walking around, you’ll find computer systems under people’s desks that have served a single purpose for a very long time.


Time to Look at the Role of the CISO Differently

It is time to stop searching for non-existent profiles, expecting the CISO to be credible one day in front of the board, the next in front of hackers, the third in front of developers, and so on across the depth and breadth of the enterprise and its supply chain. Those profiles don’t exist anymore, given the cross-cutting complexity cyber security has developed over the past two decades. The role of the CISO has to be one of a leader: structuring, organising, delegating and orchestrating work across their team and across the firm — and across the multiple third parties involved in delivering or supporting the business. In essence, knowing what to do is reasonably well established, and cyber security good practice — at large — still protects from most threats and still ensures a degree of compliance with most regulations. But by focusing excessively on purely technical approaches to cyber security challenges, large organizations have failed to protect themselves effectively and efficiently, in spite of massive investments in that space over the last two decades.


MACH as an Enterprise Architecture strategy

MACH is an acronym for Microservices, API-first, Cloud-native, and Headless. It’s a modern approach to building and deploying software applications that can help organizations be more agile, scalable, and flexible. In a MACH architecture, software applications are built as a collection of independent, self-contained microservices that communicate with each other through APIs (Application Programming Interfaces). The front-end and back-end components are separated, and the entire solution is designed to be deployed in the cloud. ... There are several benefits of using a MACH architecture for building and deploying software applications. Agile development: MACH architectures allow different parts of an application to be developed and deployed independently, which can make it easier to make changes and updates without disrupting the entire system. This can help organizations be more agile and responsive to changing business needs. Scalability: MACH architectures are designed to be deployed in a cloud computing environment, which can provide the scalability and flexibility needed to support rapid growth or spikes in demand.
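As an illustration of the headless, API-first idea, here is a minimal sketch (all names and endpoints are hypothetical, not taken from any particular MACH vendor): a back-end microservice exposes data purely as a JSON API, and any front end, whether web, mobile, or kiosk, consumes it over HTTP.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical "product" microservice: a headless back end that exposes
# its data only through a JSON API, leaving presentation to any front end.
PRODUCTS = {"sku-1": {"name": "Widget", "price": 9.99}}

class ProductAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(PRODUCTS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), ProductAPI)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any front end consumes the same API; swapping the UI never touches the service.
with urlopen(f"http://127.0.0.1:{server.server_port}/products") as resp:
    catalog = json.loads(resp.read())
server.shutdown()
print(catalog["sku-1"]["name"])  # → Widget
```

Because the service knows nothing about its consumers, it can be scaled, replaced, or redeployed independently, which is the agility the excerpt describes.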


Maximizing data value while keeping it secure

Many organizations stumble and fail because they lack complete visibility into all data assets in clouds and beyond. To take visibility to a higher level, it’s vital to have a catalog of all managed and shadow assets, along with their owners, locations, and the security and governance measures enabled for the data. Without a central repository and a single view, there’s no way to know what data exists, how it’s stored, where it’s used and how it’s shared. Essentially, an organization winds up flying blind. Yet the advantages of robust discovery and visibility don’t stop there. With this information, it’s possible to adapt and expand security profiles as needs and conditions change. ... Sharing data in the cloud involves complexity and risk. That’s a given. To maximize the opportunity—including harnessing the full functionality of cloud-native tools—an organization must know who is accessing data and how they are using it. Therefore, a robust identity management framework is crucial. Administrators and others must be able to analyze roles and permission settings in data assets that reside in clouds and across multi-cloud frameworks.
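The catalog described above can be sketched in a few lines. The fields and asset names below are hypothetical, but they show how a single consolidated view makes shadow or unprotected assets immediately queryable:

```python
from dataclasses import dataclass

# Hypothetical catalog entry: every data asset is recorded with its owner,
# location, and the security measures enabled for it.
@dataclass
class DataAsset:
    name: str
    owner: str
    location: str   # e.g. "aws:s3://orders" (illustrative notation)
    encrypted: bool
    shadow: bool    # discovered outside sanctioned channels

catalog = [
    DataAsset("customer_orders", "sales-eng", "aws:s3://orders", True, False),
    DataAsset("legacy_export", "unknown", "azure:blob://tmp", False, True),
]

# With one central view, "flying blind" turns into a simple query:
at_risk = [a.name for a in catalog if a.shadow or not a.encrypted]
print(at_risk)  # → ['legacy_export']
```

In practice this inventory is built by automated discovery tooling rather than by hand, but the principle, one queryable record per asset with owner and security posture, is the same.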


Top automation pitfalls and how to avoid them

Automating a bad process can make things worse, as it can magnify or exacerbate underlying issues, especially if humans are taken out of the loop. In some cases, a process is automated simply because the technology is there, even if automation isn’t required. For example, if a process occurs very rarely, or there’s a great deal of variation in the process, then the cost of setting up the automation, teaching it to handle every use case, and training employees how to use it may be more expensive and time-consuming than the old manual approach. And putting the entire decision into the hands of data scientists, who may be far removed from the actual work, or of end users who might not know how automation works, can easily send a company down a dead end, says James Matcher, intelligent automation leader at Ernst & Young. That recently happened at a company he worked with, a retail store chain with locations around the US. The retailer approached people on the front lines, employees and managers working on the shop floors, for suggestions about manual processes that should be automated.


What’s the role of the CTO in digital transformation?

A CTO needs to take on the role of the ‘bridge builder’ between the strictly technical components of a transformation strategy and how they apply to people and process in the specific context of an organisation. Digital transformation is a team activity; each role needs to bring its full insights and experience to the process for the CTO to manage. The CTO has specific technological insight and therefore needs to be directly involved in helping the entire organisation identify where technical systems are simply obsolete and not fit for purpose. So, as well as being a bridge builder, CTOs naturally lead the charge when dealing with a technology-led approach. They must be able to explain where the value is in the application of technological change in context; too often we see visions that are de-contextualised from the reality on the ground. This kind of technological planning does not allow for realistic strategic planning. With a vision of the ambitious but feasible in sight, it is then the whole leadership team’s task to decide what course to map out and to work together on the digital transformation journey.


How Organizations Should Respond to the CircleCI Security Incident

CircleCI has taken proactive steps to mitigate risk for its customers, but simply revoking secrets from the platform is not enough, according to Jaime Blasco, co-founder and CTO of cybersecurity company Nudge Security. “It’s still important to assume that every connected application and secret has been compromised. Organizations should verify the steps that these vendors have taken and also take steps to rotate secrets within any other connected application,” he explains. Customers can leverage commercially available or open-source tools, aside from the one offered by CircleCI, to discover their secrets. “One option is to use Trufflehog, an open-source tool that scans for secrets across multiple platforms, including CircleCI, GitHub, GitLab, and AWS S3,” says Blasco. CircleCI is assuming responsibility and taking steps to protect its customers, Assaf Morag, lead data analyst at cloud native security company Aqua Security, notes. But it is important for customers to respond proactively to the security incident as well.
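For illustration only (this is not Trufflehog itself), secret scanners typically combine known-prefix patterns with an entropy check to cut down false positives. A toy version might look like the following; the sample key is AWS's own documented example credential:

```python
import math
import re

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character; high values suggest random tokens."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical pattern in the spirit of real secret scanners: match a known
# key prefix, then confirm the candidate looks random enough to be a secret.
AWS_KEY = re.compile(r"AKIA[0-9A-Z]{16}")

def find_secrets(text: str) -> list[str]:
    return [m for m in AWS_KEY.findall(text) if shannon_entropy(m) > 3.0]

config = "aws_access_key_id = AKIAIOSFODNN7EXAMPLE\nregion = us-east-1"
print(find_secrets(config))  # → ['AKIAIOSFODNN7EXAMPLE']
```

Production tools add many detector patterns, credential verification against live APIs, and scanning of git history and CI logs, not just current files, which is why rotating a leaked secret matters even after it is deleted from the repository.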


Artificial intelligence in strategy

Every business probably has some opportunity to use AI more than it does today. The first thing to look at is the availability of data. Do you have performance data that can be organized in a systematic way? Companies that have deep data on their portfolios down to business line, SKU, inventory, and raw ingredients have the biggest opportunities to use machines to gain granular insights that humans could not. Companies whose strategies rely on a few big decisions with limited data would get less from AI. Second, those facing a lot of volatility and vulnerability to external events would benefit less than companies with controlled and systematic portfolios, although they could deploy AI to better predict those external events and identify what they can and cannot control. Third, the velocity of decisions matters. Most companies develop strategies every three to five years, which then become annual budgets. If you think about strategy in that way, the role of AI is relatively limited other than potentially accelerating analyses that are inputs into the strategy.



Quote for the day:

"Effective questioning brings insight, which fuels curiosity, which cultivates wisdom." -- Chip Bell

Daily Tech Digest - January 12, 2023

Agritech forces gain ground across Africa

One of the crucial issues that agriculture in Africa is currently solving, according to Gaddas, is a lack of water. He says that in Senegal, Tunisia and many other countries, companies are working hard on intelligent irrigation, and on how to optimize water resources that are becoming increasingly scarce, especially in the context of climate change and unpredictable rainfall. “Managing water is becoming crucial,” he says. “We’ve met start-ups that use drones, which, through their precision devices, help to collect data that can be used by farmers, such as the levels of nitrogen in the fields and precise mapping of areas with fertiliser deficits, and others that solve plant disease problems by making diagnoses. There are also ERP systems for farm management and to know what is happening in real time—the management of inputs, fertilizers and more.” He also appreciated the digital aquaculture companies that allow for very rational management of aquaculture farms, while praising the impressive diversity of solutions. “The diversity of problems that farmers face in Africa is very wide but creativity is not the weak point of Africans,” he says.


Ushering in an era of pervasive intelligence, powered by 6G

The impact that this new era will have cannot be overstated. It will power economies, drive sector convergence, enable the distributed infrastructure behind Web 3.0 and scale and interconnect metaverses. Put simply, it will transform all aspects of life. But getting there isn’t straightforward, and we need to act now to lay the foundations that are necessary if we are to harness its power. This on its own is not the most straightforward undertaking, as is evidenced by the issues with the 5G+ rollout and adoption. The right infrastructure and business models were not in place, which led to delays and innovative potential left on the table. Let’s learn from past mistakes, course correct and ensure we’re ready for the future of pervasive intelligence. ... Transformation into the pervasive intelligence era will first require the establishment of a high-performance, integrated ecosystem made up of a range of partners from different industries and sectors. This is critical, as pervasive intelligence will only be reached in an environment where data and information can move freely and securely. This, however, cannot happen if companies operate in silos or in isolation.


DeFi Labs Revolutionises Decentralized Finance by Leveraging AI

According to the co-founder of DeFiLabs, “With our AI-powered yield farm, we’re introducing a new level of innovation to the DeFi space. We’re making it possible for users of all levels to earn high returns on their investments, while also minimizing risk. Our goal is to provide our users with the best investment opportunities available in the DeFi space, and our AI-powered yield farm is just the beginning. We’re excited to see how our users will benefit from this new offering.” This launch is also a significant step for the Binance Smart Chain ecosystem, as it showcases the capabilities and the potential for growth of Binance Smart Chain. This yield farm will encourage the usage of the Binance Smart Chain and drive the adoption of DeFi on this network. The yield farm is live and fully operational, and users can start staking their Binance Coin (BNB) or other supported tokens to earn high returns on their investments. The DeFiLabs team is constantly working to add new features, tokens, and investment options to the yield farm, making it even more valuable for users.


The importance of collaboration in maximising cybersecurity

The CISO has a vital role within companies, and one which is currently evolving. Beyond technical knowledge, one of the most important aspects of the CISO’s role in an enterprise is collaboration. Information security and data protection controls permeate all levels and departments of a company, not just tech. As such, it is important to relay technical information succinctly to all relevant directors and parties, ensuring all teams are adequately equipped to manage cyber risks. There is a wide range of cybersecurity services that can be adopted, including perimeter and cloud security, device security, network security, threat hunting, DevSecOps, and web and mobile application security. To make them all function and operate as tightly as possible, you must work with a team of experts to ensure that your company is at the forefront of new advances in cybersecurity. The removal of silos is therefore integral to ensuring companies are prepared and equipped to defend themselves against cyber-attacks.


IT supply issues have organizations shifting from just-in-time to just-in-case buying

One thing more enterprises should be looking for is greater visibility from their suppliers. "A lot of people are realizing that we're living in a more transparent world now," said Genpact's Waite. And integration between companies has increased, with some providers offering more information to their customers. ... With this approach, vendors are selected not just based on technical fit, form, and function but also based on where in the world they source their materials, or how big of a company they are. Supply chain visibility is particularly important for manufacturers. They need to know if the supplies they need are on track, or if alternate sources have to be found in order to avoid production delays. "Our supply chain is built entirely on transparency," says Carl Nothnagel, COO at specialty hardware manufacturer MBX Systems. "With every supplier, we push for that information. Sometimes we don’t get it, and we’re left with projecting, or guessing as best as we can. We have some manufacturers that are very transparent and we can see where it's going to hit every day, and some are a bit of a black hole."


6 Data Governance Principles Corporate Leaders Should Apply in 2023

The success of your data governance plan depends on what your employees do with the data they handle. Therefore, once you’ve created a data governance plan, you should share it with your employees. Successful data governance requires a holistic, organization-wide approach that demands transparency across your organization. You can further demonstrate internal transparency by documenting all data governance decisions and actions. This documentation can help you learn from past mistakes and protect your corporation if you experience a data breach, lawsuit, investigation, or other regulatory action. ... Responsibility and accountability are integral parts of any corporation’s data governance processes. Traditionally, your information technology (IT) department would be responsible for managing your corporation’s data. But now that most—if not all—of your employees deal with data on a daily basis, employees throughout your organization must see themselves as the stewards of your data. So, who is responsible for what data? That is something you will need to decide. 


Study shows attackers can use ChatGPT to significantly enhance phishing and BEC scams

The longer and more complex a phishing message is, the more likely it is that attackers will make grammatical errors or include weird phrasing that careful readers will pick up on and become suspicious of. With messages generated by ChatGPT, this line of defense that relies on user observation is easily defeated, at least as far as the correctness of the text is concerned. Detecting that a message was written by an AI model is not impossible, and researchers are already working on such tools. While these might work with current models and be useful in some scenarios, such as schools detecting AI-generated essays submitted by students, it's hard to see how they can be applied to email filtering, because people are already using such models to write business emails and simplify their work. "The problem is that people will probably use these large language models to write benign content as well," WithSecure Intelligence Researcher Andy Patel tells CSO. ... Attackers can take it much further than writing simple phishing lures. They can generate entire email chains between different people to add credibility to their scam.


Insights on Nordic artificial intelligence strategies

The Nordics are generally early adopters of technology – and AI is no exception. More than 25% of Nordic companies are already investing at least 20% of their research and development budget in AI projects. Moreover, the Nordic countries are planning to get ahead – or at least keep up with other industrial nations. Each of the four countries has at least one top-ranking AI-related educational institution – and private investment in AI has more than doubled in the region since 2021. ... Finnish AI research runs primarily along three different dimensions. The first is to optimise the performance of AI algorithms to head off the problem where computational requirements get too far ahead of what hardware can deliver. As a small country, Finland is particularly sensitive to the increasing costs of computational power – even though it houses what is currently Europe’s most powerful supercomputer, LUMI. The second dimension is trustworthy AI. Ethics and values are important to Finland, as they are in all the other Nordic countries. Research in trustworthy AI aims to overcome the complex ethical challenges inherent to AI.


Structured Data Management for Discovery and Insight

Polanco says the chief data officer, chief compliance officer, and CISO should collaborate on finding an effective structured data management practice that provides a well-governed, fully compliant data architecture that connects data sources for data consumers. “Data must be findable, accessible, interoperable, and re-usable for [data] consumers, while also ensuring compliance with data quality standards and data security and privacy measures,” he adds. Anyone in a managerial position who encounters data will likely have considered best practices for data management already. “While those managers may be responsible for implementing data management resources for their respective teams, the initial solution can come from technology companies that weld together the manual knowledge of what the data needs to look like and the efficiency of a more automated sorting process,” Polanco says. Macosky adds that while the chief data officer position is fairly new across industries, he expects to see the role become more important and vital as organizations prioritize and value data management.


How to Measure the Energy Consumption of Bugs

It is very important to always keep the underlying architecture and the communication with all connected services in mind. Often it may seem at first sight that a bug does not affect energy consumption. This impression can quickly change when the broader context of the feature where it occurs is taken into account. A QA engineer needs to understand the communication between the services, how it is implemented (in collaboration with the developers), when it takes place, where it is initiated, and where the services and features run. In practice, this means that QA engineers who want to measure the energetic impact of their product in more detail must not only understand the customers’ perspective (as usual), but also many implementation details from different perspectives. Where do particular services run? On which infrastructure? Which libraries are used? How can the implementation of the product be modified in order to measure energy consumption? Improving energy consumption is not something that can be activated by just pushing a button.
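Direct energy measurement usually requires platform-specific counters (such as Intel RAPL), but as a rough first approximation a QA engineer can compare the CPU time two code paths consume, since CPU time correlates with energy use. A minimal sketch, where the polling "bug" is a hypothetical example:

```python
import time

# Energy cannot be read portably from Python, but CPU time is a rough,
# hardware-independent proxy: a bug that burns more CPU burns more energy.
# Hypothetical bug: a hot loop that does far more work than needed.
def busy_poll(n: int) -> int:
    total = 0
    for i in range(n):        # the "buggy" hot loop
        total += i * i
    return total

def measure_cpu_seconds(fn, *args) -> float:
    start = time.process_time()   # CPU time, not wall-clock time
    fn(*args)
    return time.process_time() - start

cheap = measure_cpu_seconds(busy_poll, 1_000)
costly = measure_cpu_seconds(busy_poll, 1_000_000)
print(costly > cheap)  # → True
```

A comparison like this can flag a regression in QA; attributing an absolute joule figure to it still requires the infrastructure-level knowledge the article describes, since the same CPU second costs different energy on different hardware.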



Quote for the day:

"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller

Daily Tech Digest - January 11, 2023

WSL stands for writing as a second language. ... Whatever the intention, WSL leads to an overall tone that adds distance between the writer and the reader. And that is precisely the opposite of what is needed now from leaders. If there are fewer opportunities to hear leaders speak in person because so many of us are working from home, then we need to “hear” them speak in their emails. A more conversational writing tone shortens the distance between author and audience. It feels more real, which is what everyone craves at a time when we are living more of our lives online. To guard against WSL, just apply this simple test when reviewing what you’ve written: Does this sound like me? Would I talk like this if I were speaking face-to-face with a colleague? Reading aloud is a good way to check for the WSL problem (especially if, as a leader, someone else is writing the words for you). ... “Expert-itis” happens when people get too close to their subject. They assume everyone else knows as much as they do, so they focus on the nuances of a particular topic or insight without explaining the context.


Attackers Are Already Exploiting ChatGPT to Write Malicious Code

Sergey Shykevich reiterates that with ChatGPT, a malicious actor needs no coding experience to write malware: "You should just know what functionality the malware — or any program — should have. ChatGPT will write the code for you that will execute the required functionality." Thus, "the short-term concern is definitely about ChatGPT allowing low-skilled cybercriminals to develop malware," Shykevich says. "In the longer term, I assume that also more sophisticated cybercriminals will adopt ChatGPT to improve the efficiency of their activity, or to address different gaps they may have." From an attacker’s perspective, code-generating AI systems allow malicious actors to easily bridge any skills gap they might have by serving as a sort of translator between languages, added Brad Hong, customer success manager at Horizon3ai. Such tools provide an on-demand means of creating templates of code relevant to an attacker's objectives and cut down on the need for them to search through developer sites such as Stack Overflow and Git, Hong said in an emailed statement to Dark Reading.


Cybersecurity staff are struggling. Here's how to support them better

Cybersecurity professionals are at breaking point, with many fearing they will soon lose their jobs because of a cyberattack and others struggling to cope with the growing strain. Unless businesses act soon, an ever-growing skills gap might become an unbridgeable chasm. ... "Cyber used to be very much off in a darkened room," she says. "And don't get me wrong, there's loads of stuff relating to IT security that people in security still have to do. But you need to be thinking about cyber at the heart of every business process and everything that you do within an organization." And cyber isn't a one-way street -- as well as ensuring the people in security feel part of the broader enterprise, Heneghan says line-of-business professionals must also learn about cyber concerns themselves. Success requires a joined-up approach, where business and security come together and recognize how information integrity isn't just one team's -- or even one person's -- responsibility. "It's about building the fundamental foundation," she says. "It's not acceptable for anyone in an organization not to understand the exposure and the risks around security anymore."


FTC Is Escalating Scrutiny of Dark Patterns, Children’s Privacy

The FTC has publicly identified dark patterns as an enforcement priority. In September 2022, the FTC released a report summarizing concerns that companies are increasingly using sophisticated design practices, known as dark patterns, to trick or manipulate consumers into buying products or services or provide their personal data. The report reflects the FTC’s findings that dark patterns are used in a variety of industries and contexts, including e-commerce, cookie consent banners, children’s apps, and subscription sales. Unlike neutral interfaces, dark patterns often take advantage of consumers’ cognitive biases to steer their conduct or delay access to information needed to make fully informed decisions. The FTC’s research noted that dark patterns are highly effective at influencing consumer behavior. Dark patterns include disguising ads to look like independent content, making it difficult for consumers to cancel subscriptions or charges, burying key terms or junk fees, and tricking consumers into sharing their data. Because dark patterns are covert or otherwise deceptive, many consumers don’t realize they are being manipulated or misled.


8 top priorities for CIOs in 2023

Over the past decade, enterprises have rapidly added powerful technology and cloud-based services to their portfolios. At the same time, they have been much less likely to retire the legacy systems these new tools were meant to replace, creating a complex web of redundant applications and systems, warns VMware CIO Jason Conyard. There’s an industry-wide push to reduce technical and data debt and reallocate those resources toward building the future, Conyard says. “CIOs will be looking to rationalize their technology estate to reduce unnecessary cost and maintenance, and to minimize their security attack surface and privacy exposure.” ... There must be open, transparent, and collaborative working sessions to create alignment on how technology capabilities can be deployed to meet enterprise goals, states Bill Cassidy, CIO at New York Life Insurance. “All participants need to demonstrate strong communication skills, including effective listening, to properly weigh the pros, cons, and tradeoffs of one path of execution versus another,” he adds. ... Organizations that can successfully act on their data insights will thrive, says Dan Krantz, CIO of electronics test and measurement equipment manufacturer Keysight Technologies. 


Learning From Other People’s Mistakes

One prerequisite to this consolidation of wisdom is the need for information sharing. Information about what works and what does not work is needed to enact controls in an environment that help prevent certain events from happening twice. This can be accomplished in several ways. Using organizations such as ISACA® to stay connected to peers working at other enterprises helps professionals converse about relevant topics. But information sharing goes beyond merely discussing what you are working on and how you are solving control problems. There is also a need to discuss what went wrong. This means sharing information about what failed and why. This is hard for several reasons, not the least of which is that it is embarrassing to admit to failure. However, there can also be legal impacts of admitting that something went wrong and that as a result services, people’s data, or even their lives were endangered. ... In short, not all cyberincidents can be attributed to sophisticated nation-state hackers leveraging advanced persistent threats (APTs), phrases such as “we are taking it seriously” notwithstanding.


Developer experience will take center stage in 2023

In order for software companies to win and retain top developer talent, they must be able to provide a great developer experience. To do that, tech leaders must prioritize minimizing toil and frustration in the software development process. Software development is a highly creative process, but it is often riddled with bottlenecks and inefficiencies that disrupt creative flow. By minimizing bottlenecks like idle time waiting for build and test feedback cycles to complete and inefficient troubleshooting, software development teams will improve productivity while increasing developer happiness. Especially given the uncertain economic outlook, now is the time for companies to focus on solidifying their software development team and upgrading their talent pool. As a result, there will be a greater emphasis on tools that boost productivity so developers can spend more time innovating and creating useful code. This is the best way to attract and retain top talent. When you ask many software development leaders what their average feedback cycle time is, they usually don’t have an answer.


What Are the Advantages of Quantum Computing?

At their core, quantum computers manipulate subatomic particles, making them ideal for atomic- and molecular-scale research and development. “It can help us solve physics problems where quantum mechanics and the interrelation of materials or properties are important,” Mark Potter, SVP and CTO of Hewlett Packard Enterprise and director of Hewlett Packard Labs, explained in an interview with ITPro in late 2019. “At an atomic level, quantum computing simulates nature and therefore could help us find new materials or identify new chemical compounds for drug discovery.” Quantum technology is also having an outsized impact on logistics management and route planning. For example, grocery chain Save-On-Foods is using quantum computing to optimize its logistics to become more efficient, save money, and bring fresh food to its customers. Specifically, the chain was able to reduce the computation time of an optimization task from 25 hours to just 2 minutes. Another major area of interest is quantum cryptography, which, depending on who you ask, is either a major advantage or a cause for concern. 



CISOs Mark Data Proliferation as Growing Security Problem

Claude Mandy, chief evangelist of data security at Symmetry Systems, says data sprawl is a headache for security teams because they have historically designed their security to protect the systems and networks that data is stored on or transmitted over, but not the data itself. “As data proliferates outside of these secured environments, they have realized their security is no longer adequate,” he says. “This is particularly concerning when the traditional perimeter that provided some comfort has all but disappeared as organizations have moved to the cloud.” ... In the new era of data security, CISOs must be able to learn where sensitive data resides anywhere in the cloud environment, who can access that data, and what its security posture is, and then deploy solutions accordingly. “Traditionally, data security has been the ultimate goal of infosec organizations,” says Ravi Ithal, Normalyze CTO and cofounder. “As the volume of data increases and the number of places where data exists increases -- data proliferation -- the number of ways in which it can be accessed and misused also increases.”


4 key shifts in the breach and attack simulation (BAS) market

First, BAS solutions require up-front configuration for on-site deployments, which may also require customizations to ensure everything works properly with integrations. Additionally, BAS solutions need to be proactively maintained, and for enterprise environments this often requires dedicated staff. As a result, we'll see BAS vendors work harder to streamline their product deployments to reduce the overhead cost for their customers, for example by providing more SaaS-based offerings. Many BAS tools are designed to conduct automated security control validation. Most have an extensive library of automation modules that can simulate specific threats and malicious behaviors on endpoints, networks, or cloud platforms, and BAS vendors tend to compete in the market this way. However, many vendors don't offer the ability to create or customize modules in a meaningful way. For example, some don't provide the user with a way to chain attack procedures together, which can be essential when trying to simulate an emerging threat that uses common tactics, techniques, and procedures (TTPs).
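The module-chaining idea above can be sketched abstractly: each simulation module is a step that reads and updates shared context, and a chain is just an ordered list of steps. Everything here (step names, the context keys) is hypothetical and vendor-neutral; real BAS products expose far richer primitives than this.

```python
def initial_access(ctx):
    """Simulated first step: establish a foothold."""
    ctx["foothold"] = True
    return ctx

def credential_dump(ctx):
    """Simulated credential harvesting, only possible with a foothold."""
    ctx["creds"] = ["svc_account"] if ctx.get("foothold") else []
    return ctx

def lateral_movement(ctx):
    """Simulated spread using whatever credentials were harvested."""
    ctx["hosts_reached"] = len(ctx.get("creds", []))
    return ctx

def run_chain(modules, ctx=None):
    """Run attack-simulation modules in order, threading context through."""
    ctx = ctx or {}
    for step in modules:
        ctx = step(ctx)
    return ctx

result = run_chain([initial_access, credential_dump, lateral_movement])
```

The value of chaining is exactly this threading of state: a later step's behavior depends on what earlier steps achieved, which is how real multi-stage TTPs behave.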



Quote for the day:

"A leader is someone people respond to, trust and want to work with." -- @ShawnUpchurch

Daily Tech Digest - January 10, 2023

Did AI blow up your cloud bill?

Despite the cloud's substantially lower operating costs and the potential value that AI and machine learning can bring to a business, the return falls short in many cases. 2022 was a year of huge cloud cost overruns, and enterprises' misuse of cloud resources in general creates most of them. In some cases, this means choosing cloud AI/ML systems when more pragmatic alternatives could return more value. Many AI/ML systems are much more expensive to maintain than conventional systems: specialized skills are needed to build and deploy them, and then to operate them. “Cloud AI” just means that the processing and data storage happen outside of the enterprise. Massive amounts of general-purpose and purpose-built data are needed to drive AI engines, and that data must be stored, managed, and secured on an ongoing basis. You must also deal with data compliance. In many cases, the business has custom needs that require custom training data that isn't part of the general-purpose transactional business database but is a one-off to support a specific need of the AI system. That means more storage, more labeling, more streaming, and more operational cost.


India’s Digital Personal Data Protection Bill: What works, what it lacks

“In areas such as itemized notice, a clarity is needed in its role in the consent,” said Vinayak Godse, CEO of the Data Security Council of India. In addition, the bill does not specify when consent is needed from the user — for example, whether consent is required for collection of information for internal use, or only when data is being sold to third parties. “Industry is cautiously examining the idea of consent manager, a solution suggested in the draft,” Godse said. There is also confusion regarding the issue of deemed consent. The first clause in the bill regarding deemed consent says that it occurs “when a user voluntarily provides data and it is reasonably expected that she would provide such personal data.” ... The second clause pertaining to deemed consent, however, is much broader, stating that deemed consent would be considered “for the performance of any function under any law, or the provision of any service or benefit to the Data Principal, or the issuance of any certificate, license, or permit for any action or activity of the Data Principal, by the State or any instrumentality of the State.”


Outsourcing’s dark side: How to stop the surge of supply chain attacks

Alongside lags in investment, many organizations' cybersecurity programs have fallen behind. Adequate action isn't taken to secure remote access, which leads to far too many third parties accessing internal networks with zero oversight. A full 70% of organizations surveyed reported that a third-party breach came from granting too much access. Yet half don't monitor access at all, even for sensitive and confidential data, and only 36% document access by all parties. They simply take a “hope it doesn't happen” approach, relying on contracts with vendors and suppliers to manage risk. In fact, most organizations say they trust third parties with their information based on business reputation alone. However, hope and blind trust are not strategies. Many bad actors play a long game. Just because vendors aren't breaking into your systems now doesn't mean hackers aren't involved in malicious activity undetected, gathering intel and studying workflows for a later time. Not all companies have ignored these threats. The healthcare industry has become a leader in solving third-party security issues because of the need to comply with audits by regulatory bodies.


Three Essential Data Integrity Trends to Watch in 2023

As challenges with data quality increase, companies will start to become more agile in their approach to ensuring the health of their data. We are already seeing that many businesses are turning to data observability to proactively discover issues impacting the reliability of data – meaning bad data can be stopped in its tracks before it has a chance to negatively impact decision-making. The best data observability tools apply machine learning intelligence, watching for patterns in enterprise data and alerting data stewards whenever anomalies crop up. This enables business users to proactively address potential problems as they happen, resulting in healthier data pipelines, more productive teams, and happier customers. Data governance has historically been viewed as a necessary tool to help drive regulatory compliance or policy management. But as organizations increasingly seek to drive advanced analytics initiatives, and get more sophisticated in how they derive business insights, the role of data governance has evolved significantly. 


HR trends in 2023 that will push the envelope even further

We see more and more businesses promoting from within and providing more internal employee training programs and opportunities. This reflects an attempt to retain workers at a higher rate and is partly a response to a shrinking talent pool. According to a Springer report, nearly 50% of the world's workforce will require retraining or upskilling within the next five years. Modern job roles are increasingly replacing conventional job responsibilities. Organisations used to place a lot of emphasis on degrees when evaluating candidates' qualifications. While academic credentials are still important, many recruiters have begun looking at particular skill sets instead. The key to technological progress is reskilling and upskilling workers in areas of need, including artificial intelligence (AI), machine learning (ML), DevOps, etc. Employers must identify the talents they require, communicate those needs, and create systems for attracting, retaining, and training new employees. In 2023, we foresee a few more trends that will demand our attention. 


5 GraphQL Trends to Watch in 2023

GraphQL helps expose fields in an easily consumable manner. But it’s also good for aggregating disparate microservices into a unified schema. As such, it is often used as a meta layer that combines multiple REST services, databases and even GraphQL schemas. For the time being, it appears that REST has staying power. And it’s unnecessary to migrate all REST services to GraphQL. Looking to the future, it is more likely to be adopted as a layer on top of existing REST APIs. Combining multiple backend services under the hood and exposing them in a unified schema creates a more usable, localized interface that frontend developers can work from. In the future, GraphQL might become a meta layer for more organizations, increasing the discovery of internal services and helping teams reuse internal microservices. ... However, there’s a downside to centralizing with a unified graph. Different internal teams or partners might only need access to a particular set of functions. Giving the entire schema to every person who walks through the door would be TMI, not to mention a potential security risk that breaks the rule of least privilege. So, the idea of a subgraph has gained traction. 
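The meta-layer idea above can be sketched in a few lines: a resolver that stitches two backend services into the single unified shape a GraphQL frontend would query. All names here are invented for illustration, and the backend functions are stand-ins for what would normally be REST calls.

```python
# Hypothetical backends; in a real meta layer these would be REST requests
# to separate microservices.
def fetch_user(user_id):
    return {"id": user_id, "name": "Ada"}

def fetch_orders(user_id):
    return [{"order_id": 1, "total": 40}, {"order_id": 2, "total": 2}]

def resolve_user_with_orders(user_id):
    """Meta-layer resolver: aggregates two backend services into the
    single unified shape a frontend would query through GraphQL."""
    user = fetch_user(user_id)
    user["orders"] = fetch_orders(user_id)
    return user

profile = resolve_user_with_orders(7)
```

The frontend sees one coherent object; which services produced its fields is an implementation detail hidden behind the schema.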


Data Science vs Software Engineering: Do You Know the Difference?

Data science and software engineering have so many qualities in common that there is a great deal of confusion about where one ends and the other begins, invoking the typical data science vs. software engineering haze. ... To understand what data science vs. software engineering is all about, down to the nitty-gritty, one should first get to know what they have in common. Going by the pace at which the software sector is growing, it is evident that there is an urgent need for the development of digital technology. SaaS, a sector that plays a crucial role in delivering critical software services to companies, has become the quickest-growing sector. The growth of services like cloud computing technologies, open source, programming services, and systems services has aided to a great extent in the development of advanced technologies like machine learning, artificial intelligence, and computer vision, which depend on data and data analytics. Notwithstanding these commonalities, data science and software engineering have a few stark differences.


How Will Tech Jobs Change in 2023?

In 2023, people could actually start understanding what hybrid means, said Pilar Orti, director of Virtual not Distant, which offers training for managers of remote teams. “All it means is you are not all day in the office,” Orti told The New Stack. “Doesn’t tell you anything about the culture,” including if it embraces flexibility and autonomy or is still very office-based. Right now, Orti thinks most companies are throwing the word “hybrid” around to mean the separation of when you want to be social at the office, and when you want to really focus at home. Jānis Dirveiks, an Agile coach, has found flexiwork red flags in job descriptions where companies want to rank for terms like remote and hybrid, but don’t offer the culture to back it up. These tags only refer to physical location, Orti agreed. “What’s not quite there is the schedule flexibility. Most organizations are still hung up that certain things can only be done when you’re together, whether in the same space or online.” This uncertainty will only emphasize the tension and divide she witnesses in larger organizations between who can work from home and who cannot.


Top 9 challenges IT leaders will face in 2023

Gil Westrich’s company, ClearML, is benefiting from increased adoption of artificial intelligence and machine learning (ML) technology. But the CTO and co-founder says that scaling to meet that demand presents its own challenges, which require self-reflection. “How do we grow our business responsibly?” he asks. “How do we get the talent we need? Given how competitive the technology and associated talent market is, companies have to clearly map out their plans for business growth, ensuring they are as comprehensive, considered, and responsible as possible. We want to make sure our roadmap is robust.” Tyler Derr, chief technology officer at Broadridge Financial Solutions, is also concerned about how to address challenges that come with growth. “Recent market volatility has added to the complexity of delivering high-quality products and services,” Derr says. “Organizations will need to consider when and how to invest in proprietary innovation or partner with industry-outsourced solutions balancing cost and speed to market, while bringing to life a robust digital roadmap with clear and actionable objectives and key results.”


Want to be a data leader? Here are 8 attributes you’ll need

Data is worth nothing if it’s not informing decision-making, which requires somebody to put it into a relevant context for the business. To identify context, data must be applied to the circumstances at any given point in time for all of those who need it. This involves embedding necessary tools and applications, such as business intelligence (BI), into the most popular applications used by all employees. That way, data finds them – not the other way around. “By doing this, employees can do their best work with optimal information anytime, anywhere and on any device,” says Brian Gentile, chairman of the board at cloud data vendor Matillion, “to help them make those all important business decisions that may impact on the organisation’s future.” Many data crunchers succumb to the pressure of hitting this month’s numbers, which ignores the bigger picture and can negatively impact the longer-term customer experience. A true data leader will show an institutional commitment to balancing quick, lightweight testing with longer-term longitudinal studies.



Quote for the day:

"People will not change their minds but they will make new decisions based upon new information." -- Orrin Woodward

Daily Tech Digest - January 09, 2023

Making Data Proactive to Manage Risk

Using advanced tools and technology, statistical sampling of data offers a better approach than archaic data indexing. Data sampling, via AI-based segmentation, meets the demands of regulators and other authorities. By statistically sampling data, it can be made more proactive to facilitate risk management with a methodical approach. AI-based segmentation, or sampling, collects representative data and allows for the development of models across a wide range of data classification use cases. Such sampling provides a way to statistically home in on the vulnerable areas within your data fabric and iteratively test for risk and compliance. It is an efficient and effective means of reducing overall risk, in less time and at less cost. The Census Bureau uses sampling by selecting demographic records and drawing inferences about a population. Manufacturers have used statistical quality control to selectively sample widgets to determine, with confidence, that the quality of a population of manufactured parts meets specification, without checking every single widget.
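The sampling approach described here can be illustrated with a tiny sketch: estimate the rate of "risky" records in a population from a random sample, with a normal-approximation confidence interval. The population size and flag rate below are invented for the example.

```python
import math
import random

def sample_proportion_ci(population, n, z=1.96, seed=0):
    """Estimate a proportion from a random sample of size n, with a
    normal-approximation confidence interval (z=1.96 ~ 95% confidence)."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    p = sum(sample) / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical data fabric: 10,000 records, 3% flagged as non-compliant.
population = [1] * 300 + [0] * 9700
p, lo, hi = sample_proportion_ci(population, n=500)
```

A 500-record sample bounds the non-compliance rate with quantifiable confidence, which is the same logic the Census Bureau and manufacturing quality control rely on: inspect a fraction, infer about the whole.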


It’s official: Digital trust really matters to everyone online

Given that almost everyone surveyed indicated their appreciation of digital trust, it is not surprising that it is considered a priority. The typical respondent commenced work on digital trust two to three years ago, making 75% (or more) progress so far with expectations of reaching complete trust within the next one to two years. Arguably, however, digital trust is more of a journey than a destination, given the constantly shifting nature of the threat environment. The fear of customer attrition directly translates into a digital trust goal: customer loyalty. Other goals (which contribute to customer loyalty) include reducing security issues, meeting regulatory, legal and compliance obligations with reduced cost, and improving brand perception. Achieving digital trust improvements isn’t without its challenges. Topping the list of obstacles is managing digital certificates, rated as important by 100% of enterprises. Regulatory compliance and handling the scope of what they are protecting was a close second at 99%. Complexity rounds out the difficulties faced (securing dynamic, multi-vendor networks isn’t easy), while a lack of staff expertise is also cited.


Turning Points in AI and ML

When one couples the technology capabilities of AI/ML solutions with the practical ends they may evolve to serve, the potential examples multiply quickly. Think about the job of content moderator in a social media environment, which requires not only discrete judgment, but also attention to detail at tremendous speed and volume. Today, bots can screen text for certain terms and expressions. But tomorrow’s may be able to monitor sentiment, understand contexts outside the immediate content, or evaluate non-text expressions, like videos or photos. In a contact center environment, you can “trip up” a present-day bot by trying to stray from its trained area of knowledge. That’s because today’s conversational bots learn from other conversations. Tomorrow’s will learn from the world of unstructured inputs around them, as people do, so they’ll keep up with you no matter where you steer the conversation. Natural language understanding and large language models will translate that into unfettered interaction and follow up with required action that’s instantly and automatically informed by relevant data wherever it resides.


Developers Should Celebrate Software Development Being Hard

Software development is too hard for people who don't enjoy it. This is good news, because it means anyone not committed will give up; they will find it too difficult to coast. Developers are put under intense pressure to deliver quickly and to constantly learn new technology, and those who don't enjoy the work will find that pressure unbearable. The difficulty of software development keeps the number of developers small, despite it being a well-paid job with good career options, and the number of good developers is smaller still, because it's easier to be a bad developer than a good one. ... If you are a developer who enjoys software development, take a moment to thank the software gods. Low-code software development will increase the number of developers, but it won't increase the number of good developers, because most people don't understand how software development works. The harder development is, the fewer developers there are who can do it well. That is what makes those developers valuable.


3 Lessons Entrepreneurs Can Learn From The Rise and Fall of History's Biggest Companies

The second lesson is that entrepreneurs can still beat out larger companies even if they lack the same connections to power. History shows that "right" can often beat "might." ... Energetic commitment and talent will beat resource-rich rivals, as long as entrepreneurs pick their fights wisely. There are two reliable ways of spotting opportunities to do so. First, as companies get bigger, even well-managed ones must leave opportunities on the table — market segments or product opportunities too small or too different for them to do well in or focus on. These often provide windows of opportunity for small players. Today's small markets can become tomorrow's large markets. ... Finally, in assessing today's large companies, it's important to remember that their success usually came from a basic entrepreneurial achievement combined with an organizational mindset. As entrepreneurs grow their businesses, they should be mindful of the competencies they have developed and remain intent on building new ones over time. New competencies — fueled by innovation — will likely increase their trajectory in growth and value.


Is Your Wi-Fi Router in the Wrong Spot? Here's How to Tell

Routers send the signal out in all directions, so if it's left in the corner of your home, a significant percentage of your wireless coverage is being sent outside your home. That's why your best bet is to move the router to a central location to optimize the signal. Installing a router across the house from the modem may prove troublesome. It may require manually running an especially long CAT5 or CAT6 Ethernet cable under the floor or along the bottom of your walls, or enlisting the help of power line network adapters, which use your home's electrical wiring to pass an internet signal from point to point. But the improved wireless coverage will be worth it. ... Routers tend to spread their strongest signals downward, so it's best to mount the router as high as possible to maximize coverage. Try placing it high on a bookshelf or mounting it on the wall in an inconspicuous place. Search online, and you'll find lots of custom wall mounts built for specific routers, like this stick-up mount for the Eero Pro 6 mesh router. If you're struggling to find a good, elevated spot, something like that could be a great solution.


Why data engineers are key to becoming a true digital leader

A common misconception among enterprise business leaders is that their data-driven ambitions will be realised by hiring data scientists. Data scientists are, of course, a crucial part of a data-driven business. Their ability to unearth interesting and unusual data patterns, and develop predictive and analytical models, helps to discover new solutions that can lead to positive outcomes such as cost-saving. However, data scientists are not purely driven by business goals. Instead, they are motivated by experimentation. If not managed appropriately, this can hinder data projects as data scientists search for solutions that the business may not want to implement. Data engineers, on the other hand, are responsible for translating data insights into technical and data requirements to directly meet business objectives. Unlike data scientists, data engineers are firmly focused on driving a business’s overall data strategy forward. This can include assisting with the performance of analytics projects, authorising data for different audiences, and ensuring data governance for regulation compliance.


UN to Hold Hearing on Proposed Cybercrime Treaty

The proposed treaty identifies cybercrime as any "criminal offenses committed intentionally and illegally" over IT devices. It further lists a variety of criminal activities that are deemed illegal. These include activities such as illegal access, network interference, tampering of hardware or software that results in the compromise of critical infrastructure or leak of confidential government data, among others. By identifying and clearly defining what constitutes various cybercrimes, the treaty seeks to provide a legal basis for states to prosecute potential offenders, which may range from sophisticated nation-state hackers to employees who illegally access sensitive government data. The draft proposal further spells out measures that states can adopt such as granting law enforcement agencies the right to collect data in real time, seize devices used for cybercrime activities, coordinate with intelligence agencies, as well as assist victims of various cybercrimes.


We’re a Long Way From a Passwordless Reality

A passwordless future sounds fantastic, but passwords will remain a backup authentication method until passwordless technologies mature. Few websites are currently compatible with passwordless authentication. The majority of websites will need to continue to store passwords because decades will pass before every user has the hardware and software needed for passwordless authentication. Even in the passwordless world that Windows advertises, passwords still matter, particularly as a backup method. The latest Windows release breaks the Windows Hello biometrics and PIN setup that users already count on for passwordless authentication. Last September, Microsoft said commercial users of Microsoft apps and services, such as Outlook, OneDrive, and Microsoft Family Safety, could remove the password from their Microsoft accounts entirely. But Windows 11 release version 22H2 breaks the Windows Hello authentication technology: users can experience Windows Hello sign-on failures with face recognition, fingerprints, and PINs. 


How to improve your incident response plan for 2023

Many organizations are confident in the existence of their incident response plan (IRP), but they are often not entirely sure what to do with it. A threat-specific IR playbook can offer easily accessible guidance during the chaos of incident response and is a vital element of an IR plan. When a cybersecurity incident happens, you’re scrambling to understand what’s being harmed or stolen, and not knowing where to begin can exacerbate the damage. Using your playbook as a set of IR standard operating procedures (SOPs) can define the roles and responsibilities of each IR team member as well as other key stakeholders within the company and keep everyone on the same page. For instance, the IR team will determine that passwords need to be changed during the containment of a ransomware incident, but to understand which passwords need to be changed or other required actions, they could consult the playbook for quicker resolution. These incidents call for “all hands on deck”, but for things to run smoothly, everyone must know their individual roles and the roles of others, including who the critical point of contact for each workstream is.



Quote for the day:

“A real entrepreneur is somebody who has no safety net underneath them.” -- Henry Kravis

Daily Tech Digest - January 08, 2023

The best robots and AI innovations at CES 2023

Advancements in autonomous driving haven't developed as quickly as some imagined they would. However, cars are incrementally becoming smarter, with autonomy seemingly just over the horizon. Case in point: Peugeot, Stellantis's French automobile brand, unveiled the Inception Concept car, an electric vehicle demonstrating what a car can be once you do away with the steering wheel, get comfortable, and let the vehicle get to know you a little. ... While autonomous cars are still in development, other market-ready autonomous mobility tools were on display at CES. The company Evar was at the conference with Parky, an autonomous EV recharging robot that brings a charging station to any parking spot. It's designed for building owners who want to make their parking spots more EV-friendly without adding electric capacity. ... The Withings U-Scan toilet bowl sensor attaches to the inside of your toilet's bowl. It includes a nutrition and metabolic urine tracker that checks pH, ketone, and vitamin C levels, and more. A second tracker monitors women's luteinizing hormone for ovulation cycles. Each promises early detection of potential health issues.


Math Behind Software and Queueing Theory

Unsurprisingly, queueing theory is a branch of mathematics focused on studying and describing queues (or, in more professional terms, lines). The whole theory is about how lines are created, how they behave, and, most importantly, how and why they malfunction. It is one of those branches of mathematics that is useful in real life; e.g., it can be applied in many branches of industry. ... Basically, we can treat most systems like a queue: users send requests, each request is processed, and the response returns to the user; when the system is too busy to process a request right away, the request waits until it is processed or some arbitrary timeout is reached. The real problem is to correctly identify the class of the system we are working on. In most cases, it will be a variation of M/M/c or M[X]/M/c. Unfortunately, the model can yield calculations that are not very in line with real life. As long as we care about long-term average system performance, though, M/M/c is an appropriate description, and most of the inconsistencies should stay in line with averaged results.
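For the M/M/c model named above, the probability that an arriving request has to wait is given by the standard Erlang C formula, and the mean queueing delay follows from it. A minimal sketch (the formulas are textbook queueing theory; any traffic numbers you plug in are your own):

```python
from math import factorial

def erlang_c(c, lam, mu):
    """Probability an arrival must wait in an M/M/c queue (Erlang C).

    c: number of servers; lam: arrival rate; mu: per-server service rate.
    """
    a = lam / mu                      # offered load in Erlangs
    rho = a / c                       # server utilization; must be < 1
    if rho >= 1:
        raise ValueError("unstable queue: utilization >= 1")
    s = sum(a**k / factorial(k) for k in range(c))
    top = a**c / (factorial(c) * (1 - rho))
    return top / (s + top)

def mean_wait(c, lam, mu):
    """Average time a request spends waiting in queue (Wq)."""
    return erlang_c(c, lam, mu) / (c * mu - lam)
```

With `c=1` this collapses to the familiar M/M/1 results (probability of waiting equals the utilization), which is a handy sanity check when modeling a service as a queue.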


SpiderLightning: Making WebAssembly cloud applications portable

A key element of this extensibility is the WebAssembly Component Model. Defined by the WebAssembly working group as the Wasm equivalent of an OS process model, it’s the foundation for how WASI implements its interfaces. A key element of any low-level approach like this is an interface definition language, which provides a way to specify how interfaces interact with code. For Wasm, and especially for the Component Model, the standard IDL is wit, which gives us a concise and human-readable way of defining interfaces that are expanded into WebAssembly code. To use WASI to build distributed applications, we need a set of extensions that lets us abstract provider-specific services as interfaces. Instead of having to use separate APIs for S3 on AWS and Blob storage on Azure and the code to manage them, we could have a single storage component that would provide a common set of interfaces on all platforms, with the underlying WASI instance managing service-specific implementations.
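As a rough illustration of that "single storage component" idea (in Python rather than Wasm/wit, with invented names), the pattern is a provider-neutral interface that application code targets, with provider-specific implementations swapped in underneath:

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Provider-neutral storage interface, loosely analogous to a single
    WASI-style storage component with provider-specific backends."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Stand-in for an S3- or Azure-Blob-backed implementation."""
    def __init__(self):
        self._data = {}

    def put(self, key, data):
        self._data[key] = data

    def get(self, key):
        return self._data[key]

def save_report(store: BlobStore, body: bytes) -> None:
    # Application code depends only on the interface, not the provider.
    store.put("reports/latest", body)

store = InMemoryStore()
save_report(store, b"q4 results")
```

Swapping AWS for Azure then means swapping the implementation behind the interface, which is exactly the portability the wit-defined interfaces aim to provide at the Wasm component level.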


Attackers create 130K fake accounts to abuse limited-time cloud computing resources

Researchers refer to the abuse of free offers as freejacking, and the creation of accounts that incur charges and then are never paid as "play and run." The latter is more difficult to pull off because most service providers require the user to register a valid credit card or payment method before giving them access to paid-for computing resources. However, even if usage is tracked and charged on a per-minute basis, the bill is usually issued after a longer period. This gives attackers a time window to abuse such services. ... "The infrastructure architecture employed by the actors uses CI/CD techniques, in which each individual software component of an operation is placed within a container," the researchers said. "This container operates within a modular architecture within the larger mining operation. CI/CD architectures provide highly modular operational environments, allowing some components of an operation to fail, be updated, or even be terminated and replaced, without affecting the larger environment." Not all the containers are used for cryptomining. 


How to Get the Best Cyber-Insurance Deal

The first step in obtaining affordable cyber insurance is finding a broker who is well-versed in coverage terms and has access to several different insurance markets, says Mark Dobrow, a vice president in the insurance brokerage division of Segal, a human resources and employee benefits consulting firm. “Market knowledge and experience is limited due to the relative newness of the product as compared to the long history of standard property coverages,” he explains. “The right broker can tailor the coverage to your needs and should know which markets are best for a particular situation.” ... The biggest mistake cyber-insurance applicants make, Aiello says, is paying poor attention to detail. “Businesses must ensure technology is being deployed in line with the insurance firm's conditions, otherwise insurers can attempt to get out of paying a claim if the technology was not ‘properly implemented’,” he warns. Unfortunately, the language used in cyber-insurance policies isn't always consistent between providers. 


Southwest Airlines: ‘Shameful’ Technical Debt Bites Back

It’s been an open secret within Southwest for some time … that the company desperately needed to modernize its scheduling systems. … This problem — relying on older or deficient software that needs updating — is known as incurring technical debt [and it] appears to be a key factor in why Southwest Airlines couldn’t return to business as usual the way other airlines did after last [month’s] major winter storm. When hiccups or weather events happen, the employees have to go through a burdensome, arduous process … because Southwest hadn’t sufficiently modernized its crew-scheduling systems. For example, if … their flight was canceled … employees have had to manually call in to let the company know where they are [sometimes] being left on hold on the phone for … hours just to let the company know their whereabouts. … Online forums are full of employee accounts of such misery. … This can easily cascade to a systemwide halt. … Such breakdowns resulting from technical debt are often triggered by external events, like weather. … So why didn’t Southwest simply update its software and systems?


Top 3 trends experts predict to hit software development in 2023

While hackers are typically associated with cyberattacks, many of them also have a “broad, practical skillset” that can make them useful for parts of software development. That’s according to Alex Rice, the co-founder and CTO of HackerOne, which connects businesses with penetration testers and cybersecurity researchers. Rice believes that both hackers and external code reviewers will become a more integrated part of the software development processes in 2023, as more organisations adopt security reviews into their development process. “As the value of DevSecOps (development, security and operations) increases, we’ll see the line between hackers and developers blurring as hackers with development expertise become a core element of the software development processes,” Rice said. “There’s a lot of value hackers can bring when it comes to catching security risks earlier rather than later.” Earlier this year, Irish start-up Noloco raised $1.4m in seed funding for its platform, which lets companies build internal tools, portals and apps without writing a line of code.


Blind Eagle APT Hunts Banking Victims in Colombia, Ecuador

Based on Trend Micro's report, the APT is traditionally known to leverage publicly available remote access tools and Trojans such as njRAT, Imminent Monitor, ProyectoRAT, Warzone RAT, Async RAT, Lime RAT, Remcos RAT and BitRAT. Over time, the APT switches from one RAT to another. Continuing that trend, Blind Eagle is now using a modified version of Quasar RAT, Check Point researchers say. The attack begins with phishing emails containing a booby-trapped link that deploys the Trojan. In one campaign, the APT used a geo-filtering server that redirects requests made from outside Ecuador and Colombia to the website of the Ecuadorian Internal Revenue Service, suggesting the APT's targeting focus. The campaign not only drops a RAT but also employs a more complex infection chain: it abuses the legitimate mshta.exe binary to execute VBScript embedded in an HTML file, which ultimately downloads two Python scripts, adding a new stage to the infection chain. The first of the two, ByAV2.py, is an in-memory loader that runs a Meterpreter payload in DLL format.


What is Cython? Python at the speed of C

Enter Cython. The Cython language is a superset of Python that compiles to C. This yields performance boosts that can range from a few percent to several orders of magnitude, depending on the task at hand. For work bound by Python’s native object types, the speedups won’t be large. But for numerical operations, or any operations not involving Python’s own internals, the gains can be massive. ... Note that Cython’s approach is incremental. That means a developer can begin with an existing Python application, and speed it up by making spot changes to the code, rather than rewriting the whole application. This approach dovetails with the nature of software performance issues generally. In most programs, the vast majority of CPU-intensive code is concentrated in a few hot spots—a version of the Pareto principle, also known as the “80/20” rule. Thus, most of the code in a Python application doesn’t need to be performance-optimized, just a few critical pieces. You can incrementally translate those hot spots into Cython to get the performance gains you need where it matters most. 
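The incremental workflow looks something like the following sketch. The function names and the numerical task (a partial sum of the Basel series) are illustrative; the point is that only the hot spot is translated into Cython, while the rest of the application stays as ordinary Python:

```python
import math

# A numerical hot spot in plain Python: summing 1/k**2, which converges
# to pi**2/6. Loops like this, bound by arithmetic rather than Python's
# object internals, are the classic candidates for Cython translation.
def basel_py(n):
    total = 0.0
    for k in range(1, n + 1):
        total += 1.0 / (k * k)
    return total

# The Cython version would live in its own .pyx file; adding static C
# types removes Python object overhead from the inner loop:
#
#   # basel.pyx (hypothetical file name)
#   def basel_cy(int n):
#       cdef double total = 0.0
#       cdef int k
#       for k in range(1, n + 1):
#           total += 1.0 / (k * k)
#       return total
#
# Compiled with `cythonize -i basel.pyx` and imported like any module;
# the rest of the application calls basel_cy without other changes.

print(abs(basel_py(100_000) - math.pi**2 / 6) < 1e-4)  # True
```

Because the Cython function is a drop-in replacement for the Python one, you can profile first, translate only the few functions that dominate the profile, and leave everything else alone.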


5 ways to improve security automation

One essential part of security automation that you'll likely want to tackle before anything else is monitoring. If you don't know where your issues are, you won't catch critical exposures in your environment and won't know what to prioritize for automation. The bottom line: if you do nothing else, automate monitoring of your environment. Even if you don't have the time or budget to automate remediation, you can at least target areas for manual fixes. There are many fantastic articles and products around this space (including How we designed observability for a hybrid cloud platform), from basic infrastructure monitoring to code scanning to network vulnerability scanning and more. Another thing to keep in mind is transition planning. You won't get all of this done at once, nor should you. As you figure out what areas to target first and focus your automation efforts accordingly, you will undoubtedly face challenges rolling out your new standards and processes to your environment. If you didn't set standards before, expect resistance from teams that have their own priorities and commitments to the business and don't understand why you're trying to change things on them.
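Even without automated remediation, monitoring output can feed a simple triage step so manual fixes target the worst exposures first. A minimal sketch (the finding fields and severity values are hypothetical, not from any particular scanner):

```python
# Toy triage: aggregate findings from automated monitoring/scanning and
# rank them by severity, so a team doing manual remediation knows what
# to fix first.
findings = [
    {"host": "web-01", "issue": "TLS 1.0 enabled",     "severity": 5},
    {"host": "db-02",  "issue": "default credentials", "severity": 9},
    {"host": "app-03", "issue": "verbose error pages", "severity": 3},
]

def prioritize(findings, top=2):
    """Return the highest-severity findings first, limited to `top` items."""
    return sorted(findings, key=lambda f: f["severity"], reverse=True)[:top]

for f in prioritize(findings):
    print(f"{f['host']}: {f['issue']} (severity {f['severity']})")
```

In practice the `findings` list would come from your scanners' APIs or report exports; the ranking logic is the piece worth automating early, since it turns raw monitoring output into an actionable fix list.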



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kanter