Daily Tech Digest - October 04, 2022

One aspect is to implement change management on the automation itself: the scripts, config files, and playbooks used to manage the network. Code management tools help with this: check-out and check-in events remind staff to follow the other parts of the proper process. Applying change management at this level means describing the intended modifications to the automation, testing them, planning deployment, having a fallback plan to the previous known-good code where applicable, and determining specific criteria by which to judge whether the change succeeded or needs to be rolled back. ... Putting automation in place to lock in a network state is itself a change management event and, in a sense, a change to the architecture; creating it and putting it into production needs to go through the whole approval and deployment process, and all future changes need to be made with its presence and operation in mind, since it has to be part of future change management evaluations.
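The rollback criteria described above can be made concrete in code. The following is a minimal sketch, assuming a hypothetical change record with pre-agreed success criteria and observed post-change metrics; the names and thresholds are illustrative assumptions, not any particular tool's API.

```typescript
// Hypothetical success criteria, agreed on before the change is deployed.
interface SuccessCriterion {
  name: string;                          // e.g. "bgpSessionsUp" (illustrative)
  passes: (observed: number) => boolean; // threshold check for that metric
}

interface ChangeOutcome {
  verdict: "keep" | "rollback";
  failed: string[]; // criteria that did not pass
}

// Judge a deployed change against the pre-agreed criteria.
// A metric that was never observed counts as a failure (passes(NaN) is false
// for any numeric comparison), which errs on the side of rolling back.
function evaluateChange(
  criteria: SuccessCriterion[],
  observations: Record<string, number>
): ChangeOutcome {
  const failed = criteria
    .filter((c) => !c.passes(observations[c.name] ?? NaN))
    .map((c) => c.name);
  return { verdict: failed.length === 0 ? "keep" : "rollback", failed };
}
```

For example, a change plan might require at least ten BGP sessions up after deployment; observing only eight would yield a "rollback" verdict and trigger the fallback to the previous known-good code.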


The Impact of Cybersecurity on Consumer Behavior

In addition to imperiling consumers’ PII, cyberattacks also cause consumers to feel helpless about their ability to protect their own data. According to ISACA’s survey report, about one in five consumers in the US, UK and Australia experience a sense of resignation that there is nothing they can do to protect themselves from cybercrimes. Nearly half of consumers in the US, UK and Australia think that they are likely to be a victim of cybercrimes. Although the initial cyberattack occurs just once, the lasting impacts of that attack continue for an unknown amount of time. If consumers’ data are stolen during cybercrimes and are subsequently sold to malicious actors, one attack can turn into a headache of fraud, identity theft and social engineering scams for the foreseeable future. Cyberattacks that compromise personal medical information in the healthcare industry or important account details in the financial services industry can cause emotional and financial stress. In the United States, the public is beginning to worry about state-sponsored cyberattacks against national security and defense systems and government agencies, in addition to their own personal information.


Digital transformation is brewing at Heineken

Heineken says it is fully committed to the path to net zero – and that there are efforts around the organisation to achieve this goal. Sustainability is top of mind in the strategies and tactics for digital transformation. “We have several fully green breweries,” said Osta. “This started in Austria a few years back with Goesser and is now being replicated in markets including France and Brazil. We also have 3D printers in 40 breweries, with 25 more planned for this year. 3D printing on-site is very effective when it comes to spare parts management, as it reduces carbon emissions. There is also an incredible effort being made on the data side in terms of what we can estimate and measure. We are always looking at emerging data standards for better-quality data to exchange across the ecosystem with our suppliers. The challenge is that often in sustainability we are faced with dark data – data that is critical but not collected or visible. The corporate value chain (Scope 3) reporting requires an ecosystem approach of data exchange.”


Digital Identity Bill Passes Key Senate Milestone

The legislation stops short of mandating national IDs. It would create a task force to set standards and recommend a voluntary program for state, local, tribal and territorial governments to verify identities online for "high-value transactions." About a half-dozen states have already rolled out mobile driver's licenses in the pilot phase. Nationwide standards would help ensure these new IDs are secure and provide a guide for other states. Grant says online verification could be offered in a variety of forms, such as on-demand validation services, which could become part of the credit card application process, or a mobile app on smartphones that people could carry in their pockets. "Identity is very personal," Grant says. "You're probably going to need to create a few different channels for Americans to be able to tap into these authoritative sources. I'd be thrilled to have a mobile driver's license app on my phone. Others would say, 'I don't want to have an app from the government on my phone.'"


Do You Fit Cybercriminals’ Ideal Victim Profile?

The Bad Actors Know About You. And they know exactly why you keep putting off addressing your cybersecurity vulnerabilities. Don’t give attackers any more advantages when it comes to breaching your law practice. My advice? Be more reticent when it comes to sharing personal information on social media. (For example, if you work from home, register as an online business when you set up your Google Business profile so that your physical address and photos of your home won’t show up on Google Maps.) Be less trusting of seemingly friendly messages and emails that cross your transom. While technology solutions can greatly improve your defenses, humans are the last line of defense. Don’t click on attachments from unknown senders. If a large file arrives from someone you haven’t heard from in a long time, call them to say hello and ask about the email before you click. Be more vigilant in general — including asking qualified cybersecurity professionals to assess your current level of protection and recommend safeguards. Rereading this, even I got stressed.


Five Data-Loading Patterns To Boost Web Performance

No one likes a blank white screen, especially your users. When resource-loading waterfalls lag, you need a basic placeholder before you can start building the layout on the client side. Usually, you would use either a spinner or a skeleton loader. As the data loads piece by piece, the page shows a loader until all the components are ready. While adding loaders as placeholders is an improvement, showing one for too long causes “spinner hell.” Essentially, your app is stuck on loading; while that is better than a blank HTML page, it gets annoying, and visitors may choose to exit your site. ... Modern JavaScript frameworks often use client-side rendering (CSR) to render webpages. The browser receives a JavaScript bundle and static HTML in a payload, then renders the DOM and adds the listeners and event triggers for reactivity. When a CSR app is rendered inside the DOM, the page is blocked until all components are rendered successfully. Rendering makes the app reactive. To populate it with data, you have to make another API call to the server and retrieve anything you want to load.
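The placeholder pattern described above boils down to a small piece of state logic: show a skeleton while components load, but cap how long it can stay up so the page escalates to an error state (with a retry option) rather than sinking into “spinner hell.” Below is a framework-agnostic sketch; the state names and timeout value are illustrative assumptions.

```typescript
type ViewState = "skeleton" | "content" | "error";

// Decide what to render given loading progress and elapsed time.
// loadedCount/totalCount model components that load one by one.
function viewState(
  loadedCount: number,
  totalCount: number,
  elapsedMs: number,
  timeoutMs = 8000 // illustrative cap on placeholder time
): ViewState {
  // Everything arrived: render the real layout.
  if (loadedCount >= totalCount) return "content";
  // Don't let the placeholder hang forever: after the timeout,
  // surface an error with a retry instead of an endless spinner.
  if (elapsedMs > timeoutMs) return "error";
  // Still loading within budget: keep the skeleton up.
  return "skeleton";
}
```

A UI layer would call a function like this on each loading tick and render the matching component, so the skeleton is a bounded state rather than a potentially infinite one.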


What Will it Take to End the Public Sector’s Cybersecurity Talent Gap?

The public sector can be deliberately hard to understand. From the multiple terms and acronyms used to describe programs and agencies, to an incredibly complex technological infrastructure, beginning a career in government can seem daunting. That is compounded by the realization that even entry-level roles often require at least five years of experience. Many cybersecurity job descriptions highlight requirements for certifications and achievements, which can only be earned after a certain amount of time in the field. Instead of holding entry-level candidates to such high expectations, which will only continue to leave hundreds of jobs unfilled, government agencies need to update their job descriptions to be truly entry-level and seek out college graduates or individuals who might have just completed a cybersecurity bootcamp or training program—and who have yet to gain any experience. It would also be beneficial to look at talent that might not come from a STEM field. Candidates with backgrounds in history or English can bring skills like analytical thinking and communication to the table—skills that are often a lot harder to teach than computer science.


8 strange ways employees can (accidentally) expose data

Video conferencing platforms such as Zoom and Microsoft Teams have become a staple of remote/hybrid working. However, new academic research has found that bespectacled video conferencing participants may be at risk of accidentally exposing information via the reflection of their eyeglasses. ... Users may not associate posting pictures on their personal social media and messaging apps as posing a risk to sensitive corporate information, but as Dmitry Bestuzhev, most distinguished threat researcher at BlackBerry, tells CSO, accidental data disclosure via social apps such as Instagram, Facebook, and WhatsApp is a very real threat. “People like taking photos but sometimes they forget about their surroundings. So, it’s common to find sensitive documents on the table, diagrams on the wall, passwords on sticky notes, authentication keys and unlocked screens with applications open on the desktop. All that information is confidential and could be put to use for nefarious activities.”


Used servers: Bargain or too good to be true?

Used equipment can run as well as new equipment “when you find the right seller,” says Peter Strahan, founder and CEO of Lantech, a managed IT services provider. “This allows you to rapidly cut the costs of a data center with used equipment.” In addition, deploying used IT equipment is generally good for the environment, Strahan says. “While the equipment could theoretically be recycled, it takes a lot of manpower,” he says. “Finding a use for it after it becomes obsolete saves a lot of time and money when it comes to recycling and stops the equipment going to the landfill.” A lot of companies “value the ‘green’ benefits of redeploying hardware,” says Cameron James, executive vice president of CentricsIT, a global IT services provider. “The best way to reduce IT waste is to use any product to its maximum lifespan, without compromising on performance. This is easy to do. Many used products are N-1—just one generation back from the latest OEM lines.” It can also make sense to buy used equipment if an organization has moderate powering needs in its data center, Strahan says. “If you have large powering needs, you will need the most efficient equipment,” he says.


Carbon copies: How to stop data retention from killing the planet

So what can be done about it? It is a question that has been plaguing the IT industry for years, and the lack of a definitive answer often makes it easier to just turn on another air-conditioning unit and look the other way. But that’s causing even more harm. So what are the alternatives? Storing less data appears to be an obvious answer, but it would be almost impossible to implement, because who decides what parameters are worth recording and what are not? The BBC learned this the hard way when it trashed much of its TV archive during the 1970s and 1980s, assuming it would be of no use. Then came the VCR, the DVD player and, of course, streaming. Ask any Doctor Who fan and they will grimace at the number of early episodes of the long-running sci-fi series that have been lost, perhaps forever, because of a lack of foresight. So, that’s the case to justify digital hoarding. But it all has to be stored somewhere, and those facilities have to be environmentally controlled.



Quote for the day:

"Leadership cannot really be taught. It can only be learned." -- Harold S. Geneen

Daily Tech Digest - October 03, 2022

Roadmap to RPA Implementation: Thinking Long Term

Ted Kummert, executive vice president of products and engineering at UiPath, says RPA should be viewed as a long-range capability meant to empower organizations to evolve strategically and increase business value. It is a journey that can start small, within one division or one department, and grow organically across the business as additional ideas form and the organization’s vision for automation’s potential comes to fruition. He says RPA can clear backlog, create new capacity, free up resources, and improve data quality by integrating software robots into workflows. “It is a truly transformative technology that can reduce or eliminate manual tasks and elevate creative, high-value work,” Kummert says. “Digital transformation is often talked about, but many times can fall short of its goals. Automation is the driver to achieve true digital transformation.” Adam Glaser, senior vice president of engineering for Appian, says many businesses use one automation technology, adding third-party capabilities in patchwork fashion to automate complex end-to-end processes.


How to start and grow a cybersecurity consultancy

To be successful, an entrepreneur must be resilient. Any comment that runs along the lines of “That’s not possible,” or “That can’t be done” should be treated as a challenge to prove the speaker wrong. An entrepreneur needs to have the ability to see through what’s not important. Entrepreneurs don’t just need money – they also need support in the form of encouragement and advice. I would advise budding entrepreneurs to attend meetups within their industry or local community and seek out online support via forums and groups. You’ll be surprised just how willing others will be to help and offer advice for free. Asking questions, getting reassurance and sanity checks from peers can be invaluable at all stages of your business’s journey. There will always be someone a little further down the path you’re taking. Starting a business can be exhilarating, rewarding and fun, but it can also be exhausting, relentless and stressful in equal measure.


Surveillance tech firms complicit in MENA human rights abuses

“When operating in conflict-affected or high-risk regions such as the MENA region, the surveillance sector must undertake heightened human rights due diligence and, if it cannot do so or it identifies evidence of harm, it should stop selling its technology to companies or governments,” said Dima Samaro, MENA regional researcher and representative at the Business & Human Rights Resource Centre. “Lack of adequate due diligence measures by private companies will only worsen the situation for those from marginalised communities, putting their lives in jeopardy as the absence of robust regulation and effective mechanisms in the region allows surveillance technologies to be operated freely and without scrutiny.” The report added that, although the United Nations’ (UN) Guiding Principles on Business and Human Rights were adopted a decade ago – which establish that companies must take proactive and ongoing steps to identify and respond to the potential or actual human rights impacts of their business – the principles’ non-binding, voluntary nature means there are “glaring gaps in human rights safeguards” at the firms.


How companies can accelerate transformation

Ensuring that customer value drives technology architecture and investment is one way to optimize technology usage. Another way is to ensure that an organization is getting the most out of the investments it has already made. Inefficiency in any aspect of technology usage represents a drag on businesses’ ability to change quickly. ... While enterprise architects (EAs) play a central role in identifying opportunities for this type of technology optimization, they have an even greater role to play when it comes to optimizing the entire IT landscape. A “business capability” perspective makes this possible. ... Efficiency doesn’t improve on its own. The business needs to decide to improve it. Making those decisions, however, is not always easy. As mentioned, relying on business capabilities to evaluate technology needs is one way to simplify the decision process. The other is visibility. Business leaders can’t make decisions if they can’t see the problem. In terms of business architecture, EAs help guide leaders in the decisions they make by showing them business capability maps, data-rich process diagrams and dashboards highlighting the connection between architectural issues and business value.


Optus reveals extent of data breach, but stays mum on how it happened

Optus says its recent data breach impacted 1.2 million customers with at least one form of identification number that is valid and current. The Australian mobile operator also has brought in Deloitte to lead an investigation into the cybersecurity incident, including how it occurred and how it could have been prevented. Optus said in a statement Monday that Deloitte's "independent external review" of the breach would encompass the telco's security systems, controls, and processes. It added that the move was supported by the board of its parent company Singtel, which had been "closely monitoring" the situation. Elaborating on Deloitte's forensic assessment, Optus CEO Kelly Bayer Rosmarin said: "This review will help ensure we understand how it occurred and how we can prevent it from occurring again. It will help inform the response to the incident for Optus. This may also help others in the private and public sector where sensitive data is held and risk of cyberattack exists." In its statement, Optus added that it had worked with more than 20 government agencies to determine the extent of the data breach.


Why cyber security strategy must be more than a regulatory tick-box exercise

While technology plays a critical role in an effective cyber security strategy, it alone does not provide the solution. Business leaders must also consider the organisation’s processes and people. If organisations don’t have the right processes or people in place to manage new technologies, it can be easy to revert to old habits. Many organisations opt for a hybrid Security Operations Centre to underpin their MDR strategy, which combines the cyber skills of in-house engineers, cyber security teams and an MSSP to create a single facility. MSSPs fill in the gaps in defences while upskilling in-house teams to stay on top of changing threats and technologies. This approach can also free in-house staff to drive projects and internal improvements while the MSSP takes the lead on high-value incidents. If the goal is to improve cyber security whilst meeting your organisational goals, then regulations will only ever go so far in tackling the issue. Attacks will continue to plague all sectors and proper detection, response and remediation will be what makes the difference between those that make the news and those that don’t.


Mozilla is looking for a scapegoat

Not so long ago, Microsoft’s Internet Explorer dominated market share. Antitrust authorities helped change that, but Google, not Mozilla, stepped up to take Microsoft’s place, yet without the bully pulpit of a dominant operating system. Meanwhile, as far back as 2008, I was writing about Mozilla’s chance to make Firefox a true community-developed web platform. It didn’t succeed, though Mozilla has gifted us incredible innovations such as Rust. Clearly there are smart people at Mozilla and they have demonstrated the ability to push the envelope on innovation. But not with Firefox. DuckDuckGo has carved out a growing, sizeable niche in privacy-oriented search, but Mozilla keeps losing similar ground in browsers. Why? In its report, Mozilla says browser freedom has been “suppressed for years through online choice architecture and commercial practices that benefit platforms and are not in the best interest of consumers, developers, or the open web.” This would be more credible in Mozilla’s mouth if this weren’t the same company that completely mismanaged its entrance into the mobile market.


Indonesia Data Protection Law Includes Potential Prison Time

The Indonesia data protection law took some eight years to come to fruition, with contentious ongoing debate about what government body should oversee the new regulations and exactly how strong the penalties should be. A recent wave of cyberattacks and data breaches in the country seems to have prompted legislative action; Kaspersky reports that the country experienced 11.8 million cyberattacks in the first quarter of 2022, a 22% increase from the prior year, and the country has become the leading target for ransomware attacks in Southeast Asia. This includes data breaches of various government agencies, one of which exposed the vaccination records of President Joko Widodo. Stats from SurfShark indicate that Indonesia now has the third-highest rate of data breaches in the world. Regulation oversight has fallen to the executive branch, with the President slated to form an oversight body tasked with determining and administering fines. Similar to the EU’s General Data Protection Regulation (GDPR), which the Indonesia data protection law drew from substantially, there is a maximum potential fine of 2% of global annual turnover for violations.


How To Protect Your Reputation After A Hack Or Data Breach

Part of transparency and recovery is working with the relevant authorities and experts to track the scope of the breach. A post-mortem analysis can be critical. For one thing, it can determine what data was stolen, by whom and how. It can also help track where that data ends up and how it is used. In cases where the cause has something to do with software or hardware being exploited, it can be essential to inform the developers or manufacturers of the breach and how it occurred. They may also need to issue patches or recalls to prevent other businesses using that hardware or software from being compromised. No business stands alone. ... Recovery after a breach is a sensitive time. You will undoubtedly see a deluge of negative reviews and bad press, which will be difficult to counteract. Clear and transparent messaging is part of it; breaches happen, and there's no surefire way to avoid them. Demonstrating that your data security policies prevented usable data from being stolen or that you've been able to protect users proactively can be critical to repairing your reputation.


Data quality is at the heart of successful data governance

The downstream effects of data quality have ramifications felt throughout data governance efforts. Recent findings from a survey by Enterprise Strategy Group showed that data management is greatly challenged by a lack of visibility and compounded by data quality issues. Concerningly, 42 percent of all respondents indicated at least half of their data was “dark data” – retained by the organization, but unused, unmanageable, and unfindable. An influx in dark data and a lack of data visibility often leads to downstream bottlenecks, impeding the accuracy and effectiveness of operational data. Data quality was the top driver for organizations’ data governance programs but was also the top challenge that these organizations have to overcome to maximize the return on their data governance efforts. When you consider the fact that many organizations are experiencing data quality issues, which are difficult to manage, and in many cases have significant amounts of data that is dark, there is a clear need for more robust data governance solutions providing data landscape transparency united with business context and guidance.



Quote for the day:

"Perhaps the ultimate test of a leader is not what you are able to do in the here and now - but instead what continues to grow long after you're gone" -- Tom Rath

Daily Tech Digest - October 02, 2022

CIOs Still Waiting for Cloud Investments to Pay Off

“You discover your cloud architecture is immature when you’re surprised by the bill,” said Mr. Roese. “Usually the reason it was expensive is that they processed the data in a suboptimal way, in the wrong place with the wrong tools with the wrong economic model.” Roughly 67% of 1,000 senior technology leaders at U.S. firms across industries said they have yet to see a significant return on cloud investments, KPMG said Thursday in its annual technology survey. “Many first movers expected significant IT cost efficiency from their cloud investments,” said Barry Brunsman, a principal in KPMG’s CIO Advisory group. “We have seen a shift away from that expectation in favor of speed and agility.” Mr. Brunsman said the most common issues preventing a better return on cloud spending were insufficient talent or skills among tech teams, added security and compliance requirements, and a misalignment on expected outcomes. Global cloud spending is projected to reach $830.5 billion by the end of the year, up 17.5% from 2021, but slowing from last year’s growth rate of 18.3%, according to International Data Corp. It expects growth to drop to 16.3% next year.


To Code or Not to Code: The Benefits of Automating Software Testing

The hoped-for solution to these problems has been test automation. This offers a way to reduce manual involvement, test greater volumes, remove the risk of human error and accelerate time to market as much as tenfold for a serious competitive advantage. Yet, even organizations that have rolled out automated testing have discovered that a big problem remains. They can’t easily scale these solutions because professional-grade coding skills are still required. Even if they are marketed as “low-code,” they’re still far too complex for business users. Even testers usually lack the coding skills to set up tests on their own. As a result, coding skills remain a resource bottleneck that slows down the testing process and limits collaboration with business users. What’s more, as automated testing becomes more pervasive throughout the organization, the maintenance workload grows. All those test engineers an organization hired to implement an automation framework increasingly spend their time maintaining code instead of building bigger and smarter test scopes. Scaling becomes impossible.


Where JavaScript is headed in 2022

Angular is showing ominous signs of weakness around retention and interest, ranking near the bottom at #9. Nevertheless, it remains #2 for actual usage, and #3 for awareness. Vue continues to be a strong contender, with a decent ranking across all categories. Overall the story on the front end is of incremental refinements, rather than revolutionary upheaval. And on the back end? Next.js instigated the full-stack JavaScript movement, and remains second only to Express in both awareness and usage. The comparison of Next to Express is of course imperfect. Express is a server-side framework only, the workhorse of Node-based HTTP. ... Another interesting discovery in relation to languages that compile to JavaScript is the popularity of Elm. Elm is an ingenious functional language geared for web development, and highly regarded for its enablement of fast and fluent applications. But it’s also a mothballed project without any commits for months. The takeaway? Clearly the basic ideas in Elm are still desirable and popular. Perhaps a new leader could take up the project and carry it forward to the benefit of the entire ecosystem.


Blockchain: An Immutable Future?

The general consensus is that the pandemic delayed the adoption of distributed ledger technologies. Companies worldwide felt the repercussions of supply chain disruption and changes to consumer habits, which meant that the implementation of blockchain fell low on priority lists. However, in some cases, blockchain technology was used effectively to coordinate logistics. Despite the limited use, the global crisis helped drive further discussion about blockchain’s benefits; for example, ledger technologies could potentially have helped counter the fake vaccines that flooded the market during the height of the pandemic. This topic also feeds into a wider conversation about the influx of counterfeit medicines in pharma supply chains. ... The computing power needed to mine (add information to the blockchain) plus the duplication of work is an obvious source of environmental concern. Beyond environmental challenges, every piece of data added to a blockchain needs to be transcribed onto every copy of the blockchain, resulting in a much greater cost than server or cloud storage. 


The 5 Absolute Best X-Factors That Increase Enterprise Value

The problem with the traditional business model is that you don’t know when clients return, if at all. Both cash flow and forecasting are problematic. The first step in transforming your business model is implementing recurring revenue. Again, you can look to Software as a Service (SaaS) for inspiration. Chances are, you pay a monthly fee for your favorite movie streaming service. Customers aren’t hit with hidden billing surprises, as they know exactly what they’ll pay. You enjoy the predictability of cash flow and now have accurate budgets. The second step in transforming your business model is creating long-term exclusive contracts. Your mission is to show clients why they are better off with a long-term commitment. ... A rich and thriving culture is the foundation of your business. It’s your culture that plays a role in the customer experience. Your culture determines whether you leverage a market opportunity or not. Buyers look for a culture that promotes resilience, innovation, and accountability. Your future buyer knows that people come and go in a business, including you. A rich and thriving culture transcends people and takes on a life of its own.


Three Ways Banks Can Engage Younger Consumers in the Metaverse

The metaverse lets banks roll out the virtual red carpet for customers, with tailored experiences for specific segments and personas. Personalized virtual banking enables that special something that leaves customers feeling valued. Within a metaverse branch, banks can create virtual rooms in which avatars of relationship managers and customer advisors work one-on-one with high-net-worth individuals, for instance. They might also provide services to individuals looking to create a college fund or businesses interested in obtaining loans. Metaverse banking’s combination of personalization and community puts a fresh, modern spin on CX, and it’s an especially powerful draw for young banking consumers who are critical to the future of banking. ... To connect with the next generation of connected consumers, banks must begin building their presence among the more popular metaverses and increase engagement with younger demographic audiences through 3D banking, personalized services and DAOs. The good news? For payment providers and retail and commercial banks, there are no obstacles to preparing, and it is not too late to get ahead.


Biology Inspires a New Kind of Water-Based Circuit That Could Transform Computing

Since this is closer to the way the brain transports information, they say, their device could be the next step forward in brain-like computing. "Ionic circuits in aqueous solutions seek to use ions as charge carriers for signal processing," write the team led by physicist Woo-Bin Jung of the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) in a new paper. "Here, we report an aqueous ionic circuit… This demonstration of the functional ionic circuit capable of analog computing is a step toward more sophisticated aqueous ionics." A major part of signal transmission in the brain is the movement of charged molecules called ions through a liquid medium. Although the incredible processing power of the brain is extremely challenging to replicate, scientists have thought that a similar system might be employed for computing: pushing ions through an aqueous solution. This would be slower than conventional, silicon-based computing, but it might have some interesting advantages.


Failure of Russia’s cyber attacks on Ukraine is most important lesson for NCSC

She said three things could be attributed to the “unexpected” lack of success – “impressive” Ukrainian cyber defences, “incredible” support from the cyber security sector and “impressive collaboration” between the US, EU, Nato, the UK, and others. “Just as we have seen inspirational and heroic defence by Ukrainian military on the battlefield, we have seen incredibly impressive defensive cyber operations by Ukrainian cyber security practitioners. Many commentators have suggested that this has been the most effective defensive cyber activity undertaken under sustained pressure in history,” added Cameron. She said the constant cyber attacks on Ukraine, emanating from Russia, over the past decade had prepared the country’s cyber defences. “In many ways, Russia has made Ukraine match fit over the past 10 years by consistently attacking them,” said Cameron. “We haven’t seen ‘cyber Armageddon’, but that’s not a surprise to cyber professionals, who never expected it. What we have seen is a very significant conflict in cyber space – probably the most sustained and intensive cyber campaign on record.”


Feds: Chinese Hacking Group Undeterred by Indictment

The United States began publicly indicting Chinese hackers in 2014 in a strategy to pressure Beijing by exposing the organizations and individuals behind state-sponsored cybertheft. The strategy seemed to pay dividends when Chinese leader Xi Jinping in September 2015 pledged to cease cyber-enabled economic espionage. The strategy's utility has since come under mounting fire as it became apparent that Chinese state-sponsored hacking responded to Xi's promise by becoming stealthier rather than by ending. "The indictment did not hinder APT41's operations as they progressed into 2021," concludes the Department of Health and Human Services' Health Sector Cybersecurity Coordination Center in a Thursday threat brief. "Stealing foreign IP is a primary objective of state-sponsored Chinese cyberespionage groups such as APT41, as it contributes to China's ambitious business and economic development goals," says Paul Prudhomme, a former Department of Defense threat analyst who is head of threat intelligence advisory at Rapid7.


What Is Artificial Intelligence in Software Testing?

You may wonder, “Don’t test automation tools do this already?” Of course, test automation tools already apply AI to some extent, but they have limitations. Where AI shines in software development is when it’s applied to remove those limitations, enabling software test automation tools to provide even more value to developers and testers. The value of AI comes from reducing the direct involvement of the developer or tester in the most mundane tasks. We still have a great need for human intelligence in applying business logic, strategic thinking, creative ideas, and the like. For example, consider that most, if not all, test automation tools run tests for you and deliver results. Most don’t know which tests to run, so they run all of them or some predetermined set. What if an AI-enabled bot could review the current state of test statuses, recent code changes, code coverage, and other metrics, and then decide which tests to run and run them for you? Bringing in decision-making that’s based on changing data is an example of applying AI. Good news! Parasoft handles automated software testing at this level.
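The decision step described above can be sketched in a few lines of Python. The test names, file names, and coverage map below are invented for illustration; a real AI-enabled tool would layer learned risk scoring on top of a simple intersection like this:

```python
# Hypothetical sketch of change-aware test selection: instead of running the
# whole suite, pick only tests whose covered files intersect the change set.

def select_tests(changed_files, coverage_map, always_run=()):
    """Return the subset of tests worth running for this change set.

    coverage_map: test name -> set of source files that test exercises.
    always_run:   smoke tests that run regardless of what changed.
    """
    changed = set(changed_files)
    selected = {t for t, files in coverage_map.items() if files & changed}
    selected.update(always_run)
    return sorted(selected)

coverage = {
    "test_login":   {"auth.py", "session.py"},
    "test_billing": {"invoice.py", "tax.py"},
    "test_search":  {"index.py"},
}

# Only auth.py changed, so billing and search tests are skipped.
to_run = select_tests(["auth.py"], coverage, always_run=["test_smoke"])
```

The same idea scales up when the coverage map is replaced by a model trained on historical test failures.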



Quote for the day:

"It's not the position that makes the leader. It’s the leader that makes the position." -- Stanley Huffty

Daily Tech Digest - October 01, 2022

3 wins and 3 losses for cloud computing

The cloud can successfully provide business agility. I always tell my clients that most enterprises moved to cloud for the perceived cost savings but stayed for the agility. Cloud provides businesses with the ability to turn IT on a dime, and enterprises that move fast in today’s more innovative markets (such as retail, healthcare, and finance) find that the speed at which cloud systems can change provides a real force multiplier for the business. The cloud offers industrial-strength reliability. Most who pushed back on cloud computing argued that we would put all our eggs in a basket that could prove unreliable. ... The businesses that moved to cloud computing anticipated significant cost savings. Those savings never really materialized except for completely new businesses that had no prior investment in IT. In fact, most enterprises looked at their cloud bills with sticker shock. The primary culprit? Enterprises that did not use cloud finops programs to effectively manage cloud costs. Also, cloud providers offered pricing and terms that many enterprises did not understand (and many still don’t).


Data storytelling: A key skill for data-driven decision-making

Rudy is a firm believer in letting the data unfold by telling a story so that when the storyteller finally gets to the punch line or the “so what, do what” there is full alignment on their message. As such, storytellers should start at the top and set the stage with the “what.” For example, in the case of an IT benchmark, the storyteller might start off saying that the total IT spend is $X million per year (remember, the data has already been validated, so everyone is nodding). The storyteller should then break it down into five buckets: people, hardware, software, services, other (more nodding), Rudy says. Then further break it down into these technology areas: cloud, security, data center, network, and so on (more nodding). Next the storyteller reveals that based on the company’s current volume of usage, the unit cost is $X for each technology area and explains that compared to competitors of similar size and complexity, the storyteller’s organization spends more in certain areas, for example, security (now everyone is really paying attention), Rudy says.


Active vs. Passive Network Monitoring: What Should Your Company Use?

Active network monitoring, also known as synthetic network monitoring, releases test traffic onto the network and observes that traffic as it travels through. This traffic is not taken from actual transactions that occur on a network, but rather sent through the network in order for your monitoring solution to examine it on its path. Test traffic usually mimics the typical network traffic that flows through your system, so your administrators gain the most relevant insights into the network. ... Passive network monitoring refers to capturing network traffic that flows through a network and analyzing it afterwards. Through a collection method like log management or network taps, passive monitoring compiles historic network traffic to paint a bigger picture of your company’s network performance. The primary use for passive network monitoring is discovering and predicting performance issues that occur at specific instances and areas of your network. ... The question that might be passing through your mind is “should my business use active monitoring or passive monitoring for my network performance strategy?”
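The contrast between the two approaches can be sketched in a few lines of Python. All function names, latencies, and thresholds here are illustrative, and the network calls are simulated so the sketch is self-contained:

```python
import statistics

def active_probe(send_request, samples=5):
    """Active/synthetic monitoring: generate test traffic ourselves and
    time each round trip. `send_request` stands in for a real network call."""
    latencies = [send_request() for _ in range(samples)]
    return statistics.mean(latencies)

def passive_report(captured_records, slow_ms=200):
    """Passive monitoring: analyze traffic that already flowed through the
    network (e.g. from a tap or log pipeline) after the fact."""
    slow = [r for r in captured_records if r["latency_ms"] > slow_ms]
    return {"total": len(captured_records), "slow": len(slow)}

# Simulated transport so the sketch runs without a live network.
fake_latencies = iter([100, 120, 110, 130, 140])
avg = active_probe(lambda: next(fake_latencies))

report = passive_report([
    {"path": "/api", "latency_ms": 90},
    {"path": "/api", "latency_ms": 450},
])
```

The active probe answers "is the network healthy right now?", while the passive report answers "where did it misbehave, and how often?" — which is why many teams run both.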


A New Linux Tool Aims to Guard Against Supply Chain Attacks

“Not too long ago, the only real criteria for the quality of a piece of software was whether it worked as advertised. With the cyber threats facing Federal agencies, our technology must be developed in a way that makes it resilient and secure,” Chris DeRusha, the US federal chief information security officer and deputy national cyber director, wrote in the White House announcement. "This is not theoretical: Foreign governments and criminal syndicates are regularly seeking ways to compromise our digital infrastructure.” When it comes to Wolfi, Santiago Torres-Arias, a software supply chain researcher at Purdue University, says that developers could accomplish some of the same protections with other Linux distributions, but that it’s a valuable step to see a release that’s been stripped down and purpose-built with supply chain security and validation in mind. “There’s past work, including work done by people who are now at Chainguard, that was kind of the precursor of this train of thought that we need to remove the potentially vulnerable elements and list the software included in a particular container or Linux release,” Torres-Arias says.


Using governance to spur, not stall, data access for analytics

“Without good governance controls, you not only have the policy management risk, but you also risk spending much, much more money than you intend, much faster,” says Barch. “We knew that maximizing the value of our data, especially as the quantity and variety of that data scales, was going to require creating integrated experiences with built-in governance that enabled the various stakeholders involved in activities like publishing data, consuming data, governing data and managing the underlying infrastructure, to all seamlessly work together.” What does this blended approach to data governance look like? For Capital One, it’s what Barch calls “sloped governance.” With a sloped governance approach, you can increase governance and controls around access and security for each level of data. For example, private user spaces, which don’t contain any shared data, can have minimal data governance requirements. As you move further into production, the controls get stricter and take more time to be implemented.
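The "sloped" idea sketches naturally as a lookup from data level to required controls. The level names and controls below are assumptions for illustration, not Capital One's actual tiers:

```python
# Illustrative sloped governance: controls grow as data moves from private
# scratch space toward shared production.

GOVERNANCE_SLOPE = [
    ("private",    {"encryption"}),
    ("team",       {"encryption", "access_review"}),
    ("production", {"encryption", "access_review", "lineage", "audit_log"}),
]

def required_controls(level):
    for name, controls in GOVERNANCE_SLOPE:
        if name == level:
            return controls
    raise ValueError(f"unknown data level: {level}")

def may_publish(level, controls_in_place):
    """Data can move to `level` only once every required control is in place."""
    return required_controls(level) <= set(controls_in_place)
```

Minimal friction at the bottom of the slope keeps experimentation cheap, while the stricter production checks are enforced automatically rather than by committee.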


Microsoft: Hackers are using open source software and fake jobs in phishing attacks

The hacking group has targeted employees in media, defense and aerospace, and IT services in the US, UK, India, and Russia. The group, also known as Lazarus and tracked by Microsoft as ZINC, was behind the massive attack on Sony Pictures Entertainment in 2014. In July, Google Cloud's Mandiant threat analysts saw the group spear-phishing targets in the tech and media sectors with bogus job offers, using WhatsApp to share a trojanized instance of PuTTY. "Microsoft researchers have observed spear-phishing as a primary tactic of ZINC actors, but they have also been observed using strategic website compromises and social engineering across social media to achieve their objectives," MSTIC notes. "ZINC targets employees of companies it's attempting to infiltrate and seeks to coerce these individuals into installing seemingly benign programs or opening weaponized documents that contain malicious macros. Targeted attacks have also been carried out against security researchers over Twitter and LinkedIn."


New deepfake threats loom, says Microsoft’s chief science officer

In a Twitter thread, MosaicML research scientist Davis Blaloch described interactive deepfakes as “the illusion of talking to a real person. Imagine a scammer calling your grandmom who looks and sounds exactly like you.” Compositional deepfakes, he continued, go further with a bad actor creating many deepfakes to compile a “synthetic history.” ... The rise of ever-more sophisticated deepfakes will “raise the bar on expectations and requirements” of journalism and reporting, as well as the need to foster media literacy and raise awareness of these new trends. In addition, new authenticity protocols to confirm identity might be necessary, he added – even new multifactor identification practices for admittance into online meetings. There may also need to be new standards to prove content provenance, including new watermark and fingerprint methods; new regulations and self-regulation; red-team efforts and continuous monitoring.


How Does WebAuthn Work?

WebAuthn is quite clever. It leverages the power of public key cryptography to create a way for users to log in to mobile and web applications without those applications having to store any secret information at all. Usually, when one thinks of public key cryptography, one thinks of using it to send a secret message to a person who then decrypts and reads it. Well, this can work in reverse too. If you send someone a message encrypted with their public key, then only they can decrypt it, because only they hold the private key that corresponds to the given public key. Once they do, you can be highly confident that they are the entity they say they are. Currently, all the major browsers – Chrome, Firefox, Edge, and Safari – support the WebAuthn specification. If your phone – iPhone or Android – has a fingerprint reader or facial scanner, it supports WebAuthn. Windows provides WebAuthn support via Windows Hello. All of this translates to passwordless authentication quite nicely.
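The challenge-response idea the excerpt describes can be illustrated with textbook RSA and deliberately tiny primes. This is a toy for intuition only: real WebAuthn authenticators sign challenges rather than decrypt them, and production code should never roll its own crypto.

```python
# Toy challenge-response: the server encrypts a random challenge to the
# user's public key; only the holder of the matching private key can
# decrypt it and echo it back.

# Classic textbook parameters: p=61, q=53 -> n=3233, e=17, d=2753.
PUBLIC_KEY = (3233, 17)     # (n, e) -- shared with the server at registration
PRIVATE_KEY = (3233, 2753)  # (n, d) -- never leaves the authenticator

def encrypt_challenge(challenge, public_key):
    n, e = public_key
    return pow(challenge, e, n)

def respond(ciphertext, private_key):
    n, d = private_key
    return pow(ciphertext, d, n)

challenge = 1234                  # in practice, a random nonce per login
sent = encrypt_challenge(challenge, PUBLIC_KEY)
answer = respond(sent, PRIVATE_KEY)
verified = (answer == challenge)  # server now trusts the key holder
```

Because the server stores only the public key, a breach of its database leaks nothing a phisher can replay — which is the core appeal of passwordless login.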


Why developers hold the key to cloud security

APIs drive cloud computing. They eliminate the requirement for a fixed IT architecture in a centralized data center. APIs also mean attackers don’t have to honor the arbitrary boundaries that enterprises erect around the systems and data stores in their on-premises data centers. While identifying and remediating misconfigurations is a priority, it’s essential to understand that misconfigurations are just one means to the ultimate end for attackers: control plane compromise. This has played a central role in every significant cloud breach to date. Empowering developers to find and fix cloud misconfigurations when developing IaC is critical, but it’s equally important to give them the tools they need to design cloud architecture that’s inherently secure against today’s control plane compromise attacks. ... Developers are in the best (and often only) position to secure their code before deployment, maintain its secure integrity while running, and better understand the specific places to provide fixes back in the code. But they’re also human beings prone to mistakes operating in a world of constant experimentation and failure. 


IT enters the era of intelligent automation

Companies also need to optimize business processes to increase the effectiveness of automation, Nallapati says. “Working together in a partnership, the business unit and the automation teams can leverage their expertise to refine the best approach and way forward to optimize the efficiency of the bot/automation,” she says. Technology leaders should make sure to get business leaders and users involved in the IA process, Ramakrishnan says. “Educate them about the possibilities and collaborate with them in joint problem-solving sessions,” he says. ... “With a large number of customers and a large number of invoices to process every day, any small savings through automation goes a long way in increasing productivity, accuracy, and improving employee and end customer satisfaction,” Ramakrishnan says. Similar to the type of hackathons that are common in IT organizations today, Ramakrishnan says, “we partnered with the business to have a business-side hackathon/ideathon. We educated the key users from the billing team on the possibilities of automation, and then they were encouraged to come back with ideas on automation.”



Quote for the day:

"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith

Daily Tech Digest - September 30, 2022

5 Signs That You’re a Great Developer

Programming directly changes the way your brain works: you start to think more algorithmically and solve problems faster, which carries over into other aspects of your life. Good programmers not only learn anything else much faster, especially in tech-related directions, but also make great entrepreneurs and CEOs. Look at Elon Musk, for instance: he was a programmer and built his own game when he was 12. ... As we briefly discussed previously, programming encourages creative thinking and teaches you how to approach problems in the most effective way. But in order to do so, you must first be able to solve a lot of these difficulties and have a passion for doing so; only then will you probably succeed as a developer. If you've just started and thought that was easy, you're completely wrong. You just haven't encountered genuinely challenging problems yet, and the more you learn, the more difficult and complex the problems get. You need not only to solve them, but to solve them in the most effective way possible, speed up your algorithm, and optimize everything.


Experimental WebTransport over HTTP/3 support in Kestrel

WebTransport is a new draft specification for a transport protocol similar to WebSockets that allows the usage of multiple streams per connection. WebSockets allowed upgrading a whole HTTP TCP/TLS connection to a bidirectional data stream. If you needed to open more streams you’d spend additional time and resources establishing new TCP and TLS sessions. WebSockets over HTTP/2 streamlined this by allowing multiple WebSocket streams to be established over one HTTP/2 TCP/TLS session. The downside here is that because this was still based on TCP, any packets lost from one stream would cause delays for every stream on the connection. With the introduction of HTTP/3 and QUIC, which uses UDP rather than TCP, WebTransport can be used to establish multiple streams on one connection without them blocking each other. For example, consider an online game where the game state is transmitted on one bidirectional stream, the players’ voices for the game’s voice chat feature on another bidirectional stream, and the player’s controls are transmitted on a unidirectional stream. 
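A toy model makes the head-of-line blocking difference concrete. The packet layout and stream names below are invented, and real transports retransmit rather than drop, so this only counts what could be consumed before a retransmit arrives:

```python
# Toy model: over one TCP connection (WebSockets), all streams share one
# ordered byte sequence, so a lost packet stalls every stream. Over QUIC
# (WebTransport), loss only stalls the stream it belongs to.

def deliverable(packets, lost, per_stream):
    """Count packets the app can consume given one lost packet.

    packets: list of (stream_id, seq) in global send order.
    lost:    the (stream_id, seq) packet that was dropped.
    per_stream: True models QUIC streams, False models a single TCP stream.
    """
    delivered = 0
    blocked_streams = set()
    globally_blocked = False
    for pkt in packets:
        if pkt == lost:
            if per_stream:
                blocked_streams.add(pkt[0])  # only this stream stalls
            else:
                globally_blocked = True      # the whole connection stalls
            continue
        if globally_blocked or pkt[0] in blocked_streams:
            continue
        delivered += 1
    return delivered

traffic = [("game", 1), ("voice", 1), ("game", 2), ("voice", 2)]
tcp_like = deliverable(traffic, lost=("voice", 1), per_stream=False)
quic_like = deliverable(traffic, lost=("voice", 1), per_stream=True)
```

In the TCP-like case a single lost voice packet freezes the game-state stream too; with per-stream delivery the game keeps updating while only the voice stream waits.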


Software builders across Amazon require consistent, interoperable, and extensible tools to construct and operate applications at our peculiar scale; organizations will extend on our solutions for their specialized business needs. Amazon’s customers benefit when software builders spend time on novel innovation. Undifferentiated work elimination, automation, and integrated opinionated tooling reserve human interaction for high judgment situations. Our tools must be available for use even in the worst of times, which happens to be when software builders may most need to use them: we must be available even when others are not. Software builder experience is the summation of tools, processes, and technology owned throughout the company, relentlessly improved through the use of well-understood metrics, actionable insights, and knowledge sharing. Amazon’s industry-leading technology and access to top experts in many fields provides opportunities for builders to learn and grow at a rate unparalleled in the industry. As builders we are in a unique position to codify Amazon’s values into the technical foundations; we foster a culture of belonging by ensuring our tools, training, and events are inclusive and accessible by design.


The Troublemaker CISO: How Much Profit Equals One Life?

We take for granted that those who are charged with protecting us are doing so with our best interest at heart. There is no shaving off another few cents just to increase value to shareholders over the life of a person. Lucky for me, there is a shift in the boardrooms and governing bodies to see how socially responsible you are and whether you are acting in the best interest of the people and not just the investors. If the members of the board and governing body are considering these topics when steering a business, isn't it time to relook at how and why we do things? Are we as CISOs not accountable to leadership to impress on them the risk that IoT/internet connectivity poses to critical networks - and especially to healthcare? It is time to be firm in expressing the risk and saying we would rather spend a bit more money and time and do it the safe way. And this should be listed as the top risk in the company. The other big issue I have with this type of network being connected is one of transparency.


Digital Twins Offer Cybersecurity Benefits

A key difficulty, from a cybersecurity perspective, is the fact drug production lines are made up of multiple different technologies, running different operating systems that are often provided by different suppliers. “Integrating multiple systems from different suppliers can provide an expanded attack surface that can be exploited by a cyber adversary,” continues Mylrea. To address this, Mylrea and Grimes developed what they refer to as “biosecure digital twins”—replicas of manufacturing lines they use to identify potential points of attack for hackers. “The digital twin is essentially a high-fidelity virtual representation of critical manufacturing processes. From a security perspective, this improves monitoring, detection, and mitigation of stealthy attacks that can go undetected by most conventional cybersecurity defenses,” explains Mylrea. “Beyond security, the biosecure digital twin can optimize performance and productivity by detecting when critical systems deviate from their ideal state and correct in real time to enable predictive maintenance that prevents costly faults and safety failures.”


Unlocking cyber skills: This year’s essential back-to-school lesson plan

Technology is continually advancing, which will only create more avenues for cybersecurity roles in the future. While it’s essential to inform students about the types of careers in cybersecurity, teachers and career advisors should be aware of the skills and qualities the sector needs beyond technical computer and software knowledge. Once this is achieved, it can shed light on the roles students can go onto. Technical skills are critical in cybersecurity, yet they can be learned, fostered, and evolved throughout a student’s career. Schools need to tap into individual students’ strengths in hopes of encouraging them to pursue cyber positions. Broadly, cybersecurity enlists leaders, communicators, researchers, critical thinkers… the list goes on. Having the qualities needed to fulfil various roles can position a student remarkably well when they first start in the industry. Yet, this comes down to their mentors in high school being able to communicate that a student’s inquisitive nature or presenting skills can be applied to various sectors.


Data literacy: Time to cure data phobia

Data literacy is an incredibly important asset and skill set that should be demonstrated at all levels of the workplace. In simple terms, data literacy is the fundamental understanding of what data means, how to interpret it, how to create it and how to use it both effectively and ethically across business use cases. Employees who have been trained in and applied their knowledge of how to use company data demonstrate a high level of data literacy. Although many people have traditionally associated data literacy skills with data professionals and experts, it’s becoming necessary for employees from all departments and job levels to develop certain levels of data literacy. The Harvard Business Review stated: “Companies need more people with the ability to interpret data, to draw insights and to ask the right questions in the first place. These are skills that anyone can develop, and there are now many ways for individuals to upskill themselves and for companies to support them, lift capabilities, and drive change. Indeed, the data itself is clear on this: Data-driven decision-making markedly improves business performance.”


To BYOT & Back Again: How IT Models are Evolving

The growing complexity of IT frameworks is startling. A typical enterprise has upwards of 1,200 cloud services and hundreds of applications running at any given moment. On top of that, employees have their own smartphones, and many use their own routers and laptops. Meanwhile, various departments and groups -- marketing, finance, HR and others -- subscribe to specialized cloud services. The difficulties continue to pile up -- particularly as CIOs look to build out more advanced data and AI frameworks. McKinsey & Company found that between 10% and 20% of IT budgets are devoted to adding more technology in an attempt to modernize the enterprise and pay down technical debt. Yet, part of the problem, it noted, is “undue complexity” and a lack of standards, particularly at large companies that stretch across regions and countries. In many cases, orphaned and balkanized systems, data sprawl, data silos, and complex device management requirements follow. For CIOs seeking simplification and tighter security, the knee-jerk reaction is often to clamp down on choices and options.


IT leadership: What to prioritize for the remainder of 2022

To deliver product-centric value, it’s best to have autonomous, cross-functional teams running an Agile framework. Those teams can include technical practitioners, design thinkers, and business executives. Together, they can increase business growth by as much as 63%, Infosys’ Radar report uncovered. Cross-pollination efforts can spread Agile across the entire enterprise, building credibility and trust among high-level stakeholders toward an iterative process that can deliver meaningful, if incremental, business results. Big-bang rollouts, with a raft of modernizations released in one fell swoop, may seem attractive to management or other stakeholders. But they carry untold risk: developers scrambling to fix bugs after the fact, account teams working to retain disgruntled customers. Approach cautiously, and consider an Agile roadmap of smaller, iterative developments instead of the momentous release. It also breaks down the considerable task of application modernization into smaller, bite-sized chunks. 


How Policy-as-Code Helps Prevent Cloud Misconfigurations

Policy-as-code is a great cloud configuration solution because it eliminates the potential for human error and makes it more difficult for hackers to interfere. Policy compliance is crucial for cloud security, ensuring that every app and piece of code follows the necessary rules and conditions. The easiest way to ensure nothing slips through the cracks is to automate the compliance management process. Policy-as-code is also a good choice in a federated risk management model. A set of common standards are applied across a whole organization, although departments or units retain their own methods and workflows. PaC fits seamlessly into this high-security system by scaling and automating IT policies throughout a company. Preventing cloud misconfiguration relies on effectively ensuring every app and line of code is adhering to an organization’s IT policies. PaC offers some key benefits that make this possible without being a hassle. Policy-as-code improves the visibility of IT policies since everything is clearly defined in code format. 
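A minimal policy-as-code sketch, with made-up policy names and resource fields: each rule is an ordinary function kept in version control and evaluated automatically against every resource configuration, rather than checked by hand.

```python
# Each policy is a plain predicate over a resource's configuration.

def no_public_buckets(resource):
    return not (resource["type"] == "bucket" and resource.get("public", False))

def encryption_required(resource):
    return resource.get("encrypted", False)

POLICIES = [no_public_buckets, encryption_required]

def evaluate(resource, policies=POLICIES):
    """Return the names of policies this resource violates."""
    return [p.__name__ for p in policies if not p(resource)]

bad = {"type": "bucket", "public": True, "encrypted": False}
good = {"type": "bucket", "public": False, "encrypted": True}
violations = evaluate(bad)  # fails both policies
```

Wired into a CI pipeline, a non-empty violation list blocks the deployment, which is how PaC closes the gap between written policy and what actually ships. Dedicated engines such as Open Policy Agent generalize this pattern with their own policy languages.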



Quote for the day:

"Without courage, it doesn't matter how good the leader's intentions are." -- Orrin Woodward