Daily Tech Digest - March 02, 2023

Cyberattackers Double Down on Bypassing MFA

MFA flooding, in which an attacker repeatedly attempts to log in with stolen credentials to create a deluge of push notifications, aims to exploit users' fatigue with security warnings. "Push notifications are a step up from SMS, but are susceptible to MFA flooding and MFA fatigue attacks, bombarding the victim with notifications in the hope they will click 'Allow' on one of them," Caulfield says. Another popular tactic — the account reset attack — aims to fool tech support into giving attackers control of a targeted account, an approach that led to the successful compromise of the developer Slack channel for Take-Two Interactive's Rockstar Games, the maker of the Grand Theft Auto franchise. "An attacker will compromise a user’s credentials, and then pose as a vendor or IT employee and ask the user for a verification code or to approve an MFA prompt on their phone," says Jordan LaRose, practice director for infrastructure security at NCC Group. "Attackers will often use the information they’ve already compromised as part of the social engineering attack to lull users into a false sense of security."


Three Trends That Could Impact Data Management In 2023

Like cybercrime, the digitization of the customer experience is almost as old as the computer itself, but it really came into its own in the mobile age. What some call digitization 1.0 was all about “mobile, simplified design and new kinds of applications.” Digitization 2.0 homed in on customer demand—“apps anywhere, anytime, on any interface, and with any method of interaction: voice, social media, chat, texting, wearables, and even when you are sitting in your car.” What I’m calling digitization 3.0 here is a doubling down on both 1.0 and 2.0 to make data even more usable and provide unprecedented access to it. I began this article by highlighting the value of data. I think the companies that can extract even more value from their data while maintaining its security, resiliency and privacy will be the ones that not only survive today’s economic uncertainty but thrive during and after it. The key is contextualizing your data to make it more useful. This starts with the steps I’ve listed in the preceding two sections as a foundation.


Best and worst data breach responses highlight the do's and don'ts of IR

When it comes to data breaches, is there a sliding scale? In other words, if a tiny school district gets hit with a ransomware attack, do we give the IT team a partial pass because they probably lack the resources and skill level of a more tech-savvy company? On the other hand, if a company whose entire business model is based on protecting user passwords gets hacked, do we judge it more harshly? That brings us to LastPass, which experienced an embarrassing breach, first announced in August 2022 as simply a minor incident confined to the application development environment. By December, that breach had spread to customer data including company names, end-user names, billing addresses, email addresses, telephone numbers, and IP addresses. LastPass gets high marks for transparency: the company continued to issue public updates following the initial August announcement.


‘Digital twin’ tech is twice as great as the metaverse

A “digital twin” is not an inert model. It’s a personalized, individualized, dynamically evolving digital or virtual model of a physical system. It’s dynamic in the sense that everything that happens to the physical system also happens to the digital twin: repairs, upgrades, damage, aging, and so on. Companies are already using digital twins for integration, testing, monitoring, simulation, and predictive maintenance on bridges, buildings, wind farms, aircraft and factories. But these are still very early days in the digital twin realm. ... A digital twin system has three parts: the physical system, the virtual digital copy of that physical system, and a communications channel linking the two. Increasingly, this communication is the relaying of sensor data from the physical system. The technology draws on three major categories: if you imagine a Venn diagram with “metaverse” technologies in one circle, IoT in a second circle and AI in the third, digital twin technology occupies the overlapping center. Digital twins differ from models or simulations in that they are far more complex and extensive and change with incoming data from the physical twin.
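The three-part structure described above can be sketched in a few lines. This is an illustrative sketch only; the class and field names are invented for the example, and a real deployment would relay sensor readings over a message bus rather than a direct method call.

```python
class DigitalTwin:
    """Virtual copy that stays in sync with incoming sensor readings."""

    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.state = {}      # latest known value per sensor
        self.history = []    # every event, so repairs/aging stay traceable

    def ingest(self, reading):
        """Channel endpoint: apply one sensor reading to the twin."""
        self.state[reading["sensor"]] = reading["value"]
        self.history.append(reading)


# The physical system (a hypothetical bridge) sends readings over the channel:
twin = DigitalTwin("bridge-42")
twin.ingest({"sensor": "strain_gauge_1", "value": 0.87})
twin.ingest({"sensor": "strain_gauge_1", "value": 0.91})

assert twin.state["strain_gauge_1"] == 0.91  # twin mirrors current state
assert len(twin.history) == 2                # and keeps the full record
```

The point of keeping `history` as well as `state` is that a twin is supposed to reflect everything that has ever happened to the physical system, not just its current condition.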


Backup testing: The why, what, when and how

The aim of all testing is to ensure you can recover data. Those recoveries might be of individual files, volumes, particular datasets – associated with an application, for example – or even an entire site, or several. So testing has to happen at differing levels of granularity to be effective. That means the differing levels of file, volume, site, and so on, as above. But it also means testing by workload and system type, such as archive, database, application, virtual machine or discrete systems. At the same time, the backup landscape in an organisation is subject to constant change, as new applications are brought online and as the location of data changes. This is more the case than ever with the use of the cloud, as applications are developed in increasingly rapid cycles and deployed by novel methods such as containers. ... So, it’s likely that testing will take place at different levels of the organisation on a schedule that balances practicality with necessity and importance. Meanwhile, that testing must consider the constantly changing backup landscape.
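As a rough illustration of testing across both granularity and workload type, the cross-product below generates a simple recovery-test plan. The cadence values are hypothetical, standing in for a schedule that balances practicality (a full site restore is expensive to rehearse) against importance.

```python
from itertools import product

granularities = ["file", "volume", "dataset", "site"]
workloads = ["archive", "database", "application", "virtual machine"]

# Hypothetical rule of thumb: the broader the recovery scope,
# the less often it is practical to test it.
cadence = {
    "file": "weekly",
    "volume": "monthly",
    "dataset": "monthly",
    "site": "quarterly",
}

# One test entry per (granularity, workload) combination.
plan = [
    {"workload": w, "granularity": g, "cadence": cadence[g]}
    for g, w in product(granularities, workloads)
]

assert len(plan) == 16  # 4 granularities x 4 workload types
assert all(e["cadence"] == "quarterly"
           for e in plan if e["granularity"] == "site")
```

Regenerating such a plan whenever applications or data locations change is one simple way to keep testing aligned with a shifting backup landscape.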


Considerations for Developing Cybersecurity Awareness Training

Regular, ongoing cybersecurity awareness training is important, and the best time to start is during the new employee onboarding process. This sets the correct expectations in terms of what to do and what not to do before a new employee has access to the enterprise’s information assets or data. ... Enterprises may consider using classroom-based training (physical, virtual or a mixture of both) or a learning management system (LMS) to automate the delivery and tracking of cybersecurity awareness training. There are many online LMS providers, such as Absorb LMS and SAP Litmos, and they provide useful tools for creating online courses, quizzes and surveys. After online courses are created, an enterprise can use the LMS to organize and distribute online courses to its employees as needed. The LMS can also be used to monitor training progress, view analytics and allow employees to provide feedback in order for the enterprise to recalibrate its learning program for maximum impact.


AI and data privacy: protecting information in a new era

First of all, business technology leaders should consider whether they need AI at all, and whether their problems could be solved by more conventional methods. There is nothing worse than the "I want ML/AI solutions in my business, but I don't know what for yet" approach. To introduce AI, you need to consider the entire architecture that will build, train and deploy models, and plan how to collect and process large amounts of data. This requires assembling a good team of people such as data engineers, ML engineers and data scientists. Processing large amounts of data and mastering many tools is demanding, so it is not as simple as writing a web application in a standard framework. Tech leaders should also be aware that AI comes with risks. They will need more and more computing resources to build increasingly sophisticated AI platforms. And they will need to stay constantly abreast of news from the world of AI, where everything changes rapidly: it may turn out that within six months a much better solution or model for a particular problem has been created.


Think carefully before considering cloud repatriation

It’s particularly difficult for smaller companies to repatriate, simply because, at their scale, the savings aren’t worth the effort. Why buy real estate and hardware and pay extra salaries only to save a small amount? By contrast, very large companies have the scale to repatriate. But do they want to? “Do Visa, American Express or Goldman Sachs want to be in the IT hardware business?” asks Sample, rhetorically. “Do they want to try to take a modest gain by moving far outside their competency?” Switching can also be complicated when the cost of change isn’t considered part of the calculation. A marginal run-rate saving gained from pulling an application back on-prem may be offset by the cost of change, which includes disrupting the business and missing out on opportunities to do other things, such as upgrading the systems that help generate revenue. A major transition may also cause downtime, sometimes planned and sometimes unplanned. “A seamless cutover is rarely possible when you’re moving back to private infrastructure,” says Sample.
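The cost-of-change point can be made concrete with a payback calculation. The figures below are invented; the sketch simply shows how a marginal run-rate saving is swamped by a large one-off migration cost.

```python
def payback_months(monthly_cloud_cost, monthly_onprem_cost, cost_of_change):
    """Months until the one-off cost of change is recovered by run-rate
    savings. Returns None when there is no saving to recover it with."""
    saving = monthly_cloud_cost - monthly_onprem_cost
    if saving <= 0:
        return None
    return cost_of_change / saving


# A 5% run-rate saving against a large migration cost pays back very slowly:
assert payback_months(100_000, 95_000, 600_000) == 120.0  # ten years
# And if on-prem turns out costlier, there is nothing to recover at all:
assert payback_months(100_000, 105_000, 600_000) is None
```

A fuller model would also price in downtime risk and the opportunity cost of the engineering time spent on the move, both of which push the break-even point out further.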


The High Costs of Going Cloud-Native

When it comes to reasons to move to the public cloud, “saving money” has long since been replaced with “increased agility.” Like the systems vendors they are in the process of displacing, AWS, Microsoft Azure, and Google Cloud have figured out how to maximize their customers’ infrastructure spend, with a little (actually a lot of) help from the unalterable economics of data gravity. ... Titled “Cloud-Native Development Report: The High Cost of Ownership,” the white paper tracks the journey of a hypothetical company as it embarks on a transition to cloud-native computing. ... While many of the technologies at the core of the cloud-native approach, such as Kubernetes, are free and open source, substantial costs are incurred, predominantly around infrastructure and personnel. In particular, the cost of finding IT practitioners with experience in using and deploying cloud-native technologies was among the biggest hurdles identified by the report.


Security automation: 3 priorities for CIOs

To begin with, the CIO has choices to make about the testing approaches that will be deployed. Automation in AppSec can refer to tools and processes ranging from automated vulnerability scanning (dynamic analysis) and static code analysis to software composition analysis and other types of security testing. The most advanced approaches take things a step further by combining multiple forms of testing – perhaps augmenting DAST with interactive application security testing (IAST) and software composition analysis (SCA) – into a single scan, giving a comprehensive view of the organization’s security risk posture. ... Meanwhile, in workflow terms, IT leaders should use customizable solutions to trigger scans at certain points in the development pipeline or on a predefined schedule. This allows CIOs and their teams to coordinate scans at specific times or in response to certain events, such as deploying new code or detecting a security incident.
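A minimal sketch of the trigger logic described here, with pipeline events mapped to scan types. The event names and the choice of scanners per event are assumptions for illustration, not any particular vendor's configuration.

```python
# Scan types combined into one comprehensive pass, per the article.
FULL_SCAN = ["DAST", "IAST", "SCA"]

# Hypothetical trigger table: which scans run for which event.
TRIGGERS = {
    "code_deployed": FULL_SCAN,       # event-driven: new code shipped
    "security_incident": FULL_SCAN,   # event-driven: incident detected
    "nightly": ["SCA"],               # schedule-driven: lighter dependency check
}

def scans_for(event):
    """Return the scans to launch for a pipeline event (none if unknown)."""
    return TRIGGERS.get(event, [])


assert scans_for("code_deployed") == ["DAST", "IAST", "SCA"]
assert scans_for("nightly") == ["SCA"]
assert scans_for("unknown_event") == []
```

In practice this table would live in pipeline configuration (a CI job matrix or a scanner's policy file) rather than application code, but the shape of the decision is the same.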



Quote for the day:

"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller

Daily Tech Digest - March 01, 2023

Intel Releases Quantum Software Development Kit Version 1.0 to Grow Developer Ecosystem

The SDK is a customizable and expandable platform providing greater flexibility when developing quantum applications. It also lets users compare compiler files, a standard feature in classical computing development, to discern how well an algorithm is optimized by the compiler, and allows users to see the source code at lower levels of abstraction, gaining insight into how a system stores data. Additional features include:
Code in familiar patterns: Intel has extended the industry-standard LLVM compiler with quantum extensions and developed a runtime environment modified for quantum computing, while the Intel Quantum Simulator (IQS) provides a state-vector simulation of a universal quantum computer.
Efficient execution of hybrid classical-quantum workflows: The compiler extensions allow developers to integrate results from quantum algorithms into their C++ projects, opening the door to the feedback loops needed for hybrid quantum-classical algorithms such as the quantum approximate optimization algorithm (QAOA) and the variational quantum eigensolver (VQE).
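The hybrid feedback loop behind algorithms like QAOA and VQE can be illustrated without any quantum hardware. The sketch below is not the Intel SDK API: it stands in for the quantum circuit with the known analytic expectation value <Z> = cos(theta) of an RY(theta) rotation on |0>, and for the classical optimizer with a simple parameter sweep.

```python
import math

def energy(theta):
    """Stand-in for a quantum measurement: <Z> after RY(theta) on |0>."""
    return math.cos(theta)

def minimize(thetas):
    """Stand-in for the classical optimizer in the feedback loop."""
    return min(thetas, key=energy)


# Classical side proposes parameters; "quantum" side evaluates each one.
candidates = [i * math.pi / 100 for i in range(101)]
best = minimize(candidates)

assert abs(best - math.pi) < 1e-9          # optimizer finds theta = pi
assert abs(energy(best) + 1.0) < 1e-9      # minimum expectation <Z> = -1
```

Real hybrid workloads replace `energy` with a circuit execution on a simulator or QPU, and `minimize` with an iterative optimizer, but the loop structure (classical code consuming quantum results to choose the next parameters) is exactly this.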


Day in the Life of a Chief Developer Experience Officer (CDXO)

According to Cauduro, the overarching goal is to put himself in the developer’s shoes—he constantly thinks about common developer workflows and considers how to create a seamless experience throughout the entire development life cycle. Next is spreading awareness throughout the company about DX principles and how to increase DX within their offerings. A CDXO will likely answer directly to executive leadership but might interface with many departments. A CDXO may direct teams to construct developer-specific tools, like libraries, documentation, SDKs and self-service environments. “DX is a mindset,” said Cauduro. “The whole company needs to be engaged in it.” “As with any C-level position, your job is on the one hand to make your team’s life easier in any way you can,” said Burazin. “And on the other to convey the developers’ issues or ideas to the company in hopes of nudging the company in the correct direction.”


ChatGPT vs GDPR – what AI chatbots mean for data privacy

As an open tool, the billions of data points ChatGPT is trained on are made accessible to malicious actors who could use this information to carry out any number of targeted attacks. One of the most concerning capabilities of ChatGPT is its potential to create realistic-sounding conversations for use in social engineering and phishing attacks, such as urging victims to click on malicious links, install malware, or give away sensitive information. The tool also opens up opportunities for more sophisticated impersonation attempts, in which the AI is instructed to imitate a victim’s colleague or family member in order to gain trust. Another attack vector might be to use machine learning to generate large volumes of automated, legitimate-looking messages to spam victims and steal personal and financial information. These kinds of attacks can be highly detrimental to businesses. For example, a payroll diversion Business Email Compromise (BEC) attack, composed of impersonation and social engineering tactics, can have huge financial, operational, and reputational consequences for an organisation.


‘Most web API flaws are missed by standard security tests’

APIs can become less of a liability if teams include security-focused members during design, encourage secure coding, conduct regular security tests, and monitor API calls for attacks and misuse. Securing web APIs requires a different approach to classic web application security, according to Ball. “Standard web application tests will result in false-negative findings for web APIs,” he explains. “Tools and techniques that are not calibrated specifically to web APIs will miss on nearly all of the common vulnerabilities.” A notable example is a vulnerability in the USPS Informed Visibility API, first reported by security researcher Brian Krebs. The web application was thoroughly tested one month before Krebs reported the data exposure. During testing, tools like Nessus and HP WebInspect were applied generically to the testing targets, and therefore a significant web API vulnerability went undetected. This undiscovered flaw in the USPS Informed Visibility API allowed any authenticated user to access email addresses, usernames, package updates, mailing addresses, and phone numbers associated with 60 million customers.
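The class of flaw described in the USPS case, an endpoint that authenticates callers but never checks whether they own the object they request, is exactly what generic scanners miss and what an API-aware test looks for. The sketch below is purely illustrative: an in-memory fake stands in for a real HTTP client, and all names are invented.

```python
# Hypothetical backing store: record id -> record, with an owner field.
RECORDS = {1: {"owner": "alice"}, 2: {"owner": "bob"}}

def api_get_record(record_id, session_user):
    """Buggy endpoint: requires a logged-in user (session_user) but
    never checks that the user owns the requested record."""
    return RECORDS.get(record_id)

def audit_object_level_authorization(user):
    """API-aware check: list every record this user can read without
    owning it. A secure API would yield an empty list."""
    return [rid for rid, rec in RECORDS.items()
            if api_get_record(rid, user) is not None
            and rec["owner"] != user]


# alice, a legitimate authenticated user, can read bob's record:
assert audit_object_level_authorization("alice") == [2]
```

A generic web-app scanner sees only valid 200 responses here; only a test that understands the API's object-ownership semantics flags the leak.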


Exploring biometrics within payments

It’s an obvious question, but despite all the potential benefits of adopting biometric security, the technology still features several vulnerabilities and weak points. First, a fingerprint scanner or smartphone camera cannot be relied upon to be available at every transaction. While consumers can use biometric authorization on most mobile devices, desktops still make up a large portion of eCommerce sales. Additionally, companies will need to adopt hardware capable of reading and interpreting this data in order to accept biometric payments. The price of this hardware could be cost-prohibitive, depending on what is needed and how far a company wants to take contactless payments. Finally, we cannot forget the consumer factor: consumers are more anxious than ever about their privacy and where their personal data goes. Even if biometric scans do not actually save or store biometric information, many consumers might still refuse to provide these identifiers.


Building resilience in a polycrisis world

Seeing and responding to risk differently first requires leaders to clearly pinpoint where plausible risks could materialize and do the most damage to key operations and services. This can be tricky if companies have traditionally approached risk in a siloed way. Company leaders should spend time with one another to work through “what if?” scenarios, with an eye toward highlighting where exactly in the business a problem or failure would be most catastrophic to customers. ... Now that the executives had their focus—the outcome of getting cash—they could begin looking at all the ways customers do so, including ATMs and the workers who service them, brick-and-mortar banks, and the tech and third parties that help with electronic transfer payments, and build resilience across all of those functions rather than focusing on individual mechanisms. Prioritization exercises also help leaders tease out false assumptions. Leaders at a UK housing management company had believed that collecting rents via the company’s app was the key to business continuity.


Field-Programmable Qubit Arrays: The Quantum Analog of FPGAs

FPQAs make quantum algorithms more resource-efficient by reducing qubit and gate overhead. The ability to quickly update the qubit layout and connectivity enables rapid testing, benchmarking and optimization of algorithms—in a way, delivering a customized computer for each calculation. One example of how FPQAs can be used to achieve better quantum computing performance is optimization. Many optimization problems can be described mathematically in terms of graphs. The nodes describe the variables in the optimization problem and the edges can represent various relationships between them. For instance, the nodes can describe the potential location of 5G towers, and edges describe pairs of towers that cannot be simultaneously operated without generating interference. In another, more abstract representation, each node can be a stock, and an edge between two nodes denotes that these stocks are correlated. These graphs can be mapped onto analog FPQAs by assigning each node to a qubit and setting the connectivity so that two qubits interact if the corresponding atoms have an edge.
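The node-to-qubit mapping described above is straightforward to express in code. The data structures below are purely illustrative, not a real FPQA control interface; they reuse the 5G-tower example from the text, where an edge marks a pair of towers that would interfere.

```python
# Graph from the optimization problem: nodes are candidate tower sites,
# edges are pairs that cannot operate simultaneously (interference).
nodes = ["tower_a", "tower_b", "tower_c"]
edges = {("tower_a", "tower_b"), ("tower_b", "tower_c")}

# Mapping step: one qubit per node.
qubit_of = {node: i for i, node in enumerate(nodes)}

def interacting_qubit_pairs():
    """Connectivity step: two qubits interact exactly when their
    corresponding nodes share an edge."""
    return sorted((qubit_of[u], qubit_of[v]) for u, v in edges)


assert qubit_of["tower_a"] == 0
assert interacting_qubit_pairs() == [(0, 1), (1, 2)]
```

On an analog FPQA this connectivity would be realized physically, by placing the atoms assigned to adjacent nodes close enough to interact, which is why reconfigurable qubit layouts amount to a customized computer per calculation.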


CISA director urges tech industry to take responsibility for secure products

Accepting the continued use of unsafe technology products presents a greater risk to the nation than the Chinese spy balloon that was shot down off the coast of South Carolina and cannot be allowed to continue, Easterly said. “By design, we’ve normalized the fact that technology products are released to market with dozens, hundreds or thousands of defects — such poor construction would be unacceptable in any critical field,” she said during the address. The burden for cybersecurity has disproportionately been placed on consumers and small organizations who are least aware of the threats or able to protect themselves. Easterly said no one would be expected to go out and buy a car that lacked seat belts and air bags as standard features, and nobody should be expected to go out and pay additional money for secure technology products. Government can advance legislation to prevent technology companies from disclaiming liability by establishing higher standards of care, Easterly said.


Cybersecurity in wartime: how Ukraine's infosec community is coping

Defending organizations during an ongoing war put Cossack Labs' cybersecurity experts on an accelerated learning path, says Pilyankevich's colleague Anastasiia Voitova, head of customer solutions. "What I learned is that the priorities are very different from peacetime," she says. "The risks are different; the threats are very different. We have this real enemy. It's not textbook security. No. These are real issues, and we need to build real mitigation to these real issues." One could easily fall into the trap of creating systems that use the highest possible level of security, but Voitova believes this can be a mistake because a system that's too paranoid won't be usable. "This trade-off drama of how to balance security and usability, right now, can cost you even more because if you create a super secure system, but no one will use it, it will lead people to adopt insecure methods," she says. "And if insecure messages are intercepted, people might be injured." Such mistakes are more likely to occur as the war continues and users face prolonged stress and tiredness.


The CIO’s new C-suite mandate

Executives who used to stay in their own lane now find themselves needing closer alignment with one another to manage economic uncertainty, explosive growth, and digital and business transformations, and CIOs have become central figures as business strategists and changemakers. This new C-suite dynamic requires three big shifts to be successful, according to Dan Roberts, CEO of Ouellette & Associates Consulting. CIOs must change the narrative of their relationship with their counterparts, they must prepare their IT teams to deliver on the new narrative, and they must convince the C-suite to share the technology load. It’s a tall order for sure. “I would say just 10% to 15% [of C-suite relationships] are healthy and thriving and are in the trenches together with shared ownership and accountability,” Roberts says. But those CIOs who can look across the enterprise and find new ways to drive revenue or better orchestrate the customer experience and then can communicate and sell their vision to their C-suite counterparts are at the high end of the maturity curve, he adds.



Quote for the day:

"We get our power from the people we lead, not from our stars and our bars." -- J. Stanford

Daily Tech Digest - February 28, 2023

Does the Future of Work include Network as a Service (NaaS)?

NaaS enables companies to implement a network infrastructure that evolves over time, providing the flexibility to adapt as business needs change. With NaaS, companies can focus on business outcomes and service-level objectives for their network and the accessibility required for their community of workers, partners, and customers. NaaS spares organizations from having to keep up with the pace of technology change, by relying on the strength and expertise of their implementation partner. It replaces the large upfront capital expenditure that often goes into new network infrastructure design, planning, and implementation with a monthly subscription-based or flexible consumption model, alleviating the financial impact of rebuilding a new workplace environment. NaaS also enables more flexibility by not tying the organization down to specific hardware or capital investments that may eventually become obsolete.


How Technical Debt Hampers Modernization Efforts for Organizations

“When you develop an application, you take certain shortcuts for which you're going to have to pay the price back later on,” explains Olivier Gaudin, cofounder and CEO of SonarSource, which develops open-source software for continuous code quality and security. “You accept that your code is not perfect. You know that it will have a certain cost when you come back to it later. It will be a bit more difficult to read, to maintain or to change.” ... Experts note the patience and long-term strategy required to overcome technical debt. “It’s a matter of focusing on longer-term strategy over short-term financial goals,” Orlandini says. “Unfortunately for Southwest, the issues were well-known. However, the business as a whole did not have the will or motivation to invest in fixing it until it was too late. They are an extreme example but serve as a very valid case in point of what can happen if you do not understand the issues and the ultimate repercussions of not investing to avoid a meltdown, in whatever form that would take for each organization.”


Just how big is this new generative AI? Think internet-level disruption

While resources and information availability increased by an unprecedented degree, so too did misinformation, scams, and criminal activity. One of the biggest problems with ChatGPT is that it presents completely wrong information as eloquently and confidently as it presents accurate information. Unless requested, it doesn't provide sources or cite where that information came from. Because it aggregates a tremendous amount of free-form information, it's often impossible to trace how it comes by its knowledge and assertions. This makes it ripe for corruption and gaming. At some point, AI designers will need to open their systems to the broader internet. When they do, oh boy, it's going to be rough. Today, there are entire industries dedicated to manipulating Google search results. I'm often required by clients to put my articles through software applications that weigh each word and phrase against how much Google oomph it produces, and then I'm asked to change what I write to appeal more to the Google algorithms.


Is blockchain really secure? Here are four pressing cyber threats you must consider

Blockchains use consensus protocols to reach agreement among participants when adding a new block. Since there is no central authority, vulnerabilities in consensus protocols can let attackers take control of a blockchain network and dictate its consensus decisions, via attack vectors such as majority (51%) and selfish-mining attacks. ... The second threat relates to the exposure of sensitive and private data. Blockchains are transparent by design, and participants may share data that attackers can use to infer confidential or sensitive information. As a result, organizations must carefully evaluate their blockchain usage to ensure that only permitted data is shared, without exposing any private or sensitive information. ... Attackers may also compromise private keys to control participants’ accounts and associated assets, using classical information technology methods such as phishing and dictionary attacks, or by exploiting vulnerabilities in blockchain clients’ software.
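The majority (51%) threat can be quantified with the gambler's-ruin result from the Bitcoin whitepaper: an attacker controlling a fraction q < 0.5 of hashpower eventually overtakes an honest chain that is z blocks ahead with probability (q/p)^z, where p = 1 - q. A minimal sketch:

```python
def catch_up_probability(q, z):
    """Probability an attacker with hashpower fraction q ever overtakes
    an honest chain that is z blocks ahead (Nakamoto's analysis)."""
    p = 1.0 - q
    if q >= p:
        return 1.0  # with a majority, the attacker succeeds eventually
    return (q / p) ** z


# A true majority always wins, no matter how many confirmations:
assert catch_up_probability(0.51, 6) == 1.0
# A 10% attacker against six confirmations is effectively hopeless:
assert catch_up_probability(0.1, 6) < 1e-5
# With one confirmation and 30% hashpower, the odds are still sizable:
assert round(catch_up_probability(0.3, 1), 4) == 0.4286
```

This is why the threshold matters so sharply: below 50% the success probability decays exponentially in the confirmation depth, while at or above it the attacker dictates consensus outright.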


Behaviors To Avoid When Practicing Pair Programming

Despite its popularity, pair programming seems to be a methodology that is not widely adopted by the industry. When it is, what "pair" and "programming" mean can vary with the context. Sometimes pair programming is used at specific moments throughout a practitioner's day to fulfill specific tasks, as reported by Lauren Peate on the podcast Software Engineering Unlocked, hosted by Michaela Greiler. But in XP, pair programming is the default approach to developing all aspects of the software. Owing to this variation in what pair programming means, companies that adopt it may harbor misconceptions about how to practice it. Often, this is the root cause of a poor pairing experience, as is a lack of soft (social) skills. ... The driver-and-navigator style requires the pair to focus on a single problem at once. The navigator should support and question the driver's decisions to keep both in sync. When that does not happen, the collaboration session can suffer from a lack of interaction between the pair.


When it comes to network innovation, we must protect the data ‘pipes’

We must conclude that any encrypted information collected by foreign intelligence services will eventually be cracked, given sufficient compute power and time. This is one reason why supercomputers are part of the race for information dominance. At the level of supercomputers, the amount of compute is truly calculated in cost to build and cost to operate. If you do not have access to cutting-edge chips, just increase the number of compute chips: central processing units, graphics processing units, or some other compute unit such as an AI accelerator. It will cost more to make and more electricity to operate, but the compute will be available to the government or corporation that invested in the system. Without a true “zero trust” scheme, any compromise of any node on any network becomes a pivot point for further attacks. The problem with “zero trust” is that to be effective, you need a mature network model that can be secured, not a “growing, organic network” that is adapting rapidly to meet the needs of the user.


Unstructured data and the storage it needs

As we’ve seen, unstructured data is more or less defined by the fact that it is not created by use of a database. More structure may be applied to unstructured data later in its life, but then it becomes something else. ... It’s quite possible to build adequately performing file and object storage on-site using spinning disk. At the capacities needed, HDD is often the most economic option. But advances in flash manufacturing have led to high-capacity solid-state storage becoming available, and storage array makers have started to use it in file and object storage-capable hardware. This is QLC – quad-level cell – flash, which stores four bits per cell to provide higher storage density, and so a lower cost per GB, than any other commercially usable flash. The trade-off with QLC, however, is that flash lifetime can be compromised, so it is better suited to large-capacity, less frequently accessed data. But the speed of flash is particularly well-suited to unstructured use cases, such as analytics, where rapid processing, and therefore fast I/O, is needed.
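The density trade-off behind QLC comes down to simple arithmetic: storing n bits per cell requires 2^n distinguishable charge levels, so the margin between adjacent levels shrinks as density grows, which is why endurance suffers.

```python
def charge_levels(bits_per_cell):
    """Distinguishable charge levels a flash cell must hold to store
    the given number of bits."""
    return 2 ** bits_per_cell


# Each extra bit per cell doubles the levels the cell must resolve:
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)]:
    print(f"{name}: {bits} bits -> {charge_levels(bits)} levels")

assert charge_levels(1) == 2    # SLC: widest margins, highest endurance
assert charge_levels(4) == 16   # QLC: densest, narrowest margins
```

Sixteen levels squeezed into the same voltage window leaves little tolerance for cell wear, which is the physical reason QLC suits cold, read-mostly data rather than write-heavy workloads.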


The Cybersecurity Hype Cycle of ChatGPT and Synthetic Media

Historically, spearphishing messages have been partially or entirely crafted by people. However, synthetic chat makes it possible to automate this process – and highly advanced synthetic chat, like ChatGPT, makes these messages seem just as convincing as, or more convincing than, a human-written message. It also opens the door for automated, interactive malicious communications. With this in mind, threat actors can quickly and cheaply mass-produce high-cost, highly effective approaches like spearphishing. These capabilities could be used to support cybercrime, nation-state operations and more. Advances like ChatGPT may also have a meaningful impact on information operations, which have come to the forefront due to foreign influence in recent US presidential elections. Technologies such as ChatGPT can generate lengthy, realistic content supporting divisive narratives, which could help scale up information operations.


How to de-risk your digital ecosystem

In short, in any de-risking framework, one must assume that the largest source of cyberthreats comes not from someone breaking in, but rather from a door left open for an uninvited guest. Organizations must adapt their mindset, their processes, and their resources accordingly. ... In many organizations, the responsibility for closing risk gaps falls to several leaders, but not to a single point of authority. The failure is understandable as digital ecosystems touch multiple dimensions of an enterprise. But then responsibility for the total risk environment and de-risking is shared — though not necessarily met. A lack of accountability results in a lack of power to act and set de-risking as a priority within the organization. ... Without understanding the context of the business, understanding and remediating risk is difficult to do effectively. For example, an outside vendor can be a potential source of risk but also plays a critical and central role in the business. Resolving and mitigating the issue may require special handling and attention.


Closing the Cybersecurity Talent Gap

Cybersecurity is often viewed as just another technical talent field, yet candidates are expected to possess a wide range of rapidly evolving knowledge and skills. When filling staffing gaps, leaders should examine the skill sets that are missing from their current team, such as creative problem solving, stakeholder communications, buy-in development, and change enablement. “Look for candidates who will help balance out existing team skills as opposed to individuals who match a specific technical qualification,” Glair says. Before hiring can begin, it's necessary to attract suitable candidates. Initial search steps should include website updates and social media posts, Glair says. He also suggests creating an internal “cybersecurity academy” that will build talent from within the organization. “This should include the technical, process, communications, and leadership skills needed to address today’s cybersecurity challenges,” Glair notes. Burnet recommends sponsoring a “sourcing jam.” “That means getting recruiters and/or hiring managers in a room together ... to trawl through their networks and get them to personally reach out.”



Quote for the day:

"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour

Daily Tech Digest - February 27, 2023

Embrace and extend Excel for AI data prep

A Chrome extension called GPT for Sheets allows users to manipulate spreadsheet data with conversational language; Microsoft says it will integrate ChatGPT into all of its products, with Bing first. Microsoft recently invested $10 billion in OpenAI, the creators of ChatGPT. But as exciting (and sometimes disappointing) as ChatGPT applications may be, there’s a much more mundane—and promising—approach to machine learning that’s already available. ... This is the technical process of converting data from one format, standard, or structure to another, without changing the content of the data sets, in order to prepare it for consumption by a machine learning model. Data prep is the equivalent of janitorial work, albeit incredibly important work. Transformation increases the efficiency of business and analytic processes, and it enables businesses to make better data-driven decisions. But it’s difficult and time-consuming unless the user is familiar with Python or the popular query language SQL.
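The kind of transformation described above can be sketched in a few lines of Python. The CSV layout, column names, and target format below are hypothetical, chosen only to illustrate converting format without changing content:

```python
import csv
import io

# Hypothetical raw export: US-style dates and amounts with thousands separators.
RAW = """order_id,order_date,amount
1001,02/27/2023,"1,250.00"
1002,02/28/2023,"980.50"
"""

def transform(raw: str) -> list[dict]:
    """Convert the vendor CSV into a normalized structure for ML consumption:
    ISO dates and numeric amounts, with the underlying content unchanged."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw)):
        month, day, year = row["order_date"].split("/")
        rows.append({
            "order_id": int(row["order_id"]),
            "order_date": f"{year}-{month}-{day}",           # ISO 8601
            "amount": float(row["amount"].replace(",", "")), # strip separators
        })
    return rows

print(transform(RAW)[0])
# {'order_id': 1001, 'order_date': '2023-02-27', 'amount': 1250.0}
```

A few lines like these stand in for hours of manual cleanup, which is exactly the gap conversational tools aim to close for non-programmers.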


Digital forensics and incident response: The most common DFIR incidents

SOCs already make use of automation as much as possible, as they need to deal with telemetry, but automation for digital forensics is different, as it mostly involves data processing: orchestrating, performing and monitoring forensic workflows. Half of DFIR professionals indicate that investments in automation would be greatly valuable for a range of DFIR functions, as workflows still rely too much upon the manual execution of many repetitive tasks. More than 20% of the survey respondents indicated automation would be mostly valuable for the remote acquisition of target endpoints, the triage of target endpoints, and processing of digital evidence, as well as documenting, summarizing and reporting on incidents. ... A field under such rapid evolution needs informed and decisive leadership to set strategies and direct resources in an efficient way. Leaders influence the way DFIR professionals can efficiently access data sources they need, which is often difficult, as more than a third of the survey respondents indicated.


DDoS Attacks Becoming More Potent, Shorter in Duration

Microsoft says TCP reflected amplification attacks are becoming more prevalent and powerful, and more diverse types of reflectors and attack vectors are typically exploiting "improper TCP stack implementation in middleboxes, such as firewalls and deep packet inspection devices." In reflection attacks, attackers spoof the IP address of the target to send a request to a reflector, such as an open server or middlebox, which responds to the target, such as a virtual machine. The latest TCP reflected amplification attacks can reach "infinite amplification" in some cases. In April 2022, a reflected amplified SYN+ACK attack on an Azure resource in Asia reached 30 million packets per second and lasted 15 seconds. "Attack throughput was not very high, however there were 900 reflectors involved, each with retransmissions, resulting in high pps rate that can bring down the host and other network infrastructure," the report says.
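To see why packet rate, not raw throughput, is the threat here, a back-of-the-envelope sketch helps. The per-reflector rate, retransmission multiplier, and packet size below are illustrative assumptions, not figures from the report:

```python
# Hypothetical reflection arithmetic: many modest reflectors aggregate into a
# crushing packet rate at the victim.
reflectors = 900               # middleboxes answering the spoofed requests
pps_per_reflector = 25_000     # SYN+ACK rate each reflector sends (assumed)
retransmissions = 1.3          # multiplier: TCP retries inflate reflected volume

total_pps = reflectors * pps_per_reflector * retransmissions
print(f"{total_pps:,.0f} packets/sec at the victim")   # ~29 million pps

# Small packets mean modest bandwidth despite the packet flood, matching the
# report's note that "attack throughput was not very high".
packet_bytes = 60  # minimal SYN+ACK frame (assumed)
print(f"{total_pps * packet_bytes * 8 / 1e9:.1f} Gbit/s")
```

The point of the sketch: a host's packet-processing path can saturate long before its network links do, which is what makes high-pps reflection attacks effective.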


How the Economic Downturn Has Affected Security Funding, M&A

"The first thing that happens when you go into a down economic cycle is: Everybody goes on defense," Ackerman says. "They rationalize the platform, make sure it's stable and right-size for the market. Once that foundation is established, then they go on offense. I think you're going to see an acceleration of M&A activity by the big guys as they get through this consolidation and rationalization process." DeWalt expects industrial control systems and OT security to get lots of attention from the investment community in 2023 given the technology's lack of penetration and volume of attacks against industrial, non-IT networks. Network and infrastructure security had the fifth-highest level of M&A and financing activity in 2022, including a $125 million Series C funding round for critical infrastructure firm Fortress. DeWalt says the Russia-Ukraine war has led to increased attention on data management as data wipers, data poisoning and the poisoning of AI algorithms become ways to foment misinformation and disinformation.


Yes, Virginia, ChatGPT Can Be Used to Write Phishing Emails

Script kiddies in particular have been asking if ChatGPT might help them build better malware for free. Results have been extremely mixed. "Right now, I think it's a novelty," says John Kindervag, creator of zero trust and senior vice president of cybersecurity strategy at ON2IT Group. But as AI gets better, he says, "probably it will allow the attackers to craft more sophisticated attacks, and it will toast everybody who is not paying attention." So far, at least, the fervor over AI chatbots being used to build a better cybercrime mousetrap is claptrap, says security researcher Marcus Hutchins, aka MalwareTech. ... Criminals needn't bother to use AI chatbots, which are trained on publicly available code. Instead, they can go to the source. "If someone with zero coding ability wants malware, there are thousands of ready-to-go examples available on Google" and GitHub, Hutchins says. Another rising concern is that criminals will use AI chatbots to craft better phishing email lures, especially outside their native language.


The Evolution of APIs: From RESTful to Event-Driven

Synchronous microservice limitations can be overcome through asynchronous interaction, event-driven architecture, and event-enabling traditional microservices, taking advantage of the constant flow of business and technical events by acting on them promptly. As awareness of the importance of events and event-driven architecture (EDA) grows, architects and developers are exploring ways to integrate events into microservices. However, successful adoption of EDA also requires a change in mindset and approach from business stakeholders, product owners, and architects. This shift involves moving from a data-centric approach to one that uses events to drive business decisions and logic. Full event-native adoption is necessary to fully leverage the benefits of events throughout the various stages of the business. Modern APIs are predominantly based on microservices, but events and event-driven architecture (EDA) are becoming increasingly important. The future of APIs lies in combining the strengths of APIs and EDA to create Event-Driven-APIs.
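A minimal sketch of the event-driven style described above: instead of one service calling another synchronously, producers publish events and any number of consumers react independently. The topic name and handlers are hypothetical, and a real system would use a broker and asynchronous delivery:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process event bus: a sketch of the pattern, not a broker."""
    def __init__(self):
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Real systems deliver asynchronously via a broker (e.g. a queue).
        for handler in self._subscribers[topic]:
            handler(event)

shipped = []
bus = EventBus()
# Two independent consumers react to the same business event; the producer
# knows nothing about either of them.
bus.subscribe("order.placed", lambda e: shipped.append(e["id"]))
bus.subscribe("order.placed", lambda e: print(f"notify customer for order {e['id']}"))
bus.publish("order.placed", {"id": 42})
```

The decoupling shown here is what lets new consumers be added without touching the producer, which is the core argument for combining APIs with EDA.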


Scotland launches data strategy for health and social care

Carol Sinclair, chair of the Scottish government’s data board for health and social care, said in the strategy’s foreword that the aim is to “empower citizens and staff” through ensuring data supports the delivery of health and social care services. “Public trust and the ethical use of data for public good is central to this strategy,” she said. “We are working alongside colleagues across government to ensure the principles of Open Government are followed as we define and publish key, ethically sound and publicly trusted principles to support the unlocking of the social and economic value associated with the use of public sector personal data in the service of the people of Scotland.” For health and social care staff, the strategy aims to improve discoverability, accessibility, interoperability and reusability, making it easier to access data across organisations. The Scottish government is already working on the replacement of the Community Health Index (CHI) system, a platform which has been in place since the 1970s. 


How to Have More Effective Conversations With Business Stakeholders About Software Architecture

One big barrier to effective conversations about technical decisions that need business input is that development teams and business stakeholders speak different languages. To be more precise, when techies talk about technology, business people tune out. What they need to do is frame discussions in terms of how technical choices will affect business outcomes - something business stakeholders do care about. ... The metaphor helps, to a point, but an even better approach is to describe how addressing a technical issue will enable the organization to achieve a business outcome that they could not otherwise do. Or how not addressing a technical issue will impair the business outcomes that the organization can achieve. This conversation works both ways. When there is a major shift in business priorities, such as the organization deciding to exit a specific market, or needing to respond to regulatory mandates, describing the shift in terms of business outcomes helps everyone understand the impact.


6 precautions for CIOs to consider about ChatGPT

“The success of ChatGPT in a consumer capacity is clear. And since its language model is effectively trained on all the text on the web, the output is so fluent that it can be challenging, if not impossible, to decipher whether the author is human or a bot. But in a higher-stakes enterprise context, the fluency of the response creates a false sense of trust for the user: It looks right. It sounds right. But if you run the numbers, it might be wrong. When you’re trying to make your friends laugh, accuracy doesn’t matter at all. But when trying to make a decision that could impact lives and livelihoods, that unreliability can have catastrophic consequences. In a sense, ChatGPT is similar to Google Translate. While Google Translate is great for travel and other casual use cases, an enterprise won’t trust it to faithfully translate their marketing materials for new geographies or their contracts for international agreements. The stakes are just too high to gamble on a statistical model. Successful applications will require organizations to train and fine-tune a model like ChatGPT on proprietary enterprise information to help it interpret and produce the “language” used within that organization.”


'New Class of Bugs' in Apple Devices Opens the Door to Complete Takeover

The findings are another puncture wound in the perception that Apple devices are somehow inherently more secure than PCs or Android devices. "Since the first version of iOS on the original iPhone," Emmitt explained, "Apple has enforced careful restrictions on the software that can run on their mobile devices." The devices do this with code signing. Functioning somewhat like a bouncer at a club, iPhone only allows an application to run if it has been cryptographically signed by a trusted developer. If any entity — a developer, hacker, etc. — wishes to run code on the machine, but they're not "on the list," they'll be shut out. And "as macOS has continually adopted more features of iOS," Emmitt noted, "it has also come to enforce code signing more strictly." As a result of its strict policies, Apple has earned a reputation in some corners for being particularly cyber secure. Yet that extra stringency can only extend so far. "I think that there is a misconception when it comes to Apple devices," says Mike Burch, director of application security for Security Journey. 
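The bouncer analogy can be made concrete with a toy gatekeeper. Real code signing verifies an asymmetric cryptographic signature chained to a trusted developer certificate; this simplified stand-in only checks a digest allowlist, but the "not on the list, shut out" behaviour is the same:

```python
import hashlib

# Simplified stand-in for code signing. A real platform verifies a signature
# from a trusted developer certificate; here, "signing" just records a SHA-256
# digest in a trusted set.
TRUSTED_DIGESTS: set[str] = set()

def sign(code: bytes) -> None:
    """'Sign' code by recording its digest (stands in for the trusted signer)."""
    TRUSTED_DIGESTS.add(hashlib.sha256(code).hexdigest())

def run(code: bytes) -> str:
    """The bouncer: refuse anything whose digest is not on the list."""
    if hashlib.sha256(code).hexdigest() not in TRUSTED_DIGESTS:
        return "blocked: unsigned code"
    return "running"

signed = b"print('hello')"
sign(signed)
print(run(signed))        # running
print(run(b"rm -rf /"))   # blocked: unsigned code
```

Note that a digest allowlist requires knowing every binary in advance; certificate-based signing scales because trusting one developer key covers all code that key signs.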



Quote for the day:

"To command is to serve : nothing more and nothing less." -- Andre Marlaux

Daily Tech Digest - February 26, 2023

Skills-based hiring continues to rise as degree requirements fade

Hard skills, such as cybersecurity and software development, are still in peak demand, but organizations are finding soft skills can be just as important, according to Jamie Kohn, research director in Gartner Research’s human resources practice. Soft skills, which are often innate, include adaptability, leadership, communications, creativity, problem solving or critical thinking, good interpersonal skills and the ability to collaborate with others. “Also, people don’t learn all their [hard] skills at college,” Kohn said. “They haven’t for some time, but there’s definitely a surge in self-taught skills or taking online courses. You may have a history major who’s a great programmer. That’s not at all unusual anymore. Companies that don’t consider that are missing out by requiring specific degrees.” ... Much of the recent shift to skills-based hiring is due to the dearth of tech talent created by the Great Resignation and a growing number of digital transformation projects. While the US unemployment rate hovers around 3.5%, in technology fields, it’s less than half that (1.5%).


Do digital IDs work? The nation with one card for everything

No system is foolproof: a potential security flaw uncovered in 2017 required the temporary suspension of 800,000 cards. Users occasionally receive notifications that some data may have been compromised. Nor does it eliminate fraud. According to official figures, Estonians were scammed out of €5 million (£4.4 million) last year, through a familiar mixture of fraudulent calls, phishing messages and fake websites in which they were tricked into handing over passwords. But in two decades no one has decrypted the technology itself. Indeed, the system has become so embedded in Estonian life it is hard to find anyone who knocks it. Its few critics are mostly to be found among the ranks of the populist right-wing Conservative People’s Party of Estonia (Ekre), which is expected to take third place in the election. The party has expressed scepticism about the security of online voting — and its supporters are less likely to use it than supporters of mainstream parties. But although it spent two years in a coalition government from 2019, Ekre did nothing to dismantle the system.


Microsoft trains ChatGPT to control robots

The team plans to leverage the platform's ability to develop coherent and grammatically correct responses to various prompts and questions and see if ChatGPT can think beyond the text and reason about the physical world to help with robotics tasks. "We want to help people interact with robots more easily, without needing to learn complex programming languages or details about robotic systems." The key obstacle for an AI-based language model is solving problems while considering the laws of physics, the context of the operating environment, and how the robot’s physical actions can change the state of the world. Even though ChatGPT can do a lot alone, it still needs some help. Microsoft has released a series of design principles, including unique prompting structures, high-level APIs, and human feedback via text. These can be used to guide language models toward solving robotics tasks. The firm is also introducing PromptCraft, an open-source platform where anyone can "share examples of prompting strategies for different robotics categories."


Job Interview Advice for Junior Developers

Remember the audience you are talking to. Human resources personnel are very different from the dev guy sitting in the interview with them. You won’t be working with anyone in HR, but they are the gatekeepers. In smaller outfits, there will probably be a manager type and a dev guy. It doesn’t hurt to empathize with how their views differ. Make sure you are comfortable with all the parts of the agile development cycle, not just the bits your team happened to pick up in your last project. I have never used pull requests, but they are de rigueur in many places. Similarly code reviews. Retrospectives might seem pointless to you, but how else do you improve a process? Don’t assume that what your team called DevOps or Kanban is exactly the same as what everyone else does. One of the biggest snags in an interview — I think the most common — is when the interviewee shows little enthusiasm for an aspect of development the interviewer happens to think is important. Your negative hot take on a mainstream process, especially without experience weighing in behind you, may well be seen as a sign of rigidity.


Cyberattacks hit data centers to steal information from global companies

Resecurity identified several actors on the dark web, potentially originating from Asia, who during the course of the campaign managed to access customer records and exfiltrate them from one or multiple databases related to specific applications and systems used by several data center organizations. In at least one of the cases, initial access was likely gained via a vulnerable helpdesk or ticket management module that was integrated with other applications and systems, which allowed the threat actor to perform a lateral movement. The threat actor was able to extract a list of CCTV cameras with associated video stream identifiers used to monitor data center environments, as well as credential information related to data center IT staff and customers, Resecurity said. Once the credentials were collected, the actor performed active probing to collect information about representatives of the enterprise customers who manage operations at the data center, lists of purchased services, and deployed equipment.


Avoiding vendor lock-in and the data gravity trap

Today, enterprises are adopting hybrid cloud and multicloud strategies to avoid vendor lock-in and the data gravity trap. In fact, many enterprises are choosing vendors who provide cloud-agnostic services that are also open source to benefit from the most freedom and to avoid vendor lock-in. In addition, working with vendors with large partner ecosystems can help reduce the risks of lock-in. Open standards are the antidote to proprietary technology. Open standards allow users to move freely between vendors—to mix and match or integrate—with competing vendors to create their own solution. They allow you to compose your service or system freely and liberate you from the proprietary interfaces of a vendor. Open standards came about from our prior experiences of vendor lock-in. If we forget that history, we are condemned to repeat it. ... While the general movement is toward the concept of “composable IT,” where software-defined infrastructure and application components interoperate seamlessly to make businesses nimble, there are times when vendor lock-in makes sense.


Boards tapping CIOs to drive strategy

“CIOs are playing a leading role in orchestrating transformation and are stepping up in response to the changing industry dynamics,” said Logicalis CEO Bob Bailkoski, “yet they are faced with challenges to navigate, including a potential recession and talent shortages.” “In addition to this, they are experiencing increased pressure to deliver digital-based outcomes for their organisations, giving them more exposure to their boards and requiring a different way of operating.” In many companies, that “different way” seems to be that their traditional remit – implementing technology to support business strategy – has changed dramatically: 41 per cent reported having some level of responsibility for business strategy as well as the technology to deliver it, and 80 per cent said business strategy will become a bigger part of their role in the next two years. Innovation and digital transformation are two of four primary areas of focus for CIOs identified in the study – the others being strategy and the reimagination of service partnerships, which is expected to see three-quarters of CIOs spending more on IT outsource management this year than last.


How Companies Are Using Data to Optimize Manufacturing Operations

Real-time agility requires combining data from multiple sources to create new insights for use cases like machine learning. Manufacturers are familiar with streaming data today, but this is nowhere near the point of saturation. Everything from supply chain shortages to COVID-19 sick time and weather disruptions has made the simple task of getting shipments from point A to B complicated and uncertain. However, companies that consolidate data from asset-based sensors, predictive maintenance algorithms, and ordering and staffing systems (such as enterprise resource management, supply chain management and human resources software), can use it to respond much more quickly, and keep assets operating at maximum efficiency. Industrial asset management, especially in transportation, allows companies to showcase the value of real- or near-real-time data in a range of scenarios, from airlines to railroads. Large-scale assets deployed in the past few years generate a tremendous amount of streaming data. 
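A toy example of the consolidation idea, using hypothetical sensor readings and a maintenance schedule from separate systems; joining them surfaces an insight neither source shows on its own:

```python
# Hypothetical data from two of the sources the paragraph mentions:
# asset-based sensors and a predictive-maintenance system.
sensor_readings = [
    {"asset": "loco-7", "temp_c": 91, "vibration": 0.8},
    {"asset": "loco-9", "temp_c": 64, "vibration": 0.2},
]
maintenance_due = {"loco-7": "2023-03-10"}  # assets with maintenance pending

def flag_assets(readings, due, temp_limit=85):
    """Join the two sources: flag assets that are running hot AND already
    have maintenance scheduled, so they can be prioritized."""
    return [
        r["asset"] for r in readings
        if r["temp_c"] > temp_limit and r["asset"] in due
    ]

print(flag_assets(sensor_readings, maintenance_due))  # ['loco-7']
```

In production this join would run continuously over streams rather than lists, but the value proposition is the same: correlated data enables a faster, more targeted response than either feed alone.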


3 ways to retool your architecture to reduce costs

When you have one team developing and operating one application deployed on one server, it's not difficult to figure out the infrastructure, labor, and software costs of developing the application. However, when you have thousands of applications, it becomes a real mess to trace every penny flowing into your application's TCO. ... The reality is far more complicated. For platforms, especially hybrid cloud, some costs may go away immediately, while some will be redistributed to new application portfolios. The labor costs associated with the application will remain unless you actually eliminate that labor; otherwise, those costs will be redistributed across the other applications the team supports. In other words, labor costs will increase for those other applications. ... There are two views of measuring excess capacity. First, to decrease unused reservations by application owners, you must allocate total platform cost by total reservations. However, for platform owners to understand excess capacity compared to the total built capacity, you must allocate the total platform cost by the total built capacity.
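The two allocation views described above can be made concrete with assumed numbers:

```python
# Toy figures (assumed) for the two views of measuring excess capacity.
platform_cost = 120_000                        # monthly platform TCO
built_capacity = 1_000                         # vCPUs the platform was built with
reservations = {"app-a": 300, "app-b": 100}    # vCPUs reserved by app owners

total_reserved = sum(reservations.values())

# View 1: allocate total platform cost by total reservations. App owners pay
# for everything they reserved, which discourages over-reserving.
per_app = {app: platform_cost * r / total_reserved
           for app, r in reservations.items()}
print(per_app)

# View 2: allocate total platform cost by total built capacity. The
# unreserved remainder shows up as excess capacity the platform owner
# must justify or reclaim.
rate = platform_cost / built_capacity
excess_cost = rate * (built_capacity - total_reserved)
print(f"excess capacity cost: {excess_cost:.0f}")   # 72000
```

The same $120,000 produces very different signals: view 1 pressures application owners, view 2 pressures the platform owner, and a chargeback model usually needs both.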


4 steps to supercharge IT leaders

IT leaders understand well that it’s not just willpower that leads to success; it’s also “way power,” or the “how” of achieving your goals. Being able to find a pathway forward is just the starting point. Leaders today must be able to pivot in real time when a new opportunity, obstacle, or crisis surfaces. The more options you have, the more likely you are to achieve your goals. Consider the example of a complex digital initiative. Let’s say you’ve identified a way forward plus a backup plan for your top purposes. Implementation will almost certainly be nonlinear. What happens when you hit obstacles? ... Many things cloud or limit our vision or even blindside us. We are all affected by our personalities, beliefs, and assumptions. We are also affected by the data that we choose to rely on. How can you be more confident that what you’re seeing is real? Start by taking another look at your external priorities. You might be exaggerating or discounting the opportunities or threats that you see. ... How confident are you about timing for essential deliverables on your critical path? 



Quote for the day:

"The speed of the leader is the speed of the gang." -- Mary Kay Ash

Daily Tech Digest - February 25, 2023

Beyond Chess and the Art of Enterprise Architecture

In EA orthodoxy, the common future state goal is also seen as contributing to the coherence of design choices. As this is clearly not enough, architecture principles have been espoused as a way to make future design choices ‘good’. Architecture principles are like simple tactical ‘gameplay rules’ in chess. My favourite chess gameplay rule example is that one can give each piece in chess a weight (queen is 9, rook is 5, bishop and knight are 3 and pawn is 1) and that if you exchange material then you should take the exchange when you win more points than you lose. That is the chess equivalence of an architecture principle: it defines ‘the choice you should make’. Following such a rule blindly is a sure way to lose a game of chess, though. Simply focusing on getting more material will not make you win; you might not even end up with more material. But what is true is that winners do in general end up with more pieces than losers. Tactical gameplay rules in chess are thus descriptive and not prescriptive.
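The material-exchange rule translates directly into code, which also makes its limits visible: a small sketch using the standard piece weights from the paragraph.

```python
# The tactical gameplay rule as code: standard piece weights, and "take the
# exchange when you win more points than you lose". As the text argues,
# following this blindly is descriptive shorthand, not a winning strategy.
PIECE_VALUE = {"queen": 9, "rook": 5, "bishop": 3, "knight": 3, "pawn": 1}

def take_exchange(pieces_won: list[str], pieces_lost: list[str]) -> bool:
    """Return True when the material gained outweighs the material given up."""
    gain = sum(PIECE_VALUE[p] for p in pieces_won)
    loss = sum(PIECE_VALUE[p] for p in pieces_lost)
    return gain > loss

print(take_exchange(["rook"], ["knight"]))   # True: win 5, lose 3
print(take_exchange(["pawn"], ["bishop"]))   # False: win 1, lose 3
```

Like an architecture principle, the function encodes a local heuristic with no view of position, tempo, or strategy, which is exactly why it cannot substitute for judgment.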


Why Are My Employees Integrating With So Many Unsanctioned SaaS Apps?

It's in the interest of the vendors to get users hooked quickly on any cool, new functionality by removing all friction to adoption, including bypassing IT and security team reviews in the process. The hope is that even if security teams grow wise to the use of an application, it will prove too popular with business users and too critical to business operations to remove it. However, making adoption overly easy can also lead to a proliferation of unused, abandoned, and exposed apps. Once an app is rejected during a proof of concept (PoC), is abandoned due to waning interest, or the app owner leaves the organization, it can often remain active, providing an expanded and unguarded attack surface that places the organization and data at elevated risk. While it's important to educate your business users on SaaS security best practices, it's even more important to fight indiscriminate SaaS sprawl by teaching them to evaluate more critically the siren song of SaaS vendors about easy deployment and financial incentives.


Meta’s New Feature will Make You Rethink Your Online Presence

Digital identity is touted to be the most essential component in how we will operate in the social media world, or the metaverse, in years to come. “Subscription identity verification could be even more important in the metaverse where it is critical to know that the person you are interacting with is precisely who they claim to be,” Louis Rosenberg told AIM. During the 2023 World Economic Forum (WEF), the argument in favour of the proposition was overwhelmingly affirmative, and it went as follows: “To achieve this frictionless state, good system-wide interoperability of the metaverse should consider interests such as privacy, security and safety. Given the borderless nature of the metaverse, multistakeholder and multilateral collaboration will be required to reach consensus on design choices, best practices, standards and management activities.” The multiple stakeholders mentioned above include not only creators and users, but also governments, businesses and civil society.


As Microsoft embraces AI, it says sayonara to the metaverse

First, if you’ve got big metaverse plans involving Microsoft-related technologies, it’s time to re-examine what you’re doing. Microsoft will support its metaverse only around the edges, meaning to a certain extent, you’re on your own. Take that into account in deciding your next steps. It also means you should be careful about buying too much or too quickly into Microsoft’s AI promises. Yes, the company appears to be going all in on the technology, with billions of dollars in investments and high-profile pronouncements about its use in Bing. It claims AI will essentially be built into everything it does from now on. There’s no doubt AI will affect much of what the company does in the years ahead. But remember, it was just two years ago that Nadella could not “overstate how much of a breakthrough” the metaverse was. And now Microsoft has all but abandoned it. Be cautious when it comes to AI. You need to invest in it, but don’t go all in until it’s clear Microsoft will itself be all in for many years to come.


Don’t let buzzwords drive your cloud architecture

Why am I bringing it up now if it’s a known and long-time problem? I see buzzword-oriented architecture (BOA) drive most cloud architectures these days, with enterprises paying the price for expensive mistakes. As I covered here, these architectures “work,” but because they function at a much lower level of cost efficiency, enterprises typically spend two to three times more than a better-optimized solution. Just look at the misapplications of containers to see many examples of this expensive problem. ... Cloud computing didn’t fail the business; those who created the cloud solutions failed the business. Instead of finding the most cost-effective and optimized cloud architecture, they took a BOA approach that started with answers before there was a clear understanding of the questions. I’ve created many IT architecture concepts and many buzzwords in my career. The danger comes when those concepts and buzzwords are misapplied and we mistakenly blame the concept, not the person who misapplied it.


How Can Quantum Entanglement Be Used For Secure Communication?

There is little doubt that quantum technology will have a huge impact on cybersecurity and cryptography. Already, we can see government agencies across the globe preparing enterprises for “Q-Day”, a time when quantum computers will be able to use Shor’s algorithm to break all public key systems that use integer factorization-based cryptography. As a procedure, quantum communication entails encoding information in quantum states, called qubits, rather than in the classical binary tradition of “zeros and ones”, taking advantage of the special properties of these quantum states to guarantee security. Most commonly, photons are used for this. Quantum Key Distribution (QKD) offers provable, information-theoretic security that is future-proof, yet it requires trusted nodes over long distances. Presently, a single QKD link is limited to a few hundred kilometres, with a sweet spot in the 20–50 km range. Work on quantum repeaters and satellite QKD is ongoing to extend the range.
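A toy simulation of the sifting step in BB84, the canonical QKD protocol, illustrates how matching measurement bases yield a shared key. This stdlib-only sketch models the bookkeeping, not the photons; real QKD encodes the bits in photon states, and eavesdropping would show up as errors in the sifted key:

```python
import random

# Toy BB84 sifting: sender (Alice) and receiver (Bob) each pick a random
# basis ('+' or 'x') per qubit. Where their bases match, Bob reads Alice's
# bit; elsewhere his result is random and the position is discarded after
# the public basis comparison.
random.seed(7)
n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases   = [random.choice("+x") for _ in range(n)]

bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Keep only positions where the bases matched: the sifted key.
key     = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
bob_key = [b for b, ab, bb in zip(bob_bits,   alice_bases, bob_bases) if ab == bb]
assert key == bob_key  # with no eavesdropper, matching bases always agree
print(f"sifted key length: {len(key)} of {n}")
```

On average about half the positions survive sifting, which is why practical QKD links trade raw qubit rate for secure key rate.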


Data governance holds the key to integrity and security

Data governance provides the principles, policies, processes, framework, tools, and metrics required to manage data at all levels, from creation to consumption, effectively. When implemented, data governance establishes a pathway for integrity by having effective quality management and in meeting compliance standards in all applicable privacy and security measures. Through such features, data is made trustworthy, which is key in any IT transformation. ... In an age of increasing complexity in all aspects of data technology and architecture, federal agencies and military IT departments need to simplify their approach to data governance to succeed. The ideas discussed here can simplify and significantly improve any data governance initiative. By leveraging data virtualization technology, agencies can take advantage of these ideas and transform their approaches to data management and data governance to achieve more efficient operations and secure data access.


For Effective Governance, Start with Why

It’s critical to take a collaborative approach when creating program standards, so enlisting the support from colleagues in departments like legal, HR, cybersecurity, risk management, and safety will aid in buy-in and deconfliction, and you might be able to even borrow similar style, terminology, and processes from them. After all, few of us can claim our standards, procedures and governance products are truly proprietary. They are usually blends of other products from various organizations which can apply best for our industry, organization, location, and culture. Consult codified standards from organizations like ISO, ANSI, NIST, and ASIS. Benchmark drafts with peer organizations. ... Further, you will likely be sharing risk with various departments, so it’s critical to understand existing jurisdictions. Finally, to have any status, this document should be reviewed and approved by senior leadership. There will be challenges to your authority and you will need support to defend against this.


These Experts Are Racing To Protect AI From Hackers

Concerns about attacks on AI are far from new, but there is now a growing understanding of how deep-learning algorithms can be tricked by making slight -- but imperceptible -- changes, leading to a misclassification of what the algorithm is examining. "Think of the AI system as a box that takes an input and then outputs some decision or some information," says Desmond Higham, professor of numerical analysis at the University of Edinburgh's School of Mathematics. "The aim of the attack is to make a small change to the input, which causes a big change to the output." ... This misclassification isn't an error; it happened because humans deliberately tampered with the image to fool the algorithm -- a tactic known as an adversarial attack. "This isn't just a random perturbation; this imperceptible change wasn't chosen at random. It's been chosen incredibly carefully, in a way that causes the worst possible outcome," warns Higham.
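The "small input change, big output change" idea can be sketched in a few lines. The following is a minimal, hypothetical illustration in the spirit of gradient-sign (FGSM-style) attacks: for a toy linear classifier, the most damaging bounded perturbation of each feature is a small step against the gradient of the score. The weights and input values here are made up for the sketch, not from any real model.

```python
# Toy adversarial perturbation against a hypothetical linear classifier.
# score = w . x + b; class 1 if score > 0, else class 0.

def score(w, b, x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

w = [2.0, -1.0]          # "trained" weights (illustrative)
b = 0.1
x = [0.3, 0.2]           # original input

orig = score(w, b, x)    # 0.5 -> classified as class 1

# For a linear model the gradient of the score w.r.t. the input is just w,
# so stepping each feature by a small eps against sign(w) is the worst-case
# bounded perturbation -- the core idea behind FGSM.
eps = 0.3
sign = lambda v: 1.0 if v > 0 else -1.0
x_adv = [xi - eps * sign(wi) for wi, xi in zip(w, x)]

adv = score(w, b, x_adv)  # -0.4 -> classification flips to class 0
print(orig, adv)
```

Every feature moved by at most 0.3, yet the predicted class flipped. Deep networks are nonlinear, but the same principle applies locally, which is why carefully chosen, imperceptible pixel changes can cause dramatic misclassifications.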


Security with ChatGPT: What Happens When AI Meets Your API?

Right now, the brightest minds in cybersecurity are envisioning how to employ AI and machine learning (ML) for better security. CyberArk researchers, for example, recently discovered how easily ChatGPT can be tricked into creating polymorphic malware. At the heart of AI's cybersecurity concerns is the proliferation of APIs (application programming interfaces). While developers are working to simplify and accelerate architecting, configuring, and building serverless applications with the help of new AI systems like DeepMind's AlphaCode, problems arise as ML becomes responsible for generating and executing code. ... So far, with OpenAI's ChatGPT, the results look good and may even work well. That said, many results aren't perfect and could incorporate flaws that aren't evident upon initial review. Whether the coder is an AI system or a human, organizations still need a strong approach to application security that will catch vulnerabilities in code and provide suggestions on how to remediate them.
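As a concrete, hypothetical illustration of a flaw that "isn't evident upon initial review": generated code that builds SQL queries by string interpolation can look correct on every happy-path test yet be trivially injectable. The table name, data, and helper functions below are invented for the sketch; the point is the contrast between string-built and parameterized queries.

```python
# Illustrative SQL-injection flaw of the kind a quick review can miss,
# using an in-memory SQLite database (names and data are made up).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # Looks fine at a glance, but attacker-controlled input is spliced
    # directly into the SQL string.
    query = f"SELECT secret FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Parameterized query: the driver treats the input as data, not SQL.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(lookup_unsafe(payload))  # leaks every row
print(lookup_safe(payload))    # returns no rows
```

Both functions behave identically on normal inputs such as `"alice"`, which is exactly why this class of flaw survives a cursory review; only adversarial input, or a static analyzer that flags string-built SQL, exposes the difference.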



Quote for the day:

"Give away what you most wish to receive." -- Robin Sharma