Daily Tech Digest - October 20, 2022

Three Ingredients of Innovative Data Governance

The first important feature of innovative data governance is providing a data set that is statistically similar to the real data set without exposing private or confidential data. This can be accomplished using synthetic data. Synthetic data is created using real data to seed a process that can then generate data that appears real but is not. Variational autoencoders (VAEs), generative adversarial networks (GANs), and real-world simulation create data that can provide a basis for experimentation without leaking real data and exposing the organization to untenable risk. VAEs are neural networks composed of encoders and decoders. During the encoding process, the data is transformed in such a way that its feature set is compressed. During this compression, features are transformed and combined, removing the details of the original data. During the decoding process, the compression of the feature set is reversed, resulting in a data set that is like the original data but different. The purpose of this process is to identify a set of encoders and decoders that generate output data that is not directly attributable to the initial data source.
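The encode-compress-decode idea can be illustrated with a deliberately simplified sketch. This is not a real VAE (no neural networks, no learned latent space): it stands in for the pipeline by "encoding" a table of records into per-column summary statistics and "decoding" new records by sampling from them, so the output resembles the original data without reproducing any row. All names and data here are invented for illustration.

```python
import random
import statistics

def toy_synthesize(real_rows, noise=0.1, seed=42):
    """Toy stand-in for the VAE idea: compress real records into summary
    statistics (the 'encoding'), then sample new records from those
    statistics (the 'decoding'). A real VAE learns this mapping with
    neural networks; per-column mean/stdev is used here only to show the
    compress-then-regenerate shape of the process."""
    rng = random.Random(seed)
    cols = list(zip(*real_rows))                      # column-wise view
    stats = [(statistics.mean(c), statistics.stdev(c)) for c in cols]
    synthetic = []
    for _ in real_rows:
        # Decode: draw each feature from the compressed representation,
        # with a little extra noise so no original record is reproduced.
        row = tuple(rng.gauss(mu, sd * (1 + noise)) for mu, sd in stats)
        synthetic.append(row)
    return synthetic

real = [(1.0, 10.0), (2.0, 12.0), (3.0, 11.0), (4.0, 13.0)]
fake = toy_synthesize(real)
```

The synthetic rows track the real columns' means and spreads, but none of them is attributable to a specific source record, which is the property the article describes.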


Where is Blockchain Headed?

The buildup of blockchain has led inevitably to some growing pains as the scope of the space and its big data needs amp up. “We’re trying to exist with the consequences of scale,” Hoskinson said. “We have this homogenous model that everybody is their own blockchain, everybody has a full copy. This is the idea Bitcoin brought.” He raised doubts about the continued feasibility of such a model as usage escalates exponentially. “If you have millions of users year after year, you end up having blockchains that are in the petabytes scale or the yottabytes scale,” Hoskinson said. “So how do you preserve this concept of inclusive accountability, meaning you don’t have a full copy of the blockchain, yet you’re able when an event happens to verify it with the same trust model as if you did?” He sees advancements brewing in zero-knowledge cryptography, where no extra information is shared during an interaction via blockchain to maintain privacy. Other areas of research and development of blockchain technology include getting blockchain to work on smartphones without compromising integrity, he said.
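The "inclusive accountability" question Hoskinson raises, verifying an event without holding a full copy of the chain, is typically answered with Merkle proofs, the mechanism behind Bitcoin's light clients: a verifier holding only the block's Merkle root can check one transaction with log2(n) sibling hashes. A minimal sketch using only the standard library; the transaction data is made up.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Build a Merkle root, duplicating the last node on odd-sized
    levels (as Bitcoin does)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Collect the sibling hashes needed to verify one leaf."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """A light client holds only the root; verifying one transaction
    needs just the sibling hashes, not the whole ledger."""
    node = h(leaf)
    for sibling, leaf_is_left in proof:
        node = h(node + sibling) if leaf_is_left else h(sibling + node)
    return node == root

txs = [b"tx-a", b"tx-b", b"tx-c", b"tx-d"]
root = merkle_root(txs)
proof = merkle_proof(txs, 2)        # prove membership of b"tx-c"
```

For four transactions the proof is two hashes; for a petabyte-scale chain it stays logarithmic, which is what makes verification without a full copy plausible under the same trust model.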


4 Stakeholders Critical to Addressing the Cybersecurity Workforce Gap

HR professionals should de-emphasize the importance of certifications for interns and junior and midcareer professionals, and instead focus on on-the-job training and clearly defined career paths for cybersecurity professionals. The problem with certifications is long-standing; as far back as 2010, CSIS observed, "It is the consensus of the Commission that the current professional certification regime is not merely inadequate; it creates a dangerously false sense of security." ... Compliance professionals should be aware that their security counterparts are chronically overextended, and should seek to automate as many compliance operations as feasible. When responding to an internal assessment or an external audit, compliance professionals regularly rely on the security team to collect evidence of internal control operation and effectiveness. Realistically, this is an "extra" duty for security professionals, so these tasks may be rushed or put off to the last minute, given the more pressing demands on their limited time.


Our Law Firm Is Too Small to Be in Danger From Cyberattacks – Wrong!

You may be small, but you still hold the valuable data of many people and small businesses. If you are easy to break into, and criminals armed with your data can then attack your clients, that’s a very good payday for them. And just like all businesses, most cybercriminal groups are happy with a certain level of profit — and many of them wish to avoid the increased attention and scrutiny from law enforcement and governments that result from attacks on ‘big game’ firms. ... The long-term costs of a data breach last for months to years and very often they involve significant expenses that entities are not even thinking about or anticipating in their planning. What might this include? Lost or inaccessible data, business disruption, revenue losses from operational downtime, breach notification costs, legal liability costs and reputational damage. We would add the significant legal costs of hiring a data breach lawyer to oversee the breach response and a digital forensics company to investigate and remediate the breach.


Cross-Functional Collaboration is Becoming Essential for Security Professionals

Security is part of digital trust but just one piece of the equation. Attaining digital trust is a group project that cannot be done in silos, but unfortunately, too many companies are not yet in that mindset. In ISACA’s recent State of Digital Trust 2022 study, only 12% of respondents strongly agree that there is sufficient collaboration among professionals who work in digital trust fields. In many companies, the organizational culture reinforces a stay-in-your-lane mentality, and there’s not much sustained commitment to working cross-functionally. That needs to change, and security teams should do their part to drive toward the broader mission of advancing digital trust. Many security professionals are adept at specific areas: endpoint security, identity and access management, threat hunting, and so on. There is certainly a place for specialization, but it is important that the team include people who recognize the importance of sharing knowledge, learning from other teams and finding areas of overlap, so that transformation projects are approached holistically.


5G Innovations and Cybersecurity Risk

Although there are many benefits of 5G, like any emerging technology it also introduces diverse threats, and numerous facets of 5G security risk are still being researched. For example, 5G’s accessibility and multitude of connected devices increase the likelihood of successful denial-of-service (DoS) attacks, saturation attacks, eavesdropping and spoofing attacks. 5G’s extensible architecture and focus on speed could push adequate security controls down the list of priorities. The National Cybersecurity Center of Excellence collaborates with vendors such as AT&T, Cisco, Intel and Palo Alto Networks and has published 5G security guidelines for public consultation. In addition, the European Union Agency for Cybersecurity (ENISA) has published several 5G security-related documents. To establish a comprehensive approach to 5G cybersecurity, organizations can leverage frameworks from the International Organization for Standardization (ISO), the National Institute of Standards and Technology (NIST) and the Center for Internet Security (CIS). Experts advise that enterprises deploy intrusion detection and prevention systems to detect attacks.


Digital Banking Transformation Requires Back-Office Evolution

Becoming a future-ready digital banking organization requires a complete rethinking of all back-office operations. “Organizations should look for the processes that are tedious, time consuming, and have the greatest impact on both the efficiency of the organization and the customer experience,” advises Jim Perry, senior strategist at Market Insights. Automating as many ‘behind the glass’ components as possible will translate into more freedom to innovate and drive smarter decision making. Automation can lead to more fulfilling and meaningful work for existing and new employees, eliminating the fear of becoming obsolete in a digital world. At a time of low unemployment, the retention of existing employees is crucial. “The next step is using machine learning and artificial intelligence to take on more complex tasks that today require a broader range of human judgment,” states Nicols. This is also an area where collaboration with external solution providers can provide a significant advantage.


How CFOs Will Tackle Challenges Heading into 2023

Abhi Maheshwari, CFO at Aisera, says he believes it's important to get a complete 360-degree view of the buyer journey. “This can be achieved only when all key functions of the business are driving efforts towards the common end goal of building a strong pipeline and closing more business for the company,” he says. Whatever marketing, sales or revenue operations technology can help him achieve this 360-degree view in the CRM is something he fully supports and backs. “Eventually, it's all about deriving business value and ROI from investing in tech that scales as you scale your business,” he adds. ... This requires an honest assessment of the problem being addressed and of the expected and acceptable range of outcomes. “Broadly speaking, a good framework would be for the CFO to break down the components of what exactly makes up ROI, cost, return, and risk and see if the investment reduces cost, increases returns, and/or reduces risk,” he explains.
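The cost/return/risk breakdown can be reduced to a toy formula. This is one illustrative reading of that framework, not anything from the article: discount the expected return by the probability-weighted downside, then express ROI as net gain over cost. All figures and parameter names are invented.

```python
def risk_adjusted_roi(cost, expected_return, risk_probability, downside):
    """Hypothetical sketch of the cost/return/risk framing: subtract the
    probability-weighted downside from the expected return, then compute
    ROI as net gain over cost."""
    adjusted_return = expected_return - risk_probability * downside
    return (adjusted_return - cost) / cost

# A $100k investment expected to return $180k, with a 20% chance of a
# $50k shortfall:
roi = risk_adjusted_roi(100_000, 180_000, 0.20, 50_000)   # 0.70, i.e. 70%
```

An investment then clears the bar if it reduces cost, increases the adjusted return, or shrinks the risk term, which is exactly the three-way check the quote describes.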


Gartner: Hero CIOs are needed again to save businesses

Lovelock said that before the cloud, CIOs had far more flexibility in how they managed their IT budgets. “Pre-cloud, CIOs had a lot of autonomy within their IT budget,” he said. “There are now much less discretionary pieces of the budget.” For instance, if a CIO wanted to swap out an existing on-premises general ledger application, it was possible for the organisation to cut the annual maintenance fee associated with that application in preparation for the migration to the new platform. But, as Lovelock pointed out, pay-per-use subscription models reduce this flexibility. “If you are on a cloud general ledger, the minute you stop paying that bill, you stop having a general ledger,” he said. For Lovelock, rising costs are increasingly becoming a factor in the public cloud. He said: “For the first time, we have seen some pretty big increases in the prices of some cloud services. Their costs are going up. Their energy costs are going up and their server costs have gone up. Their staff costs are also going up, so they need to pass on at least some of this to clients.”


Gartner: IT matters more than ever to attract and keep the best talent

Technology is the new epicenter of corporate culture, and organizations that experiment openly with technologies that are still considered early in the Gartner Hype Cycle—the company’s depiction of how technologies move from conception to widespread adoption—will attract talent candidates. “Here’s why,” Nunno said. “First, because barriers to innovation create friction. Second, organizations that innovate during tough times, stay ahead of the pack; and enterprises who innovate publicly attract the best new pack members. Now is the time to find room in your budgets to be bold.” The way organizations are handling the hybrid-work environment is a good example of how things are changing. Nunno said many CIOs have told Gartner that their world was easier when employees were all in the office or all out of the office. “Many executives are concerned that hybrid work is damaging corporate culture, so they're trying to turn back the clock by mandating employee time in physical offices,” Nunno said.



Quote for the day:

"All organizations are perfectly designed to get the results they are now getting. If we want different results, we must change the way we do things." -- Tom Northup

Daily Tech Digest - October 19, 2022

Small businesses need more help with tech. Here are five ways to get it

While some small firms might look to hire an IT director on a temporary basis, McCabe says most will want to avoid bringing in a costly consultant. "There's a sliver of venture capital-backed tech companies with a CIO or someone with an equivalent title, but not the vast majority of small firms," she says. For SMB owners and managers who want technology expertise without high fees, McCabe suggests a different route. "I'm a big fan of industry associations and regional technology councils. They can be really great because people in these organisations are in businesses like yours," she says. ... "The challenge for smaller organisations is developing brand and trust," says Bev White, CEO of Nash Squared. "Where there are so many players, how can you stand out from the crowd when few people might know who you are?" Her firm's research suggests twice as many SMBs (23%) as larger corporates (10%) are extremely or very effective at scaling good ideas and stopping poor ideas quickly. SMBs should be on the lookout for novel ways to source technological solutions to business challenges, so go to conferences, attend meetups and take part in specialist events.


Platform Engineering: What Is It and Who Does It?

One issue when adopting platform engineering is the tendency to build another silo. A good example would be a ticketing system where users can request features or report bugs, the requests go into the platform engineering realm, and are eventually resolved. You can combat this by enabling users to self-serve their own needs through your portal: provide accurate and relevant documentation, run training sessions, and pair with users to solve their problems. Another issue is prioritizing the right things. There are a lot of users from many different parts of your organization, so a single feature request pipeline for the things users cannot self-serve, with a committee deciding on priority, is essential to servicing the needs of your organization effectively. Keep your platform team adaptable, not stuck in past ways of doing things. With the rapid pace of change in IT, it’s hard to keep up. Enablement is one way to ease the burden on your team; allowing the team consistent time to train on new technologies is another.


Making SBOMs Actionable

There is no doubt that you should request SBOMs from your software vendors, and you should consider creating SBOMs for your own developed software. It’s all about proper storage of the SBOMs, so you can be sure they’re recent, searchable, trustworthy and tamper-proof. The benefits and use cases for SBOMs are numerous; they vary across stakeholders who produce, choose and operate software, and are amplified when combined. Use cases for SBOMs include better software development, supply chain management, vulnerability management, asset management and high-assurance processes. The benefits include reducing cost and mitigating security, license and compliance risk. But the key is making the SBOM actionable. No developer, software maintainer or DevOps engineer wants to manually collect the dependencies and produce SBOM documents. Generation needs to be fully automated within the software build and deployment pipeline, and there needs to be a proactive check of where the software is currently running.
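One concrete meaning of "actionable" is cross-referencing SBOM components against an advisory feed automatically instead of filing the document away. A minimal sketch: the fragment below uses CycloneDX JSON field names (`bomFormat`, `components`, `name`, `version`), but the packages and the advisory set are made up for illustration.

```python
import json

# A minimal CycloneDX-style SBOM fragment (field names follow the
# CycloneDX JSON format; the package data is invented).
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "log4j-core", "version": "2.14.1"},
    {"type": "library", "name": "jackson-databind", "version": "2.13.4"}
  ]
}
"""

# A toy advisory feed: (name, version) pairs known to be vulnerable.
advisories = {("log4j-core", "2.14.1")}

def vulnerable_components(sbom_text, advisories):
    """Flag every SBOM component that matches a known advisory."""
    sbom = json.loads(sbom_text)
    return [c for c in sbom.get("components", [])
            if (c["name"], c["version"]) in advisories]

hits = vulnerable_components(sbom_json, advisories)
```

Run as a pipeline step on every build, a check like this turns the SBOM from a compliance artifact into a gate, which is the shift from "having SBOMs" to acting on them.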


How We Built Testability with Psychological Safety

Simply throwing people together and expecting them to figure out how to work together like this will most likely result in failure, but that's the point. You want them to fail, not to stop and go back to how things were, but to figure out why they failed. You want them to talk about what is and isn't working and what they can do differently. The problem leadership needs to help these people overcome is the assumption that high-performing people and teams don't fail. So we will do our best to avoid failure at the first signs of it. Leadership must show that failure is a natural by-product of experimentation and that high performers produce and share their failures; not avoid, deny, ignore or distort them, but learn from them. But for people to embrace failure like this, they need high levels of psychological safety, meaning team members can take interpersonal risks and be vulnerable by sharing what they don't know, what they don't understand or mistakes they have made without fear of judgement or that it will affect their prospects negatively.
 

Announcing open innovations for a new era of systems design

The root of trust is an essential part of future systems. Google has a tradition of making contributions toward transparent, best-in-class security, including our OpenTitan discrete security solutions on consumer devices. We are looking ahead to future innovations in confidential computing and varied use-cases that require chip-level attestation at the level of a package or System on a Chip (SoC). Together with other industry leaders, AMD, Microsoft, and NVIDIA, we are contributing Caliptra, a re-usable IP block for root of trust measurement, to OCP. In the coming months we will roll out initial code for the community to collectively harden together. ... To address the challenges of reliability at scale, we’ve formed a new server-component resilience workstream at OCP, along with AMD, ARM, Intel, Meta, Microsoft, and NVIDIA. Through this workstream, we’ll develop consistent metrics about silent data errors and corruptions for the broader industry to track. We’ll also contribute test execution frameworks and suites, and provide access to test environments with faulty devices.
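Root-of-trust measurement generally follows the measured-boot pattern: each boot stage's digest is folded into a running register, so the final value commits to the whole sequence in order and a verifier can recompute it. This sketch shows only that generic pattern; it is not Caliptra's actual design, and the stage names are invented.

```python
import hashlib

def extend(register: bytes, component: bytes) -> bytes:
    """PCR-style 'extend': hash the old register value together with the
    next component's digest. The result commits to every prior stage, in
    order, and cannot be rolled back."""
    return hashlib.sha256(register + hashlib.sha256(component).digest()).digest()

STAGES = [b"bootloader-v1", b"kernel-v5", b"rootfs-v3"]

register = b"\x00" * 32          # reset state at power-on
for stage in STAGES:
    register = extend(register, stage)

# A verifier that knows the expected stages recomputes the same value and
# compares it to the attested register.
expected = b"\x00" * 32
for stage in STAGES:
    expected = extend(expected, stage)
```

Swapping any single stage (say, a tampered kernel image) changes every subsequent register value, which is what makes chip-level attestation of a boot chain possible.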


Is Reinforcement Learning Still Relevant?

Autonomous machine intelligence is the common goal of both approaches, but in reinforcement learning there is always a human-designed reward signal driving the machine, while unsupervised learning proposes to learn from observation. Self-supervised learning advocates point to the inefficiency of trial-and-error methods, but uncertainty remains a major barrier for self-supervised learning. Sergey Levine of Berkeley AI Research recently proposed combining self-supervised learning with offline reinforcement learning, which explores the possibility of enabling models to understand the world without supervision while allowing reinforcement learning to build a causal understanding of the world, thus expanding the usable dataset almost without limit. Yann LeCun proposed the World Model in a June 2022 paper, which uses a “cost module” in its architecture to measure the energy cost of an action by the machine. When reinforcement learning is scaled to larger datasets, the reward maximisation also needs further scaling.
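The trial-and-error loop being contrasted with self-supervised learning fits in a few lines of tabular Q-learning. The corridor environment and hyperparameters below are invented for illustration: the agent sees a reward only at the goal state, yet by repeated trials it learns to prefer moving right from every state.

```python
import random

# Tabular Q-learning on a 5-state corridor: actions are left (0) and
# right (1); the only reward is at the rightmost state.
N_STATES, ACTIONS = 5, (0, 1)
q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.5, 0.9, 0.5
rng = random.Random(0)

for _ in range(300):                         # episodes of trial and error
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy: mostly exploit the table, sometimes explore.
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[s][act])
        s2 = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # Temporal-difference update toward reward + discounted future value.
        q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
        s = s2

greedy_path_right = all(q[s][1] >= q[s][0] for s in range(N_STATES - 1))
```

The inefficiency critics cite is visible here: early episodes wander randomly, and the reward signal (the `r` line) is entirely human-designed, which is the dependence self-supervised approaches aim to remove.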


Services You Should Expect From Your Cyber Insurance Provider

Plenty of insurers expect their clients to form their own partnerships. Even if that is not the case, it is advisable to form relationships with security and incident response firms and establish a solid perimeter from the outset. This is particularly true for smaller companies that do not have the resources to support dedicated internal staff. Doing so may even reduce insurance premiums. “It should start at the company level,” suggests Pankaj Goyal, senior vice president of data science and cyber insurance for cyber security firm Safe Security. “How do you think about cyber risks? What are the gaps? What is the financial risk? How much can you mitigate by investing in cyber budgets or cybersecurity products? And then how much risk do you need to transfer?” “The onus is on the client to make sure that they bring in the right expertise. That expertise can be around assessing the risk itself, understanding the gaps, understanding the risks, and figuring out what improvements can be made,” Goyal maintains.


How Web3 security will differ from Web2 security for businesses

Web2 fundamentally altered both the web and the sectors that sustain it. With just one click, users can collect, create and distribute enormous volumes of data; the addition of user-generated content and dynamic, interactive websites in Web2 made the web as we know it today possible. Web3’s expanding acceptance and usage are tied to those of blockchain, cryptocurrency and cybersecurity. Blockchains for cryptocurrencies are open ledgers that record and validate all transactions inside a network. Everyone has access to transactions, the pseudonymous addresses involved, and the amounts of money transmitted. Users of blockchain-based systems like Web3 are recognised by their public key and blockchain address, making them pseudonymous. Rather than independent websites hosted on a single server, Web3 stores data on the immutable ledger of the blockchain. This means that Web3’s use of the blockchain offers more resilience, protection against censorship, and other benefits.
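Pseudonymity by address comes from hashing the public key. The sketch below mimics the general pattern (Ethereum, for instance, keeps the last 20 bytes of a Keccak-256 hash of the public key), but substitutes SHA-256 because Keccak is not in the standard library, so the output is not a real address on any chain; the key material is also made up.

```python
import hashlib

def toy_address(public_key_hex: str) -> str:
    """Derive a pseudonymous address from a public key by hashing it and
    keeping the last 20 bytes. Illustrative only: real chains specify
    their own hash (e.g. Keccak-256 for Ethereum) and encoding."""
    digest = hashlib.sha256(bytes.fromhex(public_key_hex)).digest()
    return "0x" + digest[-20:].hex()

# A made-up uncompressed public key (65 bytes of hex).
addr = toy_address("04" + "ab" * 64)
```

The address reveals nothing about its owner's identity, yet every transaction it signs is publicly linkable to it, which is exactly the pseudonymous-but-transparent trade-off described above.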


Digital transformation: Why observability is critical

Observability tooling has become critical on the road to digital transformation. As DevOps and cloud, the key enablers of digital transformation, guide us towards ever more federated and distributed processes and systems, incidents become more unpredictable, and observability is our best chance for assuring availability. IT professionals have always dealt with change, but never at the speed of our current digital transformation. ... “Think of the origins of the word ‘observability.’ According to the Oxford dictionary, ‘observe: notice or perceive (something) and register it as being significant.’ The ability to ‘observe’ an application and its related elements provides a proactive opportunity to optimize the data, logs, metrics, traces, etc. in order to predict its performance in real time and contribute to root cause analysis when necessary. Observability not only allows for post-production insight into performance; it requires an intention to build observability into the design and development of applications and infrastructure.”
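The point about building observability in at design time can be made concrete with a tiny decorator that emits a structured event for every call; a real system would ship these to a collector rather than a list. The event schema and function names here are invented for illustration.

```python
import functools
import json
import time

EVENTS = []   # stand-in for a metrics/log/trace pipeline

def observed(fn):
    """Wrap a function so every call emits a structured event with the
    function name, duration and outcome. Designing this in from the start
    is what distinguishes observability from after-the-fact log grepping."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        outcome = "ok"
        try:
            return fn(*args, **kwargs)
        except Exception:
            outcome = "error"
            raise
        finally:
            EVENTS.append(json.dumps({
                "fn": fn.__name__,
                "ms": round((time.perf_counter() - start) * 1000, 3),
                "outcome": outcome,
            }))
    return wrapper

@observed
def lookup(user_id):
    return {"user": user_id}

lookup(7)
```

Because every call produces the same machine-readable shape, the stream supports both real-time performance prediction and root cause analysis, the two uses named in the quote.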


CALMS Is DevOps for Cloud Engineering

In comes cloud engineering, which, Stratton contends, applies standard software engineering practices and tools across application development, infrastructure and compliance — usually just the first — pursuant to leveraging the cloud effectively. His talk offered how to combine the basics of DevOps’ CALMS — culture, automation, lean, measurement, and sharing — with the foundation of cloud engineering — build, deploy, manage — all with a cloud native mindset. Let’s dive into his breakdown now. ... The manage side of cloud engineering comes down to creating this level of visibility across a development and deployment cycle, with a common vocabulary that connects to business objectives. At this stage, Stratton explains, security is everyone’s job, not a responsibility shifted onto new shoulders. Cloud engineering aims to put controls and processes in place to enable, enhance and automate as much as possible, especially around security and compliance, taking the blame away from the individual.



Quote for the day:

"A leader takes people where they would never go on their own." -- Hans Finzel

Daily Tech Digest - October 18, 2022

How the European Energy Crisis Could Impact IT

Enterprises with internal, inefficient data centers will be the organizations most affected by the power crisis. “Enterprises that have already sourced or have moved to the cloud will be less impacted, although they will not escape some cost challenges,” Hall says. “Energy costs are going up across the board, so you can expect those costs to be passed on to customers through existing agreements.” With energy becoming increasingly scarce and expensive, many European enterprises are turning to hyperscalers, the large cloud providers whose remote data centers run horizontally scaled fleets of servers. Hall forecasts a bigger push to cloud computing in the months ahead, especially toward major hyperscale providers -- including Amazon AWS, Microsoft Azure, Google GCP, Alibaba Cloud, IBM, and Oracle -- which tend to offer both lower costs and reduced carbon emissions. “Given the complexity of transitioning workloads, though, we are concerned clients will pull back on technology spend for lower-priority activities,” he says.


A “Green” Quantum Sensor

Zhu and colleagues take a different approach by developing a quantum sensor that generates its own power from a renewable energy source, in this case solar energy. The team’s sensor is made from an ensemble of NV centers in diamond, a well-established solid-state quantum-sensing platform that can operate over a wide range of temperatures (0–600 K), pressures (up to 40 GPa), and magnetic fields (0–12 T). Nitrogen-vacancy centers are defects that are typically created by implanting nitrogen ions into a diamond lattice. The centers confine charge carriers—such as electrons or holes—creating a localized electronic state. Users can read out the spin of this state by exciting the defect with a laser. The NV center then emits radiation, via fluorescence, whose intensity correlates with the system’s spin. Researchers typically use a green laser for this excitation, as that color of light produces the strongest fluorescence in the system (the emitted radiation is red). For use in quantum applications, NV centers are ideal because they operate at room temperature, so no cooling apparatus is required.


How to Achieve API Governance

To scale your developer (i.e., user) experience, you need to look at the bigger picture, the entire API landscape, and not just a single API. Not only will you have different consumers accessing one API, but they will mix and match different APIs to build their own experiences. These experiences will be built by the designers on the consumer side; they will combine the APIs in a way that makes sense for their users. The main question to ask here is “How do you plan for such a scenario?” This is where API governance and scaling come into play: we are no longer looking only at the best way to manage that one API, but at the best way to manage the entire API landscape, so that we can scale those experiences much better. The more we design and optimize the experience for the entire API landscape, the better the UX we can create, which translates into more value generated. One crucial thing to keep in mind is that, in the end, even though we are striving for the best UX, that’s not all that matters.


7 critical steps to defend the healthcare sector against cyber threats

Internet of Things (IoT)-enabled equipment has been hugely beneficial in enabling healthcare providers to automate and facilitate remote working. But if not properly monitored and patched, these connected devices can also provide threat actors with an easy attack path. Hospitals are likely to have hundreds of devices deployed across their facilities, so keeping them all updated and patched can be an extremely resource-heavy task. Many health providers also struggle to accommodate the required downtime to update vital equipment. Automating device discovery and update processes will make it easier to keep devices secured. Providers should also vet future purchases to ensure they have key security functionality and are accessible for maintenance and updates. Healthcare providers sit at the center of extremely large and complex supply networks. Suppliers for medical materials, consultants, hardware, and facilities maintenance are just a few examples, alongside a growing number of digital services.


Gartner reveals top strategic tech trends for CIOs to watch in 2023

Observable data reflects the digitised artifacts — such as logs, traces, API calls, downloads and file transfers — that appear when any stakeholder takes any kind of action. Applied observability feeds these observable artifacts back to users in a highly orchestrated, integrated approach to accelerate organisational decision-making. ... With AI-related privacy breaches and security incidents becoming more frequent, organisations will need to implement new capabilities to ensure model reliability, trustworthiness, security and data protection. AI trust, risk and security management (TRiSM) requires participants from different business units to work together to implement new measures. ... Digital immune systems implement data-driven insight into operations — including automated and extreme testing and software engineering — to increase the resilience and stability of systems. This emerging capability can help provide a roadmap that CIOs can use to plan out new practices and approaches that their teams can adopt to deliver higher business value, while mitigating risk and increasing customer satisfaction.


Digital transformation: 4 paths to becoming future-ready

Modernizing and transforming while recognizing that many legacy systems serve critical business needs that can’t be seriously disrupted isn’t a new tension. It was a common enough theme in 2015 for terms like bi-modal IT (coined by Gartner), 2nd/3rd platform (IDC), and fast/slow IT to be in common circulation. The basic idea was that you might want to modernize (or not) traditional IT while freeing new cloud and container technologies from having to deal with legacy entanglements. A similar approach to digital transformation – with similar pluses and minuses – is in play with this pathway. The authors write that the motivation for this approach is when senior leaders “believe transforming their current firm will take too long and will require a very different culture, skills, and systems than exist today.” They point to similar organizational hurdles that bimodal IT critics pointed to: The cool new organization gets all the attention and focus while the traditional organization slowly trudges along.


How to turbocharge collaboration in innovation ecosystems

Fail fast, learn fast, succeed faster. These precepts, which emanated from Silicon Valley tech labs, have taken the world of open innovation by storm. Executives who are eager to get to the bottom line of evaluating an innovative idea, product, or technology will likely be told, “Wait for the retrospective!” Even in the most traditional, non-digital-native corporations, innovative collaborations have borrowed heavily from methods such as agile technology and lean startup. From ideating and producing a proof of concept, innovation partners will typically work in sprints, testing and prototyping until they have come up with a minimum viable product. In the meantime, successful pilots may lead to other pilots and create spillover effects. Few best practices or approved scripts are available to follow. Instead, it is crucial to test and experiment with different ways of finding solutions to a problem. Although the search for solutions is typically driven by customer needs, optimizing the internal process can also produce results.


Guilty verdict in the Uber breach case makes personal liability real for CISOs

Going forward, CSOs and CISOs may be at odds with their senior and peer executives when a strategic decision is made that places the company at risk, even a mitigated risk. As every CSO/CISO knows, there is no such thing as 100% secure. Has this verdict opened a door for victims of a corporate data breach to go after not only the company with which they had entrusted their information, but also the executives who shoulder that responsibility? Whether this is a welcome turn of events or a shock to the system will play out in the coming months, as the legal teams of companies that hold personal data evaluate their positions in light of this verdict. Another question for corporate C-suites is how far down the executive chain of responsibility corporate liability insurance coverage should extend, and what guidance human resources and legal are giving executives about personal liability and the need to obtain personal liability insurance.


How to Tackle Cyberthreats in the Metaverse

“This is not a new issue to the metaverse, as people have dealt with conversational integrity since the dawn of social interaction on the internet,” he says. However, with more and more social and workplace interaction taking place in the metaverse, a new level of awareness is required to ensure you are actually speaking with the individual you think you are speaking with. John Bambenek, principal threat hunter at Netenrich, a security and operations analytics SaaS company, agrees, noting that almost all cybersecurity threats start with, or are furthered by, deception of an individual. “Ultimately, I think most crime on the metaverse will surround deception towards individuals,” he says. “Romance scams entail huge financial losses but are almost completely disregarded when companies consider cybersecurity risks.” He explains that for most social media companies, ensuring that individuals truly exist (i.e., are not bots) and are authentic (i.e., not scammers running 20 accounts) will remain a problem.


New Data Leaks Add to Australia's Data Security Reckoning

In what may be a world first, the Australian government also pressed Optus to reimburse people for fees incurred in replacing their passports and driver's licenses. For passports, those eligible must pay for the replacement upfront and then apply for reimbursement from Optus. Optus will apply a credit to customers' bills to cover the cost of replacement driver's licenses, depending on the state or territory; some states and territories are initially waiving the cost of replacement due to the breach. The government's pressure on Optus to reimburse those affected by the breach is striking and could send a message of increasing intolerance for data breaches and a desire to increase the immediate costs for those responsible for them. Consumers often wait years to see any compensation from class action lawsuits following a breach.



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - October 17, 2022

Get ready for the metaverse

“The metaverse presents an opportunity to more fully transcend our physical limitations,” says Anand Srivatsa, CEO of Tobii. “Technologies like eye tracking will play a critical role in helping reduce the need for compute and networking power, which are required to deliver lifelike, immersive virtual environments. Eye tracking will also help users express their attention and intent in more realistic ways when they’re in the digital universe.” ... If human-digital devices enable the experience, and infrastructure supports metaverse-scale interactivity, then it’s how real the experience feels to users that will be the primary innovation and differentiator. To start, organizations will need strong dataops capabilities, and machine learning models will likely require synthetic data generation. Zuk continues, “Businesses looking to make waves in the metaverse usually begin by establishing a robust data pipeline—with synthetic data as the primary resource driving the development life cycle.” Bart Schouw, chief evangelist at Software AG, agrees.


Cybercriminals are having it easy with phishing-as-a-service

Phishing-as-a-service is a fairly new phenomenon: the cybercriminal takes on the role of a service provider, carrying out attacks for others in exchange for a sum of money rather than just for themselves. PaaS only serves to show how hackers are becoming better organized and looking for greater monetisation from ransomware. Instead of threat actors needing the technical knowledge to build or take over the infrastructure that hosts a phishing kit (a login page emulating known login interfaces such as Facebook, Amazon, Netflix, or OWA), the barrier to entry is significantly lowered with the introduction of PaaS. ... Phishing-as-a-service can be very advanced, with capabilities spanning from detecting sandbox environments to fingerprinting user agents in order to determine whether a visitor might be a researcher's bot. That being said, web content filters can often limit the exposure of users.
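
The user-agent fingerprinting described above cuts both ways: phishing kits use it to hide from researchers' crawlers, while defenders apply the same idea in content filters. A minimal sketch of such a check (the patterns below are hypothetical examples, not a real threat-intelligence feed):

```python
# Hypothetical patterns for illustration; real filters rely on
# curated, continuously updated threat intelligence.
AUTOMATION_PATTERNS = ("curl", "python-requests", "headlesschrome", "bot")

def looks_automated(user_agent: str) -> bool:
    """Flag user agents that suggest an automated client rather than
    an ordinary browser -- the same check phishing kits run in
    reverse to evade researchers' scanners."""
    ua = user_agent.lower()
    return any(pattern in ua for pattern in AUTOMATION_PATTERNS)

print(looks_automated("python-requests/2.28.1"))                # True
print(looks_automated("Mozilla/5.0 (Windows NT 10.0; Win64)"))  # False
```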


Top 5 Data Science Trends That Will Dominate 2023

Automation plays a significant role in transforming the world and has driven lasting efficiency gains across business. In recent years, the strongest automation capabilities have come from the industrialisation of big data analytics. Analytic Process Automation (APA) encourages growth by giving businesses predictive and prescriptive insights, helping them achieve better results at lower cost. At its core, APA harnesses computing power to support better decisions, and data analytics automation has proven a genuinely disruptive force, substantially improving both the productive use of data and overall productivity. ... Data governance manages who can access data, and everywhere in the world General Data Protection Regulation (GDPR) compliance has pushed organizations and businesses to prioritize data governance in how they handle consumer data.


Code Red: the Business Impact of Code Quality

The main problem with technical debt is that code lacks visibility. Code is an abstract concept that isn't accessible to all members of your organization. Hence, it's easy to ignore technical debt even if we are aware of the general problem. Quantifying and visualizing the situation in your codebase is key, both for the engineering teams as well as for product and management. Visualisations are wonderful as they let us tap into the most powerful pattern detector that we have in the known universe: the human brain. I explored the concept in depth in Your Code as a Crime Scene, and founded CodeScene back in 2015 to make the techniques available to a general audience. ... With code health and hotspots covered, we have everything we need for taking it full circle. Without a quantifiable business impact, it's hard to make the case for investing in technical debt paydowns. Any measures we use risk being dismissed as vanity metrics while the code continues to deteriorate. We don't want that to happen.
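
The hotspot idea behind this kind of quantification is simple: large files that change often are the likeliest maintenance risks. A minimal sketch of the technique (assuming commit data parsed from `git log --name-only`, not CodeScene's actual analysis):

```python
from collections import Counter

def hotspots(commits, file_sizes, top=3):
    """Rank files by change frequency x size: large files that
    change often are the likeliest maintenance hotspots."""
    churn = Counter(path for commit in commits for path in commit)
    score = {path: churn[path] * file_sizes.get(path, 0) for path in churn}
    return sorted(score, key=score.get, reverse=True)[:top]

# Each commit is modelled as the list of files it touched.
commits = [
    ["core/engine.py", "util/log.py"],
    ["core/engine.py"],
    ["core/engine.py", "api/routes.py"],
]
sizes = {"core/engine.py": 1200, "util/log.py": 90, "api/routes.py": 400}
print(hotspots(commits, sizes))  # core/engine.py ranks first
```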
Those at the cutting edge of ML are increasingly turning to synthetic data to circumvent the numerous constraints of original or real-world data. For instance, company Synthesis AI offers a cloud-based generation platform that delivers millions of perfectly labeled and diverse images of artificial people. Synthesis AI has been able to accomplish many challenges that come with the messy reality of original data. For a start, the company makes the data cheaper. ... The challenges of real-world data don’t end there. In some fields, huge historical bias pollutes data sets. This is how we end up with global tech behemoths running into hot water because their algorithms don’t recognize black faces properly. Even now, with ML technology experts acutely aware of the bias issue, it can be challenging to collate a real-world dataset entirely free of bias. Even if a real-world dataset can account for all of the above challenges, which in reality is hard to imagine, data models need to be improved and tweaked constantly to stay unbiased and avoid degradation over time. That means a constant need for fresh data.
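
At its simplest, synthetic data generation means fitting a statistical model to the real data and sampling fresh records from it. A toy sketch of the idea, fitting independent per-column normal distributions (a production system like Synthesis AI's would use far richer generative models such as VAEs or GANs):

```python
import random
import statistics

def fit_and_sample(real_rows, n):
    """Fit an independent normal distribution to each numeric column
    of the real data, then draw n synthetic rows. Column means and
    spreads are preserved approximately; no real record is copied."""
    columns = list(zip(*real_rows))
    params = [(statistics.mean(c), statistics.stdev(c)) for c in columns]
    return [tuple(random.gauss(mu, sigma) for mu, sigma in params)
            for _ in range(n)]

# Toy "real" data: (weight_kg, height_cm) pairs.
real = [(52.0, 170.1), (61.5, 182.3), (48.2, 165.0), (70.3, 175.9)]
synthetic = fit_and_sample(real, 100)
```

This preserves only the marginals; correlations between columns are lost, which is exactly the gap the more sophisticated generative approaches exist to close.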


Improve Developer Experience to Prevent Burnout

It’s obvious that a poor developer experience creates a negative impact throughout an entire company. If developers aren’t producing good work due to unhappiness, illness or burnout, it’s likely that organizations aren’t staying at the cutting edge or offering competitive products in the market. A demoralized team can have a really negative business impact, and it can even change the way that people outside the company feel about it. An unhappy team isn’t going to lead to much creativity or productivity. As a way to combat this growing trend, companies are looking left and right for solutions. Some companies are reaching for things like extra PTO days, a full month off, better benefits, pay raises, and more fun work culture or relaxed dress codes. Those things are nice to have, and we’re certainly not speaking ill of any organization trying something new to help their employees. But at the end of the day, if the overwork and unrealistic expectations remain, the developer burnout will remain too.


Top skill-building resources and advice for CISOs

Ultimately, each hiring organisation will define what it needs in terms of cybersecurity to find the right person. In finance and insurance, for example, there will be specific rules that must be followed in different countries, and cybersecurity leaders in such organisations may even be liable. In telecommunications, the skills required are likely to be more technical, whereas in government, knowledge of governance and risk is top of the list. “For instance, a smaller organisation which is a greenfield site, or a large multinational where there is already an established security function require different sets of skills and approaches,” Joseph Head, director of technical security at Intaso, tells CSO. “There are a few commonalities between all CISO roles, however: an understanding of risk and risk appetite — in other words, an understanding of the business, and how much risk it can carry. This dictates how much work a CISO must do, and therefore available budget. Unlocking that budget can only be done by communicating effectively.”


Startup promises SD-WAN service with MPLS reliability, less complexity

Graphiant says that what makes its service different from SD-WAN offerings is how its Stateless Graphiant Core handles WAN data and control planes. The company says many large enterprises have been unwilling to give up the SLAs that come with MPLS for mission-critical traffic. Thus, SD-WAN augments the MPLS network for lower-priority traffic, and the network team must manage two different networks. The operational and administrative overhead of the combined solution, along with the complexity of overlays, tunnels, and policy management, means that many enterprises are turning back to MPLS providers that offer their own SD-WAN or that resell others'. That way, enterprises can offload the burden of managing a complicated service rather than running it themselves. “Enterprise networks have transitioned from predictable topologies to unpredictable ones,” Raza says. He argues that cloud services, IoT, work from home, and a range of other pressures have pushed the MPLS-plus-SD-WAN formula to its breaking point.


High-trust workplace meets no-trust network security

Clearly, the traditional model for IT security is no longer fit for this newly-dispersed world of work and a fresh model is needed — one where the unit of control is identity and where identity is the basis of a system of authorisation and authentication for every device, service and user on your network. Welcome to zero trust, a system which works on the assumption that identity needs to be authenticated and authorised. Given the shift to high-trust digital working environments and the surge in attacks, interest in zero trust is growing. According to Gartner, 40 percent of remote access will be conducted using a zero trust model by 2024 — up from five percent in 2020. Remote work is driving uptake, with zero trust seen as a fast way to achieve security and compliance, according to a Microsoft report on its adoption. Zero trust is implemented through consistent tools, workflows and processes delivered as a set of shared, centrally-managed and automated services. What does this look like? It means codifying policies and procedures for authorisation and access across the technology stacks, domains and service providers that comprise the IT infrastructure.
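
"Codifying policies" for zero trust can be as simple as expressing access rules as data that is evaluated on every single request, with deny as the default. A minimal sketch (the resources, roles, and device attributes here are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Request:
    role: str
    device_trusted: bool
    resource: str

# Policy as data, checked on every request; nothing is implicitly trusted.
POLICY = {
    "payroll-db": {"roles": {"finance"}, "trusted_device": True},
    "wiki": {"roles": {"finance", "engineering"}, "trusted_device": False},
}

def authorize(req: Request) -> bool:
    """Default-deny: a request is granted only if an explicit rule
    matches both its identity (role) and its device posture."""
    rule = POLICY.get(req.resource)
    if rule is None:
        return False  # unknown resources are never granted
    if req.role not in rule["roles"]:
        return False
    if rule["trusted_device"] and not req.device_trusted:
        return False
    return True

print(authorize(Request("finance", True, "payroll-db")))   # True
print(authorize(Request("finance", False, "payroll-db")))  # False
```

Real deployments push the same pattern into centrally managed policy engines so that every service applies identical rules, which is what "consistent tools, workflows and processes" amounts to in practice.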


IT leadership: How to defeat burnout

What sets Liberty Mutual apart from other organizations is our purpose. We exist to help people embrace today and confidently pursue tomorrow. This is our North Star and helps define and guide everything we do. We also understand that combating burnout requires connecting work to outcome. To ensure that this happens, we spend time defining targeted outcomes – the realization of the expected benefit – versus output – for example, simply turning on a new feature in a system. Success is measured by producing results and realizing benefits. Outcome might be the ability to deploy capabilities faster than before, for example. The key word is ‘capabilities,’ which help us deliver better products and services to customers. An outcome is much bigger than an output such as simply turning on a technology. These nuances matter in the context of burnout. If you’re working on a project and you don’t know why you’re doing it or what the intended results are, you’re not connected to why it matters.



Quote for the day:

"Brilliant strategy is the best route to desirable ends with available means." -- Max McKeown

Daily Tech Digest - October 16, 2022

Top AI investors reveal State of AI in 2022

What’s new in 2022, and what made Benaich and Hogarth dedicate an entire section to AI safety, is the other end of AI safety. This is what Hogarth referred to as AI alignment: ensuring that an extremely powerful and superintelligent AI system doesn’t ever go rogue and start treating humanity badly in aggregate. The 2022 State of AI report is very much biased toward that end of safety because, according to Hogarth, the topic is not receiving enough attention. “We’re seeing exponential gain in capabilities, exponential use of compute, exponential data being fed into these [AI] models,” Hogarth said. “And yet we have no idea how to solve the alignment problem yet.” It’s still an unsolved technical problem where there are no clear solutions, he added: “That’s what alarms me — and I think that the thing that is probably the most alarming about all of it is that the feedback loops now are so violent. You have huge wealth creation happening in AI. So there’s more and more money flowing into making these models more powerful.”


Interview with Vinayak Godse, CEO of Data Security Council of India

We see ourselves as an important catalyst in the national cybersecurity initiatives, especially in terms of the technology geopolitics that is now heating up; take the US–China tech war, for example. Cybersecurity in this day and age has become pivotal because the coming decade is going to be driven by technology, and cybersecurity is one fundamental area which will be driving all of these transitions. As per NASSCOM's TECHADE 2020: Digital Tech Opportunities report, AI/ML, cloud, and cybersecurity will be crucial and critical for this decade. But how will that happen? There are many different parts to this. Firstly, security should enable the growth of the industry. Secondly, we aim to prepare the industry, society, individuals and, most importantly, the economy against possible issues and challenges regarding privacy. With technologies such as AI/ML, data analytics, and VR/AR gaining prominence, we will work towards solving the security problem in relation to these emerging technologies.


How Can Business DataOps Drive Growth?

DataOps is a fast-expanding area of expertise, and data analytics and operations specialists eager to learn how to develop and oversee DataOps procedures have a successful future ahead: the opportunity to guide the next generation of data teams and set the standard for data practices for at least the next decade. A creative, quickly expanding organization that eliminates laborious and repetitive business activities will also have happier and more motivated employees. The time it takes to develop a concept into something valuable is crucial to businesses; through agile development methodologies, DataOps shortens both lead times and the interval between iterations. Producing and delivering solutions in small pieces also enables them to be applied gradually. By contrast, shadow IT may form in businesses that use a sluggish development strategy for data solutions, as other departments pursue their own ideas without the IT department's approval or involvement.


Message Routing and Topics: A Thought Shift

There is one thing that caught my attention and fascination simultaneously. The central theme of real-time enterprise integration is message routing. Almost all messaging systems employ an intelligent, interest-based subscription mechanism that ensures the routing of messages to interested parties. The intelligent part of routing is built with rules around the event type (name) and the content. Routing decisions based on an event type name are straightforward because the name is easily accessible without unpacking the payload. However, routing based on filtering rules over the content requires unpacking the payload and evaluating the rules to determine a match. This directly impacts message throughput and performance. Content-based message routing (CBR) is performance-penalizing and not the best choice for real-time messaging scenarios where latency is costly. In the early days of integration, content-based routing was considered essential.
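
The trade-off described here can be sketched in a few lines: type-name routing inspects only the envelope, while content-based routing must parse and evaluate every payload. A toy broker for illustration (not any particular messaging product):

```python
import json

subscriptions = []  # (match_fn, handler) pairs

def subscribe_by_type(event_type, handler):
    """Cheap: matches on the envelope's type name, no payload parsing."""
    subscriptions.append((lambda msg: msg["type"] == event_type, handler))

def subscribe_by_content(predicate, handler):
    """Expensive: must unpack and evaluate the payload per message."""
    subscriptions.append(
        (lambda msg: predicate(json.loads(msg["payload"])), handler)
    )

def publish(msg):
    for matches, handler in subscriptions:
        if matches(msg):
            handler(msg)

received = []
subscribe_by_type("OrderPlaced", lambda m: received.append("type-route"))
subscribe_by_content(lambda p: p["amount"] > 1000,
                     lambda m: received.append("content-route"))

publish({"type": "OrderPlaced", "payload": json.dumps({"amount": 2500})})
print(received)  # ['type-route', 'content-route']
```

The per-message `json.loads` plus predicate evaluation in the content-based path is exactly the cost the article flags: it scales with payload size and rule complexity, while the type check stays constant-time.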


Where Quantum Entanglement Is Actually Being Used

Quantum entanglement is a critical element of quantum information processing, and photonic entanglement of the type pioneered by the Nobel laureates is crucial for transmitting quantum information. Quantum entanglement can be used to build large-scale quantum communications networks. On a path toward long-distance quantum networks, Jian-Wei Pan, one of Zeilinger’s former students, and colleagues demonstrated entanglement distribution to two locations separated by 1,203 km on Earth via satellite transmission. However, direct transmission rates of quantum information are limited due to loss, meaning too many photons get absorbed by matter in transit so not enough reach the destination. Entanglement is critical for solving this roadblock, through the nascent technology of quantum repeaters. An important milestone for early quantum repeaters, called entanglement swapping, was demonstrated by Zeilinger and colleagues in 1998. Entanglement swapping links one each of two pairs of entangled photons, thereby entangling the two initially independent photons, which can be far apart from each other.
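
The entanglement-swapping milestone can be stated compactly: rewriting two independent entangled pairs in the Bell basis shows that a Bell-state measurement on one photon from each pair projects the two remaining, never-interacted photons into an entangled state.

```latex
% Pairs (1,2) and (3,4) each start in the Bell state |\Phi^+\rangle.
% Expanding photons 2 and 3 in the Bell basis gives:
|\Phi^+\rangle_{12}\,|\Phi^+\rangle_{34}
  = \tfrac{1}{2}\bigl(
      |\Phi^+\rangle_{14}|\Phi^+\rangle_{23}
    + |\Phi^-\rangle_{14}|\Phi^-\rangle_{23}
    + |\Psi^+\rangle_{14}|\Psi^+\rangle_{23}
    + |\Psi^-\rangle_{14}|\Psi^-\rangle_{23}\bigr)
% A Bell measurement on photons (2,3) therefore leaves photons
% (1,4) -- which never interacted -- in the matching Bell state.
```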


Virtual Panel: The New US-EU Data Privacy Framework

Given the Court of Justice of the European Union's (CJEU) stance regarding US surveillance law, it is not clear how GDPR can be made compatible with transatlantic data transmission. Thus it is likely that any new privacy frameworks will be challenged in courts. Yet, the newly proposed Trans-Atlantic Data Privacy Framework brings an attempt to solve the underlying issues and may include an independent Data Protection Review Court as a mechanism to solve disputes that could provide an effective solution. If the new framework did not pass European Courts' scrutiny, it is possible that a completely different approach to data privacy will be required in future to ensure data transmission and collaboration while granting privacy rights, such as treating user data as a currency or similarly to copyright. In this virtual panel, three knowledgeable experts in the field of data privacy discuss where the existing agreements fall short, whether a new privacy agreement could improve transatlantic data sharing while granting privacy rights for EU citizens and stronger oversight of US intelligence, and more.


Distributed Ledger Technology (DLT): The Solution to the Age of Digital Distrust?

DLT has more going for it than blockchain technology alone. Alternatives are available across the DLT spectrum that already solve the so-called blockchain trilemma: guaranteeing high security, scalability, and decentralization. While a blockchain works with a linked list of blocks, IOTA, for example, uses the ‘Tangle,’ an acyclic fabric of mutually linked transactions that maintains the global shared state of the ledger while boosting speed. In addition, IOTA avoids classic transaction fees, which would be prohibitive for applications in the IoT area, through an alternative consensus algorithm based on the reputation of the nodes. ... What makes DLT so exciting and relevant is that it was conceived and developed for this decentralized digital world where trust is at a premium. It is not simply the safe storage of information that creates trust; it is also how trust is created and continuously ensured between all the different partners of a business process. DLT determines the conditions under which nodes of the decentralized infrastructure capture and record new transactions, and when they do not.
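
The structural difference from a chain can be shown in a few lines: in a Tangle-like DAG, each new transaction approves earlier transactions directly instead of extending a single linked list. A toy model for illustration only (IOTA's real protocol selects tips via a weighted random walk, which this sketch does not implement):

```python
import random

random.seed(1)  # deterministic for the example
# Each transaction records which earlier transactions it approves,
# forming a directed acyclic graph rather than a single chain.
tangle = {"genesis": []}

def attach(tx_id):
    """A new transaction approves up to two earlier transactions,
    chosen uniformly at random to keep the sketch short."""
    parents = random.sample(list(tangle), min(2, len(tangle)))
    tangle[tx_id] = parents

for i in range(5):
    attach(f"tx{i}")

print(tangle)
```

Because approvals fan out instead of forming one queue, transactions can attach concurrently, which is where the speed claim comes from.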


Digital innovation and the future of financial services

The growth of Web3 technologies also offers new opportunities, as the internet evolves from read-only pages to more interactive and immersive experiences. “Web1 is traditional data. Web2 added social [interaction] as data. And in Web3, everything is data,” said Dr Booth. In Web3, that data has real value to its users, and underlying blockchain technology allows it to be transferred and monetised easily. The most recognisable example may be buying digital ‘land’ in the metaverse. Transactions that take weeks in the real world can be completed in seconds and recorded on a secure, immutable blockchain. “How we connect the metaverse to the real-verse is going to be where banks will play a role,” said Mr Williamson. The digitalisation of financial services will create virtual mountains of new and complex data, generated from disparate sources and stored in different locations. Yet when everything's digital, that kind of volume and complexity becomes manageable. From Dr Booth’s point of view, AI is the connective digital tissue holding everything together.


How to Prove the ROI of Your Enterprise Architecture Efforts

The ROI of EA can be felt at the highest levels of the organization, but it can also have an impact at the individual department level. Cultivating this impact involves both research into specific challenges different departments face and educating department heads on what EA can do. For example, legal teams may not know that enterprise architecture has a critical role to play when it comes to navigating compliance standards and regulations. Similarly, marketing and sales teams may not realize how EA can support data management to drive analytics and personalization efforts. Every corporate function today depends on technology to be effective. EA is all about better, more strategic uses of technology. It thus falls to EAs to evangelize their capabilities across the enterprise, seeking out often unexpected opportunities to improve operations and outcomes, department by department. Getting users from across your organization on your EAM tool is no small undertaking. 


How to turbocharge collaboration in innovation ecosystems

Handled in the right way, collaborations will align and connect potential co-innovators to a shared purpose—internally as well as externally. Importantly, value creation is no longer just a numbers game. Echoing the motto of former PepsiCo CEO Indra Nooyi, “performance with purpose,” impactful innovation in an ecosystem is likewise driven by social values as much as by numbers. These are reflected in the challenges and specific problems the collaborators seek to address; in reframing the purpose, if necessary, so that it fits everyone’s objectives; and in the way the collaboration defines success. Much as in a team sport, egos and titles are swept aside in pursuit of a greater goal. ... Disruption, the digital revolution, COVID-19—in aggregate, these factors have blurred and, in some cases, dissolved the boundaries between organizations, segments, and entire industries. As a result, innovation ecosystems are emerging as the dominant paradigm for corporate innovation. Yet, because of the fundamental disparities embedded in their structures, ecosystems are difficult to form and initiate, let alone sustain.



Quote for the day:

"Be a Strong Leader, Even If You Follow a Weak Leader" -- Miles Anthony Smith