Daily Tech Digest - October 18, 2022

How the European Energy Crisis Could Impact IT

Enterprises with internal, inefficient data centers will be the organizations most affected by the power crisis. “Enterprises that have already sourced or have moved to the cloud will be less impacted, although they will not escape some cost challenges,” Hall says. “Energy costs are going up across the board, so you can expect those costs to be passed on to customers through existing agreements.” With energy becoming increasingly scarce and expensive, many European enterprises are turning to hyperscalers: large cloud providers whose remote data centers process data on fleets of horizontally scaled servers. Hall forecasts a bigger push to cloud computing in the months ahead, especially toward major hyperscale providers -- including Amazon AWS, Microsoft Azure, Google GCP, Alibaba Cloud, IBM, and Oracle -- which tend to offer both lower costs and reduced carbon emissions. “Given the complexity of transitioning workloads, though, we are concerned clients will pull back on technology spend for lower-priority activities,” he says.


A “Green” Quantum Sensor

Zhu and colleagues take a different approach by developing a quantum sensor that generates its own power from a renewable energy source, in this case solar energy. The team’s sensor is made from an ensemble of NV centers in diamond, a well-established solid-state quantum-sensing platform that can operate over a wide range of temperatures (0–600 K), pressures (up to 40 GPa), and magnetic fields (0–12 T). Nitrogen-vacancy centers are defects that are typically created by implanting nitrogen ions into a diamond lattice. The centers confine charge carriers—such as electrons or holes—creating a localized electronic state. Users can read out the spin of this state by exciting the defect with a laser. The NV center then emits radiation, via fluorescence, whose intensity correlates with the system’s spin. Researchers typically use a green laser for this excitation, as that color of light produces the strongest fluorescence in the system (the emitted radiation is red). For use in quantum applications, NV centers are ideal because they operate at room temperature, so no cooling apparatus is required.


How to Achieve API Governance

To scale your developer (i.e., user) experience, you need to look at the bigger picture, the entire API landscape, and not just a single API. Not only will you have different consumers accessing one API, but they will mix and match different APIs to build their own experiences. These experiences will be built by designers on the consumer side, who will combine the APIs in a way that makes sense for their users. The main question to ask here is: how do you plan for such a scenario? This is where API governance and scaling come into play, because we are no longer looking only at the best way to manage that one API, but at the best way to manage the entire API landscape, so that those experiences can scale much better. The more we design and optimize the experience for the entire API landscape, the better the UX we can create, which translates to more value generated. One crucial thing to keep in mind is that, in the end, even though we are striving for the best UX, that is not all that matters.
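As a toy illustration of landscape-level governance, one approach is to lint every API's paths against a shared rule set instead of reviewing each API in isolation. The rule and the API names below are invented for the sketch; a real governance program would check far more (versioning, pagination, error shapes):

```python
import re

# Toy governance rule, invented for illustration: every path segment
# must be lowercase letters, optionally hyphenated.
PATH_RULE = re.compile(r"^(/[a-z]+(-[a-z]+)*)+$")

def lint_landscape(api_specs):
    """Check every path in every API against the shared rule set."""
    return [(api, path)
            for api, paths in api_specs.items()
            for path in paths
            if not PATH_RULE.match(path)]
```

Running the same rules across the whole landscape is what lets consumer-side designers mix and match APIs without relearning conventions per API.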


7 critical steps to defend the healthcare sector against cyber threats

Internet of Things (IoT)-enabled equipment has been hugely beneficial in enabling healthcare providers to automate and facilitate remote working. But if not properly monitored and patched, these connected devices can also provide threat actors with an easy attack path. Hospitals are likely to have hundreds of devices deployed across their facilities, so keeping them all updated and patched can be an extremely resource-heavy task. Many health providers also struggle to accommodate the required downtime to update vital equipment. Automating device discovery and update processes will make it easier to keep devices secured. Providers should also vet future purchases to ensure they have key security functionality and are accessible for maintenance and updates. Healthcare providers sit at the center of extremely large and complex supply networks. Suppliers for medical materials, consultants, hardware, and facilities maintenance are just a few examples, alongside a growing number of digital services.
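A minimal sketch of what automating that patch audit could look like, assuming a discovery service has already produced a device inventory (the device names and dates below are invented):

```python
from datetime import date

def overdue_devices(inventory, as_of, max_age_days=90):
    """Flag devices whose last patch is older than the allowed window."""
    return [d["name"] for d in inventory
            if (as_of - d["last_patched"]).days > max_age_days]

# Hypothetical inventory; in practice this would come from an
# automated device-discovery service, not a hand-maintained list.
inventory = [
    {"name": "infusion-pump-12", "last_patched": date(2022, 3, 1)},
    {"name": "mri-console-2", "last_patched": date(2022, 9, 20)},
]
```

Even a simple report like this turns an "extremely resource-heavy task" into a scheduled job that tells staff exactly which devices need a maintenance window.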


Gartner reveals top strategic tech trends for CIOs to watch in 2023

Observable data reflects the digitised artifacts — such as logs, traces, API calls, downloads and file transfers — that appear when any stakeholder takes any kind of action. Applied observability feeds these observable artifacts back to users in a highly orchestrated, integrated approach to accelerate organisational decision-making. ... With AI-related privacy breaches and security incidents becoming more frequent, organisations will need to implement new capabilities to ensure model reliability, trustworthiness, security and data protection. AI trust, risk and security management (TRiSM) requires participants from different business units to work together to implement new measures. ... Digital immune systems implement data-driven insight into operations — including automated and extreme testing and software engineering — to increase the resilience and stability of systems. This emerging capability can help provide a roadmap that CIOs can use to plan out new practices and approaches that their teams can adopt to deliver higher business value, while mitigating risk and increasing customer satisfaction.


Digital transformation: 4 paths to becoming future-ready

Modernizing and transforming while recognizing that many legacy systems serve critical business needs that can’t be seriously disrupted isn’t a new tension. It was a common enough theme in 2015 for terms like bimodal IT (coined by Gartner), 2nd/3rd platform (IDC), and fast/slow IT to be in common circulation. The basic idea was that you might want to modernize (or not) traditional IT while freeing new cloud and container technologies from having to deal with legacy entanglements. A similar approach to digital transformation – with similar pluses and minuses – is in play with this pathway. The authors write that the motivation for this approach is when senior leaders “believe transforming their current firm will take too long and will require a very different culture, skills, and systems than exist today.” They point to similar organizational hurdles that bimodal IT critics pointed to: The cool new organization gets all the attention and focus while the traditional organization slowly trudges along.


How to turbocharge collaboration in innovation ecosystems

Fail fast, learn fast, succeed faster. These precepts, which emanated from Silicon Valley tech labs, have taken the world of open innovation by storm. Executives who are eager to get to the bottom line of evaluating an innovative idea, product, or technology will likely be told, “Wait for the retrospective!” Even in the most traditional, non-digital-native corporations, innovative collaborations have borrowed heavily from methods such as agile development and lean startup. From ideation through proof of concept, innovation partners will typically work in sprints, testing and prototyping until they have come up with a minimum viable product. In the meantime, successful pilots may lead to other pilots and create spillover effects. Few best practices or approved scripts are available to follow. Instead, it is crucial to test and experiment with different ways of finding solutions to a problem. Although the search for solutions is typically driven by customer needs, optimizing the internal process can also produce results.


Guilty verdict in the Uber breach case makes personal liability real for CISOs

Going forward, CSOs and CISOs may be at odds with their senior and peer groups of executives when a strategic decision is made that places the company at risk, even a mitigated risk. As every CSO/CISO knows, there is no such thing as 100% secure. Has this verdict opened a door for victims of a corporate data breach to not only go after the company with which they had entrusted their information, but also the executives who shoulder that responsibility? Whether this is a welcome turn of events or a shock to the system will play out in the coming months as legal teams of companies that hold personal data evaluate their positions in the light of this verdict. Another question that must be discussed in corporate C-suites is how far down the executive chain of responsibility corporate liability insurance coverage should extend, and what guidance human resources and legal are giving executives about personal liability and the need to obtain personal liability insurance.


How to Tackle Cyberthreats in the Metaverse

“This is not a new issue to the metaverse, as people have dealt with conversational integrity since the dawn of social interaction on the internet,” he says. However, with more and more social and workplace interaction taking place in metaverse environments, a new level of awareness is required to ensure you are actually speaking with the individual you think you are speaking with. John Bambenek, principal threat hunter at Netenrich, a security and operations analytics SaaS company, agrees, noting that almost all cybersecurity threats start with, or are furthered by, deception of an individual. “Ultimately, I think most crime on the metaverse will surround deception towards individuals,” he says. “Romance scams entail huge financial losses but are almost completely disregarded when companies consider cybersecurity risks.” He explains that for most social media companies, ensuring individuals truly exist (i.e., are not bots) and are authentic (i.e., not scammers running 20 accounts) will remain a problem.


New Data Leaks Add to Australia's Data Security Reckoning

In what may be a world-first, the Australian government also pressed Optus to reimburse people for fees incurred related to replacing their passports and driver's licenses. For passports, those eligible must pay for the replacement upfront and then apply for reimbursement from Optus. Optus will apply a credit to customers' bills to cover the cost of replacement driver's licenses, depending on the state or territory. Some states and territories are initially waiving the cost of replacement due to the breach. The government's pressure on Optus to reimburse those affected by the breach is striking and could send a message of increasing intolerance for data breaches and a desire to increase the immediate costs for those responsible for breaches. Consumers often wait years to see any compensation from class action lawsuits as a result of a breach.



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - October 17, 2022

Get ready for the metaverse

“The metaverse presents an opportunity to more fully transcend our physical limitations,” says Anand Srivatsa, CEO of Tobii. “Technologies like eye tracking will play a critical role in helping reduce the need for compute and networking power, which are required to deliver lifelike, immersive virtual environments. Eye tracking will also help users express their attention and intent in more realistic ways when they’re in the digital universe.” ... If human-digital devices enable the experience, and infrastructure supports metaverse-scale interactivity, then it’s how real the experience feels to users that will be the primary innovation and differentiator. To start, organizations will need strong dataops capabilities, and machine learning models will likely require synthetic data generation. Zuk continues, “Businesses looking to make waves in the metaverse usually begin by establishing a robust data pipeline—with synthetic data as the primary resource driving the development life cycle.” Bart Schouw, chief evangelist at Software AG, agrees.


Cybercriminals are having it easy with phishing-as-a-service

Phishing-as-a-service is a fairly new phenomenon in which the cybercriminal takes on the role of a service provider, carrying out attacks for others rather than only for themselves, in exchange for a sum of money. PaaS only serves to show how hackers are becoming better organized and looking for greater monetisation. Instead of threat actors needing the technical knowledge to build or take over the infrastructure that hosts a phishing kit (a login page emulating known login interfaces such as Facebook/Amazon/Netflix/OWA), the barrier to entry is significantly lowered with the introduction of PaaS. ... Phishing-as-a-service can be very advanced, with capabilities spanning from detecting sandbox environments to fingerprinting user agents in order to determine whether a visitor might be a researcher's bot. That being said, web content filters can often limit the exposure of users.
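As an illustration of the user-agent fingerprinting described above, a kit's cloaking check might look roughly like the sketch below: suspected research crawlers are served a harmless page while everyone else gets the phishing page. The signature list is invented for illustration; the point is what defender scanners are up against:

```python
# Hypothetical signature list; real kits use far larger, curated lists
# plus sandbox and IP-range checks.
SCANNER_SIGNATURES = ("curl", "python-requests", "googlebot", "headlesschrome")

def looks_like_scanner(user_agent: str) -> bool:
    """Crude cloaking check: does the user agent match a known scanner?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in SCANNER_SIGNATURES)
```

This is why automated URL scanners often see only a benign page, and why content filters that combine multiple signals remain a useful layer.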


Top 5 Data Science Trends That Will Dominate 2023

Automation plays a significant role in transforming the world. It has stimulated various transformations in business, resulting in sustained gains in proficiency. In the past few years, some of the best automation capabilities have come from the industrialisation of big data analytics. Analytic Process Automation (APA) encourages growth by providing businesses with prescriptive and predictive abilities along with other insights. Through this, businesses have been able to achieve excellent results efficiently and at lower cost. Analytic Process Automation mainly harnesses computing power to support sound decisions. Data analytics automation can be considered a genuinely disruptive force, and big data analysis helps substantially with stimulating valuable data usage and productivity. ... Data governance controls how data is accessed all over the world. General Data Protection Regulation (GDPR) compliance has pushed many organizations and businesses to prioritize data governance in how they handle consumer data.


Code Red: the Business Impact of Code Quality

The main problem with technical debt is that code lacks visibility. Code is an abstract concept that isn’t accessible to all members of your organization. Hence, it’s easy to ignore technical debt even if we are aware of the general problem. Quantifying and visualizing the situation in your codebase is key, both for the engineering teams as well as for product and management. Visualisations are wonderful as they let us tap into the most powerful pattern detector that we have in the known universe: the human brain. I explored the concept in depth in Your Code as a Crime Scene, and founded CodeScene back in 2015 to make the techniques available to a general audience. ... With code health and hotspots covered, we have everything we need for taking it full circle. Without a quantifiable business impact, it’s hard to make the case for investing in technical debt paydowns. Any measures we use risk being dismissed as vanity metrics while the code continues to deteriorate. We don’t want that to happen.
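As a rough sketch of how a hotspot analysis can quantify where debt hurts most, one can rank files by change frequency times size. The metric is an illustrative assumption (a common proxy, not CodeScene's actual model), and the file names are invented:

```python
def rank_hotspots(change_counts, lines_of_code):
    """Rank files by change frequency times size: frequently changed,
    large files are where technical debt is most expensive.
    (Proxy metric chosen for illustration only.)"""
    scores = {f: change_counts[f] * lines_of_code.get(f, 0)
              for f in change_counts}
    return sorted(scores, key=scores.get, reverse=True)
```

The value of even a crude ranking like this is that it gives product and management a shared, visible ordering to argue about, instead of an abstract appeal to "debt".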


Those at the cutting edge of ML are increasingly turning to synthetic data to circumvent the numerous constraints of original or real-world data. For instance, the company Synthesis AI offers a cloud-based generation platform that delivers millions of perfectly labeled and diverse images of artificial people. Synthesis AI has been able to overcome many of the challenges that come with the messy reality of original data. For a start, the company makes the data cheaper. ... The challenges of real-world data don’t end there. In some fields, huge historical bias pollutes data sets. This is how we end up with global tech behemoths running into hot water because their algorithms don’t recognize Black faces properly. Even now, with ML technology experts acutely aware of the bias issue, it can be challenging to collate a real-world dataset entirely free of bias. Even if a real-world dataset can account for all of the above challenges, which in reality is hard to imagine, data models need to be improved and tweaked constantly to stay unbiased and avoid degradation over time. That means a constant need for fresh data.


Improve Developer Experience to Prevent Burnout

It’s obvious that a poor developer experience creates a negative impact throughout an entire company. If developers aren’t producing good work due to unhappiness, illness or burnout, it’s likely that organizations aren’t staying at the cutting edge or offering competitive products in the market. A demoralized team can have a really negative business impact, and it can even change the way that people outside the company feel about it. An unhappy team isn’t going to lead to much creativity or productivity. As a way to combat this growing trend, companies are looking left and right for solutions. Some companies are reaching for things like extra PTO days, a full month off, better benefits, pay raises, and more fun work culture or relaxed dress codes. Those things are nice to have, and we’re certainly not speaking ill of any organization trying something new to help their employees. But at the end of the day, if the overwork and unrealistic expectations remain, the developer burnout will remain too.


Top skill-building resources and advice for CISOs

Ultimately, the hiring organisation will define what it needs in terms of cybersecurity to find the right person. In finance and insurance, for example, there will be specific rules that must be followed in different countries, and cybersecurity leaders in such organisations may even be liable. In telecommunications, the skills required are likely to be more technical, whereas in government, knowledge of governance and risk is top of the list. “For instance, a smaller organisation which is a greenfield site, or a large multinational where there is already an established security function require different sets of skills and approaches,” Joseph Head, director of technical security at Intaso, tells CSO. “There are a few commonalities between all CISO roles, however: an understanding of risk and risk appetite — in other words, an understanding of the business, and how much risk it can carry. This dictates how much work a CISO must do, and therefore available budget. Unlocking that budget can only be done by communicating effectively.”


Startup promises SD-WAN service with MPLS reliability, less complexity

Graphiant says that what makes its service different from SD-WAN offerings is how its Stateless Graphiant Core handles WAN data and control planes. The company says many large enterprises have been unwilling to give up the SLAs that come with MPLS for mission-critical traffic. Thus, SD-WAN augments the MPLS network for lower-priority traffic, and the network team must manage two different networks. The operational and administrative overhead of the combined solution, along with the complexity of overlays, tunnels, and policy management, means that many enterprises are turning back to MPLS providers that offer their own SD-WAN or resell others’. That way, enterprises are relieved of the burden of managing a complicated managed service themselves. “Enterprise networks have transitioned from predictable topologies to unpredictable ones,” Raza says. He argues that cloud services, IoT, work from home, and a range of other pressures have pushed the MPLS-plus-SD-WAN formula to its breaking point.


High-trust workplace meets no-trust network security

Clearly, the traditional model for IT security is no longer fit for this newly dispersed world of work and a fresh model is needed — one where the unit of control is identity and where identity is the basis of a system of authorisation and authentication for every device, service and user on your network. Welcome to zero trust, a model built on the assumption that every identity must be authenticated and authorised. Given the shift to high-trust digital working environments and the surge in attacks, interest in zero trust is growing. According to Gartner, 40 percent of remote access will be conducted using a zero trust model by 2024 — up from five percent in 2020. Remote work is driving uptake, with zero trust seen as a fast way to achieve security and compliance, according to a Microsoft report on its adoption. Zero trust is implemented through consistent tools, workflows and processes delivered as a set of shared, centrally-managed and automated services. What does this look like? It means codifying policies and procedures for authorisation and access across the technology stacks, domains and service providers that comprise the IT infrastructure.
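In principle, codifying authorisation can be as simple as a declarative policy table plus a default-deny evaluator; the sketch below shows the idea, with resource names and attributes invented for illustration (real deployments use dedicated policy engines):

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    mfa_passed: bool
    device_compliant: bool
    resource: str

# Illustrative policy table; in practice this would be versioned,
# reviewed, and centrally managed like any other code.
POLICIES = {
    "payroll-db": {"require_mfa": True, "require_compliant_device": True},
    "wiki":       {"require_mfa": False, "require_compliant_device": False},
}

def authorize(req: AccessRequest) -> bool:
    """Evaluate every request against codified policy; default deny."""
    policy = POLICIES.get(req.resource)
    if policy is None:
        return False  # unknown resource: nothing is trusted by default
    if policy["require_mfa"] and not req.mfa_passed:
        return False
    if policy["require_compliant_device"] and not req.device_compliant:
        return False
    return True
```

The default-deny branch is the essence of zero trust: a request is rejected unless policy explicitly allows it, rather than allowed unless a rule blocks it.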


IT leadership: How to defeat burnout

What sets Liberty Mutual apart from other organizations is our purpose. We exist to help people embrace today and confidently pursue tomorrow. This is our North Star and helps define and guide everything we do. We also understand that combating burnout requires connecting work to outcome. To ensure that this happens, we spend time defining targeted outcomes – the realization of the expected benefit – versus output – for example, simply turning on a new feature in a system. Success is measured by producing results and realizing benefits. Outcome might be the ability to deploy capabilities faster than before, for example. The key word is ‘capabilities,’ which help us deliver better products and services to customers. An outcome is much bigger than an output such as simply turning on a technology. These nuances matter in the context of burnout. If you’re working on a project and you don’t know why you’re doing it or what the intended results are, you’re not connected to why it matters.



Quote for the day:

"Brilliant strategy is the best route to desirable ends with available means." -- Max McKeown

Daily Tech Digest - October 16, 2022

Top AI investors reveal State of AI in 2022

What’s new in 2022, and what made Benaich and Hogarth dedicate an entire section to AI safety, is the other end of AI safety. This is what Hogarth referred to as AI alignment: ensuring that an extremely powerful and superintelligent AI system doesn’t ever go rogue and start treating humanity badly in aggregate. The 2022 State of AI report is very much biased toward that end of safety because, according to Hogarth, the topic is not receiving enough attention. “We’re seeing exponential gain in capabilities, exponential use of compute, exponential data being fed into these [AI] models,” Hogarth said. “And yet we have no idea how to solve the alignment problem yet.” It’s still an unsolved technical problem where there are no clear solutions, he added: “That’s what alarms me — and I think that the thing that is probably the most alarming about all of it is that the feedback loops now are so violent. You have huge wealth creation happening in AI. So there’s more and more money flowing into making these models more powerful.”


Interview with Vinayak Godse, CEO of Data Security Council of India

We see ourselves as an important catalyst in the National Cybersecurity initiatives, especially in terms of the technology geopolitics that is now heating up; take the US–China tech war, for example. Cyber security in this day and age has become pivotal because the coming decade is going to be driven by technology, and cyber security is one fundamental area which will be driving all of these transitions. As per NASSCOM’s TECHADE 2020: Digital Tech Opportunities report, AI/ML, cloud, and cybersecurity will be crucial and critical for this decade. But, how will that happen? There are many different parts to this. Firstly, security should enable the growth of the industry. We aim to prepare the industry, society, individuals and, most importantly, the economy against possible issues and challenges regarding privacy. This is the second part. With technologies such as AI/ML, Data Analytics, and VR/AR gaining prominence, we will work towards solving the security problem in relation to these emerging technologies.


How Can Business DataOps Drive Growth?

DataOps is a fast-expanding area of expertise. Data analytics and operations specialists eager to learn how to develop and oversee DataOps procedures will have a successful future: they have the opportunity to guide the next generation of data teams and set the bar for data practices for at least the next decade. A creative, quickly expanding organization that reduces laborious and repetitive business activities will also have happier and more motivated employees. The time it takes to develop a concept into something valuable is crucial to businesses, and DataOps shortens lead times through the use of agile development methodologies. The interval between iterations is also shortened, and producing and delivering solutions in small pieces enables them to be applied gradually. Shadow IT may form in businesses that use a sluggish development strategy for data solutions, with other departments creating their own solutions without the IT department’s approval or involvement.


Message Routing and Topics: A Thought Shift

There is one thing that caught my attention and fascination simultaneously. The central theme of a real-time enterprise integration is message routing. Almost all messaging systems employ an intelligent, interest-based subscription mechanism that ensures the routing of messages to interested parties. The intelligent part of routing is built with rules around the Event type (name) and the content. Routing decisions based on an Event type name are straightforward because they are easily accessible and available without unpacking the payload. However, a routing based on filtering rules on the content would require unpacking the payload and evaluating the rules to determine a match. This directly impacts message throughput and performance. Content-based message routing (CBR) is performance-penalizing and not the best choice for real-time messaging scenarios, where latency undermines timely decisions. In the early days of integration, content-based routing was considered essential.
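The throughput difference is easy to see in a minimal sketch (event types, subscriptions and rules below are invented): type-based routing reads only the envelope, while content-based routing must deserialize every payload before it can match:

```python
import json

# Type-based routing: the event name travels in the envelope, so the
# broker matches subscriptions without ever touching the payload.
TYPE_SUBSCRIPTIONS = {
    "order.created": ["billing", "inventory"],
    "order.shipped": ["notifications"],
}

def route_by_type(event_type, payload_bytes):
    return TYPE_SUBSCRIPTIONS.get(event_type, [])

# Content-based routing: every rule forces the payload to be unpacked
# and evaluated, which is where the throughput cost comes from.
CONTENT_RULES = [
    (lambda body: body.get("total", 0) > 1000, "fraud-review"),
    (lambda body: body.get("region") == "EU", "eu-compliance"),
]

def route_by_content(payload_bytes):
    body = json.loads(payload_bytes)  # the unpacking step
    return [dest for rule, dest in CONTENT_RULES if rule(body)]
```

`route_by_type` is a single dictionary lookup per message; `route_by_content` pays a parse plus one predicate per rule, and that cost grows with both payload size and rule count.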


Where Quantum Entanglement Is Actually Being Used

Quantum entanglement is a critical element of quantum information processing, and photonic entanglement of the type pioneered by the Nobel laureates is crucial for transmitting quantum information. Quantum entanglement can be used to build large-scale quantum communications networks. On a path toward long-distance quantum networks, Jian-Wei Pan, one of Zeilinger’s former students, and colleagues demonstrated entanglement distribution to two locations separated by 1,203 km on Earth via satellite transmission. However, direct transmission rates of quantum information are limited due to loss, meaning too many photons get absorbed by matter in transit so not enough reach the destination. Entanglement is critical for solving this roadblock, through the nascent technology of quantum repeaters. An important milestone for early quantum repeaters, called entanglement swapping, was demonstrated by Zeilinger and colleagues in 1998. Entanglement swapping links one each of two pairs of entangled photons, thereby entangling the two initially independent photons, which can be far apart from each other.


Virtual Panel: The New US-EU Data Privacy Framework

Given the Court of Justice of the European Union's (CJEU) stance regarding US surveillance law, it is not clear how GDPR can be made compatible with transatlantic data transmission. Thus it is likely that any new privacy frameworks will be challenged in courts. Yet, the newly proposed Trans-Atlantic Data Privacy Framework brings an attempt to solve the underlying issues and may include an independent Data Protection Review Court as a mechanism to solve disputes that could provide an effective solution. If the new framework did not pass European Courts' scrutiny, it is possible that a completely different approach to data privacy will be required in future to ensure data transmission and collaboration while granting privacy rights, such as treating user data as a currency or similarly to copyright. In this virtual panel, three knowledgeable experts in the field of data privacy discuss where the existing agreements fall short, whether a new privacy agreement could improve transatlantic data sharing while granting privacy rights for EU citizens and stronger oversight of US intelligence, and more.


Distributed Ledger Technology (DLT): The Solution to the Age of Digital Distrust?

DLT has more going for it than blockchain technology alone. Alternatives are available across the DLT spectrum that already solve the so-called blockchain trilemma: guaranteeing high security, scalability, and decentralization. While a blockchain works with a linked list of blocks, IOTA, for example, uses the ‘Tangle,’ an acyclic fabric of mutually linked transactions that maintain the global shared state of the ledger while boosting speed. In addition, IOTA also avoids classic transaction fees, which would be prohibitive for applications in the IoT area, through an alternative consensus algorithm based on the reputation of the nodes. ... What makes DLT so exciting and relevant is that it was conceived and developed for this decentralized digital world where trust is at a premium. It’s not simply a case of storing the information safely that creates trust. It’s also how it’s created and continuously ensured between all the different partners of a business process. DLT determines the conditions under which nodes of the decentralized infrastructure capture and record new transactions, and when they do not.


Digital innovation and the future of financial services

The growth of Web3 technologies also offers new opportunities, as the internet evolves from read-only pages to more interactive and immersive experiences. “Web1 is traditional data. Web2 added social [interaction] as data. And in Web3, everything is data,” said Dr Booth. In Web3, that data has real value to its users, and underlying blockchain technology allows it to be transferred and monetised easily. The most recognisable example may be buying digital ‘land’ in the metaverse. Transactions that take weeks in the real world can be completed in seconds and recorded on a secure, immutable blockchain. “How we connect the metaverse to the real-verse is going to be where banks will play a role,” said Mr Williamson. The digitalisation of financial services will create virtual mountains of new and complex data, generated from disparate sources and stored in different locations. Yet when everything's digital, that kind of volume and complexity becomes manageable. From Dr Booth’s point of view, AI is the connective digital tissue holding everything together.


How to Prove the ROI of Your Enterprise Architecture Efforts

The ROI of EA can be felt at the highest levels of the organization, but it can also have an impact at the individual department level. Cultivating this impact involves both research into specific challenges different departments face and educating department heads on what EA can do. For example, legal teams may not know that enterprise architecture has a critical role to play when it comes to navigating compliance standards and regulations. Similarly, marketing and sales teams may not realize how EA can support data management to drive analytics and personalization efforts. Every corporate function today depends on technology to be effective. EA is all about better, more strategic uses of technology. It thus falls to EAs to evangelize their capabilities across the enterprise, seeking out often unexpected opportunities to improve operations and outcomes, department by department. Getting users from across your organization on your EAM tool is no small undertaking. 


How to turbocharge collaboration in innovation ecosystems

Handled in the right way, collaborations will align and connect potential co-innovators to a shared purpose—internally as well as externally. Importantly, value creation is no longer just a numbers game. Echoing the motto of former PepsiCo CEO Indra Nooyi, “performance with purpose,” impactful innovation in an ecosystem is likewise driven by social values as much as by numbers. These are reflected in the challenges and specific problems the collaborators seek to address; in reframing the purpose, if necessary, so that it fits everyone’s objectives; and in the way the collaboration defines success. Much as in a team sport, egos and titles are swept aside in pursuit of a greater goal. ... Disruption, the digital revolution, COVID-19—in aggregate, these factors have blurred and, in some cases, dissolved the boundaries between organizations, segments, and entire industries. As a result, innovation ecosystems are emerging as the dominant paradigm for corporate innovation. Yet, because of the fundamental disparities embedded in their structures, ecosystems are difficult to form and initiate, let alone sustain.



Quote for the day:

"Be a Strong Leader, Even If You Follow a Weak Leader" -- Miles Anthony Smith

Daily Tech Digest - October 15, 2022

Australia becoming hotbed for cyber attacks

“Cyber criminals are targeting the personal data of Australians for financial gain – to sell, to hold to ransom, or to commit financial fraud and scams,” said Reinhart Hansen, director of technology at Imperva’s CTO office. “During the pandemic, many organisations inadvertently created more opportunities for these bad actors. Many rushed their online implementations and transformation projects, taking shortcuts that left them vulnerable to exploitation. “Now we’re seeing a large uptick in common, off-the-shelf and automated type attacks that hackers are continuously recycling and using against Australian targets,” he added. Hansen noted that threat actors have also been looking for known weaknesses and vulnerabilities in applications and application programming interfaces (APIs) to gain access to the data repositories that sit behind them. “Their ultimate aim is to exfiltrate data at scale that will allow them to build citizen profiles that are used as the basis of their illegal activity,” he said. The most heavily targeted industries in Australia were financial, retail and business services. 


To Recruit and Retain a Strong Team, Live the Culture You Talk About

As leaders, if we talk about communication and expect people to follow, that communication needs to start with me. We need to be open to it, like lanes of freely flowing traffic in both directions, not stocked-up silos hoarding information for ourselves. I need to communicate what will happen to my team and stay open to taking their feedback. ... We look for similar values when building personal relationships, but this idea is more difficult as a company. I may not expect everyone to share the same politics, but I expect team alignment around certain values. We should all acknowledge the importance of diversity and respect the humanity of each other. We should share a sense of optimism for the company and a desire to contribute to its growth. As leaders, we need to live the story we tell when a great contributor to our team crosses the line. When the behavior is illegal, that decision is a little easier, but determining when behavior crosses an immoral or unethical line can be in the eye of the beholder. However, if something is clearly over the line in my or my employees' minds, I need to take action and be consistent about those opinions.


Waterfall won the war with Agile!

The dream of a decentralised democratic organisation with low atrophy that can swiftly respond to the needs of our markets is not a pipe dream. To do so we need to shift from an Alpha model for business to a Beta model. However, we continuously allow Alpha to prevail. The focus of agile on the decentralised dream failed because we allowed it to. We, and by “we” I mean agile practitioners, allowed and continue to allow organisations to believe that they don't have to really change, that this is just a team thing, and that they can keep the departments, the steering committees, and the yearly budgets. We are complicit in these continued malformed practices. ... We need to reshape our focus from sustaining the Alpha models that keep people complacent and happy to challenging them, and to actively promoting Beta models and practices that work in those models. We already have the building blocks to do this in the Agile Manifesto, the Scrum Guide, the Kanban Guide, the Nexus Guide, LeSS, Scrum@Scale, and many more. What we need to add is the tools we need to change the organisation; changing teams is easy.


Making product inclusion and equity a core part of tech

I think the world has had a reckoning over the past two years, with many candid conversations kicking off. There’s been a lot of vulnerability and accountability, frankly, around making sure that people have inclusive and equitable experiences across the board in everything they do. When product teams start to think about product inclusion and equity, I talk to them about the “curb cut effect.” The curb cut in sidewalks was originally made in the ‘70s for wheelchair users, but we all use it now, whether it’s people with skateboards, suitcases, or shopping carts. The critical thing to understand is that building for a historically marginalized group results in better outcomes for everyone. There are a lot of examples of that throughout history; another is closed captioning. So even though it feels amplified now, decades of work have helped to ensure that those who have historically not been at the center of development and design can have their voices involved throughout critical points in the process.


8 Reasons Scrum Is Hard to Learn (but Worth It)

The idea of estimating in story points can definitely be a challenge for many team members. I can almost hear them thinking, “I have a hard time estimating in days and now I have to estimate in an abstract relative unit I’ve never heard of before?” Story points are a definite challenge, yet they’re worth the effort. As abstract relative estimates of effort, story points enable better conversations about how long work will take. Without story points, a senior programmer and junior programmer have conversations that devolve into, “That’s how long it will take you, but it would take me twice as long.” And then the two pick an estimate that is horrible for one of them or, perhaps even worse, they split the difference. With story points, the senior and junior programmers can consider adding a new feature and both agree it will take twice as long as doing a simpler feature. They then give the bigger item an estimate twice that of the simpler item. Estimating in this relative manner allows developers to agree even if they would never be able to agree on how many hours or days something would take. 
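The relative-estimation idea can be sketched in a few lines of Python. The point values and per-developer velocities below are invented for illustration: the shared relative estimate (the 2x ratio) is the same for everyone, while the calendar time it implies differs by developer.

```python
# Illustrative sketch: a shared relative estimate (story points) maps to
# different calendar durations depending on each developer's velocity.

def days_to_complete(story_points: float, points_per_day: float) -> float:
    """Convert a relative estimate into days for one developer."""
    return story_points / points_per_day

simple_feature = 3   # story points (relative effort)
bigger_feature = 6   # agreed to be twice the simpler item

# Hypothetical velocities: the senior developer burns points faster.
senior_velocity, junior_velocity = 3.0, 1.5

# Both developers agree on the ratio even though their day estimates differ.
assert bigger_feature / simple_feature == 2.0
print(days_to_complete(bigger_feature, senior_velocity))  # 2.0 days
print(days_to_complete(bigger_feature, junior_velocity))  # 4.0 days
```

The team converges on one number (6 points) without ever having to agree on whose days it represents.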


Design Thinking Improves Your Data Science

As data scientists, our first instinct is to begin to understand the data we are going to use to solve our problems. However, we need to look beyond the data to the people involved in the problem. We can have all of the data in the world, but if we do not know how users or stakeholders interact with the product, and do not understand it in their terms, we cannot possibly make a solution that is going to fully solve their problem. ... My favorite way to approach generating problem statements is to use HMW statements. In this process, everyone writes down problem statements starting with the phrase “How might we…”. They are usually generated individually and voted on by the group to obtain the best problem statement. HMW statements are written positively to make sure we remember how the user should feel. ... Now that you have your problem statement, you will need to think about the different ways the problem can be solved. In this step, any idea is a good idea, focusing on quantity over quality.
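The generate-then-vote step can be pictured as a simple tally. The statements and votes below are made up for illustration:

```python
from collections import Counter

# Illustrative sketch: tally a group's votes to pick the top
# "How might we..." (HMW) problem statement. Statements are invented.

def top_statement(votes: list[str]) -> str:
    """Return the HMW statement that received the most votes."""
    return Counter(votes).most_common(1)[0][0]

votes = [
    "How might we help users trust the forecast?",
    "How might we surface insights faster?",
    "How might we help users trust the forecast?",
]
print(top_statement(votes))
```

The winning statement then anchors the ideation step that follows.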


JIT vs. AOT: How to Pick the Right Approach

What a just-in-time compiler can't do is compile ahead-of-time. What an ahead-of-time compiler does is take all the code and compile it to your binary before you ever run the program. It can do all that work up front and avoid doing it later. What an ahead-of-time compiler can't do is compile just-in-time. The annoying thing is the choice. If you have to choose between them, I am definitely on the just-in-time side. I've got some strong arguments for why, because you just get faster code, period. It's provable. The real question is, why do we have to choose? ... It's absolutely true that with ahead-of-time compilation, people feel like they can afford to throw a lot more analysis power at the optimizations, and therefore lots of times people will say, this analysis we can do ahead-of-time. In reality, anything an ahead-of-time compiler can do, a just-in-time compiler can do. It's just a question of, can you afford to do it? Do you want to spend the time while you're running the program to also do that? That's one direction.
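The trade-off can be caricatured in a few lines of Python. This is a toy model, not a real compiler: "compilation" is a stand-in delay, and the point is only that an AOT build pays that cost for every function before the program starts, while a JIT runtime pays it lazily, only for functions that actually run.

```python
import time

# Toy model of the AOT-vs-JIT trade-off. COMPILE_COST and the functions
# are invented; time.sleep stands in for optimization work.

COMPILE_COST = 0.01  # seconds, purely illustrative

def compile_fn(fn):
    time.sleep(COMPILE_COST)   # stand-in for analysis/optimization effort
    fn.compiled = True
    return fn

def aot_build(functions):
    """Compile the whole program ahead of time, used or not."""
    return [compile_fn(f) for f in functions]

class JitRuntime:
    """Compile each function just in time, at its first call."""
    def __init__(self):
        self.cache = {}
    def call(self, fn, *args):
        if fn not in self.cache:          # first call pays the compile cost
            self.cache[fn] = compile_fn(fn)
        return self.cache[fn](*args)      # later calls run the cached version

hot = lambda x: x * 2
cold = lambda x: x + 1                    # never called at runtime

aot_build([hot, cold])                    # AOT compiles both up front
jit = JitRuntime()
print(jit.call(hot, 21))                  # JIT compiles only `hot`
```

A JIT can also use runtime information (actual types, hot paths) that an AOT compiler never sees, which is the basis of the "faster code" argument above.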


Chaos theory eliminates quantum uncertainty

The most important reason stems from a quantum phenomenon that Schrödinger himself named entanglement. Specifically, two particles can be emitted from a source, such that the properties of the two particles – e.g., their angular momenta (also known as spins) are correlated. This itself is not necessarily strange. However, the Northern Irish physicist John Bell showed that, under seemingly reasonable assumptions, these correlations, suitably combined, are limited in size. This is called Bell’s theorem. The 2022 Nobel Physics Prize was given to three physicists (Alain Aspect, John Clauser and Anton Zeilinger) who showed that in practice, the combined correlations can exceed this limit. Hence one or more of these seemingly reasonable assumptions must be wrong. The standard interpretation of this experimental result is that it confirms that quantum uncertainty is ontological, not epistemological. That is, uncertainty is a feature of reality itself, not a reflection of the limits of our knowledge. Of course, this is such a startling conclusion that physicists have looked for other ways to explain Bell’s theorem.
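In its best-known (CHSH) form, the combined correlations Bell considered are bounded as follows, where \(E(a,b)\) denotes the correlation between spin measurements along detector settings \(a\) and \(b\):

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
\qquad
|S| \le 2 \;\;\text{(local hidden variables)},
\qquad
|S| \le 2\sqrt{2} \;\;\text{(quantum mechanics, Tsirelson's bound)}
```

The experiments of Aspect, Clauser, and Zeilinger measured combinations exceeding the classical bound of 2, which is what ruled out one or more of the "seemingly reasonable assumptions."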


Get used to cloud vendor lock-in

Granted, now the game is a bit different with higher stakes. Many cloud providers offer the same operating systems and processor options, the same databases, and even the same ops and security tools. So, why is vendor lock-in still a trade-off? As an aside, if you just announced that you’re off to build systems that completely avoid vendor lock-in, I will wish you good luck. However, unless you want consistently crappy applications, you’ll have to leverage native security, native infrastructure as code, serverless systems, etc., that are usually supplied by different providers as native services, which is why you’re on a public cloud in the first place. If we move to the most feature-rich public cloud platforms, it’s to take advantage of their native features. If you use their native features, you lock yourself in to that cloud provider—or even lock yourself in to a subplatform on that cloud provider. Until there are alternatives, you better get used to lock-in. 


Understanding the Four Domains of Enterprise Architecture

The technology architecture domain encompasses all infrastructure and enterprise uses to support the goals and execution of the business, information and application processes. It covers all logical hardware and software apps, including front-end systems, back-end infrastructures, cloud and on-site platform technologies, IoT, networks and communications. To demonstrate the difference between application and technology architectures, let’s consider an enterprise in the e-commerce industry. The e-commerce app falls under the technology architecture domain because it generates the data for the business — the number of visitors per day or sales per day. An analytics tool like Tableau, which helps translate the data generated into a comprehensible form and distributes it to where it’s needed, is under the application architecture domain. An enterprise architect in this domain would define the requirements of the hardware and software infrastructure needed to power the resources in the application and data architectures that enable and optimize business processes. 



Quote for the day:

"Always be yourself, express yourself, have faith in yourself, do not go out and look for a successful personality and duplicate it." -- Bruce Lee

Daily Tech Digest - October 14, 2022

Which cybersecurity metrics matter most to CISOs today?

Given the rapid increase in malware-free attacks, there’s a tendency on the part of cybersecurity teams to add more metrics. Seeing more reported data as a panacea for rising risks that aren’t immediately understood, cybersecurity teams will turn on as many metrics as possible, looking for clues. Relying on antivirus, SIEM (security information and event management), security ticketing systems, vulnerability scanners, and more, CISOs’ teams generate an overwhelming number of metrics that lack context. CISOs warn that presenting metrics straight from tools without a narrative supporting them is a mistake. C-level executives and the boards they report to are more focused on new insights that are contextually relevant than on a series of tactical measures. Every new high-profile intrusion or breach drives a dozen or more internal user requests for new metrics. Managing user requests by how much value they add to contextual intelligence and business value is critical. CISOs tell VentureBeat it’s easy to say no to additional metrics requests when the requested metrics have no connection to quantifying the value cybersecurity delivers.


Making everything connect for smart cities

It’s a vision of how smart cities can be holistically planned by connecting the different city domains and addressing Sustainable Development Goals (SDGs) globally. In this way, mobility, energy, the environment, health, education, security and the economy are not treated separately, but rather as a whole consistent continuity of human-centric services. Smart cities need to be much better at creating an open platform of dialogue that is accessible to all citizens. ... These allow residents to engage with a wide array of data, as well as completing personal tasks like paying bills, finding efficient transportation and assessing energy consumption in the home. Smart cities also need to account for social infrastructure that provides a cultural fabric, making the city attractive to residents and offering a sense of local identity. It is often the social and cultural aspects of a city that citizens find makes it most attractive to live in – aspects such as green open spaces, a wide choice of retail outlets, and bustling nightlife. This is particularly important for cities that are being created ‘from scratch’ (rather than already existing) and need to find effective ways to attract residents.


Dell gets more edge-specific with Project Frontier platform

Dell also said it is expanding its current edge portfolio in the following ways:

Edge analytics and operations: Manufacturers can optimize how they deploy edge applications with a Dell Validated Design for Manufacturing Edge, the company said. This now includes new Dell-validated partner applications to support advanced edge use cases and improve factory processes and efficiencies, while reducing waste and raw-materials usage for more sustainable operations. Manufacturers can respond quickly to changes in demand and enable reconfigurable production lines with Dell's private 5G capability, Dell said.

Edge computing and analytics: The PowerEdge XR4000 is the smallest server in the Dell lineup, at about the size of a shoebox. The XR4000 is 60% shorter than conventional data center servers, and its multiple mounting options allow it to be installed in a rack or on walls or ceilings, saving valuable floor space. The multi-node, 2U chassis server can survive unpredictable conditions, such as heat waves or falls, the company said.


The White House can build on its AI Bill of Rights blueprint today

Several current uses of AI clearly violate the blueprint and should no longer be used. The president should also stop encouraging agencies to spend American Rescue Plan funds on ShotSpotter and other “gunshot detection” technologies, which change police behavior but have not been shown to decrease gun violence. These tools are in violation of the blueprint’s principles that AI tools must be safe, effective, nondiscriminatory, and transparent. ... On the legislative front, the AI Bill of Rights principles are embodied in both the American Data Privacy Protection Act and the Algorithmic Accountability Act of 2022, both of which the administration could put its support behind. There has been substantial investment in the development and adoption of AI, but nowhere near as much money or energy put toward safeguards or protection. We should not repeat the same self-regulatory mistakes made with social media and online advertising that left us in the privacy crisis we are in today. 


How intelligent automation changes CI/CD

Intelligent automation addresses many of the core requirements for successful software delivery. Basic process automation can increase devops productivity by automating routine manual tasks through code. For example, a developer can run a build in Jenkins that then triggers an automated task that pushes the build to Artifactory and kicks off a delivery pipeline. However, combining automation with AI-powered intelligence can turbocharge processes and improve business outcomes. Intelligent automation can automate routine tasks and then constantly improve automated decision making as the release moves through the delivery lifecycle. Intelligence applied to the release process — when combined with deep tools integrations that provide access not only to events but also to all process data — can automate the detection of software risks and automatically flag release candidates for remediation before they make it to production. In addition to increased devops productivity and faster and more accurate software releases, intelligent automation provides the means to implement centralized, automated control over compliance and security. 
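One way to picture such an automated release gate is a small scoring function over signals pulled from the integrated tools. The signal names, weights, and threshold below are invented assumptions for illustration, not features of any particular product:

```python
# Hypothetical sketch of an automated release gate: score risk signals
# gathered from pipeline tools and flag candidates for remediation
# before they reach production. All names and numbers are illustrative.

RISK_WEIGHTS = {
    "failed_tests": 5.0,
    "open_critical_vulns": 8.0,
    "untested_changes": 3.0,
}
THRESHOLD = 10.0

def risk_score(signals: dict[str, int]) -> float:
    """Weighted sum of risk signals for one release candidate."""
    return sum(RISK_WEIGHTS.get(name, 0.0) * count
               for name, count in signals.items())

def gate(signals: dict[str, int]) -> str:
    """Decide automatically whether a candidate may be promoted."""
    if risk_score(signals) > THRESHOLD:
        return "flag-for-remediation"
    return "promote"

print(gate({"failed_tests": 0, "open_critical_vulns": 2}))  # flagged (16.0)
print(gate({"untested_changes": 1}))                        # promoted (3.0)
```

In a real pipeline the "intelligence" would come from learned models over historical release data rather than fixed weights, but the gating shape is the same.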


A Big Threat for SMBs: Why Cybersecurity is Everyone’s Responsibility

It impacts everyone across every department and every element of operations. Cybersecurity is a collective responsibility. During this Cybersecurity Awareness Month, let’s debunk the pervasive misconception that cybersecurity is strictly an IT issue. To avoid becoming a statistic, SMBs need to develop a security culture that reinforces the idea that cybersecurity is the responsibility of every team member. From the founder who sets a security-focused tone to the specific teams that implement the policies, to the HR department responsible for onboarding new employees, to the IT team setting system password requirements, and to every employee that can potentially open a phishing email triggering a security incident, it’s a collective effort to stay aware. All individuals need to be trained, vigilant, and engaged. The devil is in the details, as it’s the tools, tasks, and routine activities each team member performs that will protect the company.


Seeing electron movement at fastest speed ever could help unlock next-level quantum computing

Seeing electrons move in increments of one quintillionth of a second could help push processing speeds up to a billion times faster than what is currently possible. In addition, the research offers a “game-changing” tool for the study of many-body physics. “Your current computer’s processor operates in gigahertz, that’s one billionth of a second per operation,” said Mackillo Kira, U-M professor of electrical engineering and computer science, who led the theoretical aspects of the study published in Nature. “In quantum computing, that’s extremely slow because electrons within a computer chip collide trillions of times a second and each collision terminates the quantum computing cycle. ... To see electron movement within two-dimensional quantum materials, researchers typically use short bursts of focused extreme ultraviolet (XUV) light. Those bursts can reveal the activity of electrons attached to an atom’s nucleus. But the large amounts of energy carried in those bursts prevent clear observation of the electrons that travel through semiconductors—as in current computers and in materials under exploration for quantum computers.


New data protection bill must enable a progressive data governance framework

A robust framework that safeguards the privacy of an individual’s data would make the privacy design of the bill even stronger. The consent and notice framework in the new Bill should be designed so that it addresses the right to informational privacy while avoiding consent fatigue for consumers. For instance, individuals may receive innumerable privacy notifications, causing consent fatigue; this issue was considered and acknowledged by the Justice Srikrishna committee report. Besides, from a business perspective, the cost of compliance, especially for small businesses, will be huge and may result in additional costs. The new personal data governance framework should focus on simplifying the consent and notice framework in such a manner that individuals can easily understand how and for what purpose their personal data is being processed. Besides, the new Bill must lay out better means and ways to obtain consent that are inclusive, less tiresome, and efficient.


Emotional intelligence: How to create psychological safety for your IT team

The best leaders understand the complexities and imperfections of being human and are not afraid to present their true selves in the workplace. These leaders emanate compassion and encourage their team members to embrace and express their unique gifts and talents. Compassion cuts through mental constructs and perceptions. It begins when leaders examine and undo traditional rules, roles, and narratives that limit their thinking, decision-making, and worldview. Freedom from outdated narratives enables release, self-acceptance, and permission to bring one’s whole self to the workplace. Leaders who are driven by the needs of the ego struggle to let go of outdated competence, values, and skills. Marshall Goldsmith, one of the world’s foremost thought leaders on executive coaching, explains this perfectly in the title of his book, What Got You Here Won’t Get You There. The compulsive need to be right becomes more important than discovering new horizons, untapped potential, and possibilities. Self-righteousness creates a division between the self and the team, eroding trust.


Smart buildings may be your cybersecurity downfall

With the rise of IoT, a wave of adoption of IT and IoT solutions at all levels of building system architecture poses a serious cybersecurity issue. As it becomes increasingly difficult to distinguish between building automation systems and other systems used in companies and their infrastructures, more “cyber holes” tend to be left unmonitored. The use of insecure industrial protocols is another vulnerability that attackers take advantage of to disrupt smart building operations. This is especially the case for building automation systems. Popular protocols like BACnet and LonWorks are not inherently secure and, like those used in the industrial production sector, tend to have their own vulnerabilities. ... As the cyber-physical equipment within buildings becomes increasingly distributed, especially due to the new trend of supervising building complexes from a central location, cyberattacks on smart buildings, as well as on other smart city infrastructures, can have a significant security impact for users.



Quote for the day:

"Personal leadership is the process of keeping your vision and values before you and aligning your life to be congruent with them." -- Stephen R. Covey

Daily Tech Digest - October 13, 2022

Information overload, burnout, talent retention impacting SOC performance

John Lodge, SOC Manager at Socura, says alert fatigue is a particular problem. “As well as causing fatigue for the analysts, repeating false positives also draws attention from and potentially delays responses to real active threats,” he tells CSO. The main solution to this is effective tuning, he adds. “Key challenges to overcoming this are getting investment from analysts to ensure tuning opportunities are exploited as soon as possible. In cases where tuning is not possible, automation should be used so that as much manual work is taken off the analyst as possible. Again, the challenge here is making sure the initial effort is put in to automate these actions before the false positives build up.” First-time fix challenges are also significant, Lodge says. “When escalating an incident, ideally we want to be able to resolve the incident with the tools and information at our disposal. In some cases, this is not possible as further context is required.” The challenge is to ensure that, in all cases, we have carried out as much investigation and response as possible. 


Cybersecurity’s too important to have a dysfunctional team

Teamwork is an essential part of working in a business, and this is just as true of cybersecurity teams. Due to the constant vigilance required, it helps cybersecurity professionals to know they have people around them with whom they can share the workload. There are five fundamental qualities that make every team great: communication, trust, collective responsibility, caring and pride. Everyone individually is important, but it is in coming together that they become unbeatable. Effective teamwork begins and ends with communication. It does not always occur naturally, but it must be taught and practised in order to bring everyone together as one. Along with a strong and functional team, a good leader is essential. Cybersecurity teams have stressful jobs, with the whole company looking to them in times of crisis (which can be heightened during economic instability). For this, the cybersecurity team requires a leader who is capable under pressure and can help engender trust across their staff. They must also be able to advocate for the team if some are burnt out or require further training.


12 things every CIO must get done in year one

Dr. George F. Claffey Jr., CIO and interim vice president of Institutional Advancement and Strategic Partnerships at Central Connecticut State University, says he, too, focuses on listening and building relationships. He sees building trust as an essential extension of that work. “No one is going to have confidence in your agenda if you can’t be trusted,” he says. To build trust, Claffey acknowledges others’ challenges and works to fix them. “We find the win for them,” he says, adding that he also attends meetings held by other departments and demonstrates a genuine interest in their goals so they see “I’m interested in not just IT but everything that’s happening.” ... To have a successful CIO shop from the start, Jim Hall, CEO of consultancy Hallmentum, says CIOs “need to have the right people doing the right things at the right time, and they have to have the right skills.” To ensure they have that, CIOs should assess their teams early on to identify skill gaps in individuals and across teams, and then determine what measures are needed to get in place the right people and skills doing the right thing at the right time.


5 Factors to Weigh When Building Authorization Architecture

As you succeed, you will inevitably want to start winning some larger enterprise customers. This means working with a whole new set of authorization challenges. When working with a transnational business — and, specifically, working with a single department of it at first — reflecting the organizational structure in your permissions quickly becomes a sticking point. New requirements emerge, such as:

- Users should only be able to access resources and data within their department and geography.
- Managers in the London office should be able to access everything in the other U.K. offices.
- Heads of departments globally should be able to do everything in all regions.
- Company vice presidents want to see everything but don’t really know how things work, so they should only have view access.
- Sally in the Paris office is our superstar employee who also does work now and again for the U.S. team, so she should have access to their account every 3rd Tuesday if it is a full moon (might have exaggerated a bit here).
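A few of these requirements can be sketched as an attribute-based check in Python. The roles, fields, and rules here are illustrative simplifications, not a complete model (the full-moon rule is left as an exercise):

```python
from dataclasses import dataclass

# Illustrative attribute-based authorization check. Role names and
# rules are invented simplifications of the requirements above.

@dataclass(frozen=True)
class User:
    role: str        # "employee", "dept_head", "vp", ...
    department: str
    geography: str

def can_access(user: User, resource_dept: str, resource_geo: str, action: str) -> bool:
    if user.role == "vp":
        return action == "view"               # VPs see everything, view-only
    if user.role == "dept_head":
        return user.department == resource_dept  # own department, all regions
    # Default: scoped to own department AND own geography.
    return user.department == resource_dept and user.geography == resource_geo

sally = User(role="employee", department="sales", geography="FR")
print(can_access(sally, "sales", "FR", "edit"))   # True: own dept and geo
print(can_access(sally, "sales", "US", "edit"))   # False: other geography
```

Real systems typically externalize these rules into a policy engine rather than hard-coding them, precisely because exceptions like Sally's keep accumulating.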


9 out of 10 banks still use mainframes. Google Cloud wants to reduce that.

Google Cloud plans to introduce what it’s calling a simpler, more risk-averse way for enterprises to move their legacy mainframe estates to its cloud with a new service built on technology originally developed by Banco Santander. That service is Dual Run, and it enables parallel processing, allowing enterprises to make digital copies of their legacy mainframe systems and run them simultaneously on Google Cloud Platform. The service addresses a big challenge with mainframes: the tight coupling of data to the application layer. It allows real-time testing by customers to ensure their cloud workloads are performing as expected, running securely, and meeting regulatory compliance needs — without stopping an application or negatively impacting their end-user experiences — before transitioning to GCP as their primary system. “This is a simple concept, but hard to implement — hasn't been done so far,” Nirav Mehta, Google Cloud’s senior director of product management for cloud infrastructure solutions and growth, told Protocol.
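The underlying parallel-run pattern can be sketched as follows. This is a generic illustration with invented function names, not Google's implementation: each transaction is mirrored to both systems, users are served from the primary, and any divergence is recorded for investigation before cutover.

```python
# Hypothetical sketch of the parallel-run (dual-run) pattern: mirror each
# request to the legacy system and its cloud copy, answer from the
# primary, and log any divergence. All functions and data are invented.

def legacy_mainframe(txn: dict) -> dict:
    return {"balance": txn["amount"] * 2}   # stand-in for the real system

def cloud_copy(txn: dict) -> dict:
    return {"balance": txn["amount"] * 2}   # digital copy under test

mismatches: list[dict] = []

def handle(txn: dict, primary=legacy_mainframe, shadow=cloud_copy) -> dict:
    result = primary(txn)
    if shadow(txn) != result:               # real-time comparison
        mismatches.append(txn)              # flag divergence, don't fail users
    return result                           # end users only see the primary

print(handle({"amount": 10}))
print(len(mismatches))                      # divergence count so far
```

Once the mismatch log stays empty under production traffic, the roles can be swapped and the cloud system promoted to primary.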


The Microsoft-Cisco Teams collaboration could create an interoperability revolution

Cisco is a telecom company. It should get, even better than Microsoft, why things in its space need to interoperate and how to differentiate on features, capabilities, and price without locking out competing solutions. Embracing Teams doesn’t mean it's abandoning Webex, but Webex will need to find a path to third-party hardware or it will lose ground against options like Teams. Cisco seems to get this, based on the statement yesterday from Jeetu Patel, executive vice president and general manager, security and collaboration at Cisco: "Interoperability has always been at the forefront of our hybrid work strategy, understanding that customers want collaboration to happen on their terms — regardless of device or meeting platform,” said Patel. “Our partnership with Microsoft brings together two collaboration leaders to completely reimagine the hybrid work experience.” This should help drive Cisco toward a future where Webex and Teams could interoperate, as well, which might mean the end of products like Zoom.


The future of low-code governance with Managed Environments for Power Platform

While the low-code concept has been around for decades, there has been an evolution in governance capabilities and offerings. In the initial era of low-code development, employees would build solutions in Microsoft Excel, Microsoft Access, Microsoft InfoPath, and other tools, often hidden from their IT departments and from all governance strategies or policies. As technology matured, many of these solutions became cloud-based and were built using a variety of standalone providers. Because each provider offered different governance capabilities, organizations faced the challenge of sprawling low-code solution development, along with the expertise, specialized resources, and additional time needed to train admins on the governance tools that had to be implemented. Microsoft Power Platform allows organizations to move to the next era of low-code governance with a central low-code platform that allows makers to enjoy shared components and experiences. 


Distributed cloud computing and its rising importance for businesses

Distributed systems mitigate these drawbacks in cloud computing by allocating the workload to pre-selected servers, typically the ones physically nearest to clients. Soon, augmented reality, virtual reality, and the Internet of Things (IoT) will all depend on data networks; these applications require low latency and benefit primarily from the distributed cloud's CDN/caching features. By keeping certain computation, data delivery, and storage local, distributed cloud computing reduces the resource demands on the central infrastructure and lowers latency for the users who need it, increasing the overall solution's effectiveness for everyone. Commercially available distributed cloud offerings also let users request that certain data remain within particular zones, or that a specific latency or throughput target be met; these guarantees are negotiated between the client and the cloud provider in service level agreements (SLAs).
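The nearest-server allocation and SLA latency targets described above can be sketched as a simple latency-based routing decision. This is a minimal illustration, not any provider's actual scheduler: the region names, probe figures, and SLA threshold below are all hypothetical, and a real system would measure round-trip times with live probes rather than static values.

```python
# Sketch: route a client to the lowest-latency region and check an SLA target.
# Region names and latency figures are hypothetical; in practice these would
# come from live RTT probes against each region's endpoint.

def pick_region(rtt_ms: dict) -> str:
    """Return the region with the lowest measured round-trip time."""
    return min(rtt_ms, key=rtt_ms.get)

def meets_sla(rtt_ms: dict, region: str, target_ms: float) -> bool:
    """True if the chosen region satisfies the SLA latency target."""
    return rtt_ms[region] <= target_ms

# Hypothetical probe results for one client, in milliseconds.
probes = {"eu-west": 18.0, "us-east": 95.0, "ap-south": 140.0}

best = pick_region(probes)          # nearest region for this client
ok = meets_sla(probes, best, 25.0)  # does it meet a 25 ms SLA target?
```

The same pattern generalizes: the provider keeps routing local (low latency for nearby users) while the SLA check expresses the per-client performance guarantee the paragraph mentions.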


CIOs sharpen their mainframe exit strategies

Spangler advises IT leaders to "take an economic view" of what to migrate, given that "tremendous technology capabilities" still exist on the mainframe. "It can't be a theoretical thing," he says. "We just know for our environment, because we're more than a 40-year-old company … we have old technologies we were replacing anyway, and when we looked at our enterprise strategy, it just made sense." Spangler says IT leaders should also keep the principles of engineering and architecture in mind. "A lot of people are so focused on getting rid of their mainframes they end up with a mess," he says, adding that strong engineering and architecture upfront will help ensure you end up with something modern, world-class, expandable, secure, and modifiable. Lastly, Spangler recommends that IT leaders "continuously update your plan because it's a battle. It's hard. Brutally hard. We literally zero-base our business case on this every quarter and build from the bottom up."


CISO: A day in the life

While engineering and technical disciplines are at the core of our profession, we must communicate effectively with executives and boards of directors to keep our companies, customers, and partners safe. We must present the latest threats and regulations in a business context. Understanding potential business risks is essential to prioritizing cybersecurity risks – and all risks – accordingly. During my time as a cybersecurity consultant for a food company, I highlighted the risk associated with credit card theft. One executive asked how that compared to the risk the company would face if it experienced a salmonella outbreak and a customer died of food poisoning. At the time, I had no good answer to this question. The example shines a light on our role as business enablers: cybersecurity professionals are tasked with enabling our colleagues to pursue opportunities and innovation. As guardians and protectors of our business, we perform best by embracing that ethos within business operations, with an eye always focused on risk management.
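One standard way to answer the executive's question is to put both risks on the same business scale using annualized loss expectancy (ALE = single loss expectancy × annual rate of occurrence). The sketch below illustrates the arithmetic only; every dollar figure and frequency is invented for the example and not drawn from the anecdote above.

```python
# Sketch: compare dissimilar risks in common business terms using annualized
# loss expectancy (ALE = single loss expectancy * annual rate of occurrence).
# All dollar figures and frequencies are hypothetical.

def ale(single_loss_expectancy: float, annual_rate: float) -> float:
    """Annualized loss expectancy in dollars per year."""
    return single_loss_expectancy * annual_rate

# Hypothetical inputs: a card-data breach vs. a food-safety incident.
breach_ale = ale(single_loss_expectancy=2_000_000, annual_rate=0.10)
recall_ale = ale(single_loss_expectancy=50_000_000, annual_rate=0.01)

# Ranking by ALE lets executives weigh both risks on one monetary scale.
riskier = "food safety" if recall_ale > breach_ale else "data breach"
```

Quantifying risk this way is exactly the "business context" the passage calls for: the comparison becomes a dollars-per-year conversation rather than a technical one.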



Quote for the day:

"I think the greater responsibility, in terms of morality, is where leadership begins." -- Norman Lear