Daily Tech Digest - August 31, 2022

Beyond “Agree to Disagree”: Why Leaders Need to Foster a Culture of Productive Disagreement and Debate

The business imperative of nurturing a culture of productive disagreement is clear. The good news is that senior leaders can play a highly influential role in this regard. By integrating the concepts of openness and healthy debate into their own and their organization’s language, they can institutionalize new norms. Their actions can help to further reset the rules of engagement by serving as a model for employees to follow. ... Leaders should incorporate the concept of productive debate into corporate value statements and the way they address colleagues, employees, and shareholders. Michelin, for example, built debate into its value statement. One of its organizational values is “respect for facts,” which it describes as follows: “We utilize facts to learn, honestly challenge our beliefs….” Another company that espouses debate as a value is Bridgewater. Founder Ray Dalio ingrained principles and subprinciples such as “be radically open-minded” and “appreciate the art of thoughtful disagreement” in the investment management company’s culture.


Using technology to power the future of banking

Because I believe that anyone who wants to be a CIO or a CTO, particularly in the way that the industry is progressing, needs to understand technology. So, staying close to the technology, staying curious, and wanting to solve those problems has helped me. But there's another part to it, too. In every one of my roles, there have been times when I've seen something that wasn't necessarily working and I had ideas and wanted to help, but it might’ve been outside of my responsibility. I've always leaned in to help, even though I knew that it was going to help someone else in the organization, because it was the right thing to do and it helped the company, it helped other people. So, it ended up building stronger relationships, but also building my skillset. I think that's been a part of my rise too, and it's something that's just incredibly powerful from a cultural perspective. That’s something that I love here. Everybody is in it together to work that way. But I also think that it just speaks volumes about an individual, and people gravitate to want to work with people that operate that way.


Physics breakthrough could lead to new, more efficient quantum computers

According to the researchers, this technique for generating stable qubits could have massive implications for the entire field of quantum computing, but especially for scalability and noise reduction: “At this stage, our system faces mostly technical limitations, such as optical losses, finite cooperativity and imperfect Raman pulses. Even modest improvements in these respects would put us within reach of loss and fault tolerance thresholds for quantum error correction.” It’ll take some time to see how well this experimental generation of qubits translates into an actual computation device, but there’s plenty of reason to be optimistic. There are numerous different methods by which qubits can be made, and each lends itself to its own unique machine architecture. The upside here is that the scientists were able to generate their results with a single atom. This indicates that the technique could be useful outside of computing as well. If, for example, it could be developed into a two-atom system, it could lead to a novel method for secure quantum communication.


Organizations security: Highlighting the importance of compliant data

When choosing a web data collection platform or network, it’s important that security professionals use a compliance-driven service provider to safeguard the integrity of their network and operations. Compliant data collection networks ensure that security operators have a safe and suitable environment in which to perform their work without being compromised by potential bad actors using the same network or proxy infrastructure. These data providers institute extensive and multifaceted compliance processes that include a number of internal as well as external procedures and safeguards, such as manual reviews and third-party audits, to identify non-compliant activity and ensure that all use of the network follows the overall compliance guidelines. This, of course, also includes abiding by the data gathering guidelines established by international regulators, such as the European Union and the US State of California, as well as enforcing public web scraping best practices for compliant and reliable web data collection.


TensorFlow, PyTorch, and JAX: Choosing a deep learning framework

It’s not like TensorFlow has stood still for all that time. TensorFlow 1.x was all about building static graphs in a very un-Python manner, but with the TensorFlow 2.x line, you can also build models using the “eager” mode for immediate evaluation of operations, making things feel a lot more like PyTorch. At the high level, TensorFlow gives you Keras for easier development, and at the low level, it gives you the XLA optimizing compiler for speed. XLA works wonders for increasing performance on GPUs, and it’s the primary method of tapping the power of Google’s TPUs (Tensor Processing Units), which deliver unparalleled performance for training models at massive scales. Then there are all the things that TensorFlow has been doing well for years. Do you need to serve models in a well-defined and repeatable manner on a mature platform? TensorFlow Serving is there for you. Do you need to retarget your model deployments for the web, or for low-power compute such as smartphones, or for resource-constrained devices like IoT things? TensorFlow.js and TensorFlow Lite are both very mature at this point.
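As a minimal sketch of the TensorFlow 2.x workflow described here (eager execution, Keras at the high level, and opting into XLA with jit_compile), assuming a TensorFlow 2.x install; the toy model and random data are purely illustrative:

    import tensorflow as tf

    # Eager mode: operations are evaluated immediately, PyTorch-style.
    x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    print(tf.matmul(x, x))  # no sessions or static graphs required

    # High level: Keras for quick model building.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

    # Low level: ask for XLA compilation of a hot function for extra speed.
    @tf.function(jit_compile=True)
    def train_step(features, labels):
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean(tf.square(model(features) - labels))
        grads = tape.gradient(loss, model.trainable_variables)
        model.optimizer.apply_gradients(zip(grads, model.trainable_variables))
        return loss

    print(train_step(tf.random.normal((4, 8)), tf.random.normal((4, 1))))

XLA is also the compiler path TensorFlow uses to target TPUs, which is the connection the article points to.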


IoT Will Power Itself – Power Electronics News

Energy harvesting is nothing new, with solar power being one of the most famous examples. Solar energy works well for powering parking meters, but if we’re going to bring online the packaging and containers that are at the heart of our supply chains—things that are indoors and stacked on top of each other—we need another solution. The technology that gives mundane things like cash registers both their intelligence and their energy-harvesting power consists of small, inexpensive, stamp-size computers printed as stickers and affixed to cash registers, sweater tags, vaccine vials, or other items moving through the global supply chain. These sticker tags, called IoT Pixels, include an ARM processor, a Bluetooth radio, sensors, and a security module — basically a complete system-on-a-chip (SoC). All that remains is to power this tiny SoC in the most efficient and economical way possible. It turns out that as wireless networks permeate our lives and radio frequency (RF) activity is everywhere, the prospect of recycling that RF activity into energy is the most practical and ubiquitous solution.


CoAuthor: Stanford experiments with human-AI collaborative writing

CoAuthor is based on GPT-3, one of the recent large language models from OpenAI, trained on a massive collection of already-written text on the internet. It would be a tall order to think a model based on existing text might be capable of creating something original, but Lee and her collaborators wanted to see how it can nudge writers to deviate from their routines—to go beyond their comfort zone (e.g., vocabularies that they use daily)—to write something that they would not have written otherwise. They also wanted to understand the impact such collaborations have on a writer’s personal sense of accomplishment and ownership. “We want to see if AI can help humans achieve the intangible qualities of great writing,” Lee says. Machines are good at doing search and retrieval and spotting connections. Humans are good at spotting creativity. If you think this article is written well, it is because of the human author, not in spite of it. ... The goal, Lee says, was not to build a system that can make humans write better and faster. Instead, it was to investigate the potential of recent large language models to aid in the writing process and see where they succeed and fail. 


LastPass source code breach – do we still recommend password managers?

The breach itself actually happened two weeks before that, the company said, and involved attackers getting into the system where LastPass keeps the source code of its software. From there, LastPass reported, the attackers “took portions of source code and some proprietary LastPass technical information.” We didn’t write this incident up last week, because there didn’t seem to be a lot that we could add to the LastPass incident report – the crooks rifled through their proprietary source code and intellectual property, but apparently didn’t get at any customer or employee data. In other words, we saw this as a deeply embarrassing PR issue for LastPass itself, given that the whole purpose of the company’s own product is to help customers keep their online accounts to themselves, but not as an incident that directly put customers’ online accounts at risk. However, over the past weekend we’ve had several worried enquiries from readers (and we’ve seen some misleading advice on social media), so we thought we’d look at the main questions that we’ve received so far.


FBI issues alert over cybercriminal exploits targeting DeFi

The FBI observed cybercriminals exploiting vulnerabilities in smart contracts that govern DeFi platforms in order to steal investors’ cryptocurrency. As a specific example, the FBI cited the case in which hackers used a “signature verification vulnerability” to plunder $321 million from the Wormhole token bridge back in February. It also mentioned a flash loan attack that was used to trigger an exploit in the Solana DeFi protocol Nirvana in July. However, that’s just a drop in a vast ocean. According to an analysis from blockchain security firm CertiK, over $1.6 billion has been stolen from the DeFi space since the start of the year, surpassing the total amount stolen in 2020 and 2021 combined. While the FBI admitted that “all investment involves some risk,” the agency has recommended that investors research DeFi platforms extensively before use and, when in doubt, seek advice from a licensed financial adviser. The agency also said it is very important that a platform’s protocols be sound and that the platform has undergone one or more code audits performed by independent auditors.


Privacy and security issues associated with facial recognition software

Facial recognition technology in surveillance has improved dramatically in recent years, meaning it is quite easy to track a person as they move about a city, he said. One of the privacy concerns about the power of such technology is who has access to that information and for what purpose. Ajay Mohan, principal, AI & analytics at Capgemini Americas, agreed with that assessment. “The big issue is that companies already collect a tremendous amount of personal and financial information about us [for profit-driven applications] that basically just follows you around, even if you don’t actively approve or authorize it,” Mohan said. “I can go from here to the grocery store, and then all of a sudden, they have a scan of my face, and they’re able to track it to see where I’m going.” In addition, artificial intelligence (AI) continues to push the performance of facial recognition systems. From an attacker’s perspective, meanwhile, there is emerging research leveraging AI to create facial “master keys,” that is, AI-generated faces that each match many different faces, through the use of what’s called Generative Adversarial Network techniques, according to Lewis.



Quote for the day:

"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller

Daily Tech Digest - August 30, 2022

The Great Resignation continues, and companies are finding new ways to tackle the talent shortage

The Great Resignation is far from over. According to a study of 1,000 hiring managers in the US, 60% are struggling to find the quality talent needed to fill open roles, with many now turning to freelance workers to bridge the growing skills gap. According to Upwork's most recent Future Workforce Report, 56% of companies that hire freelance workers did so at an increased rate within the last year. Companies are seeking out skilled independent workers to fill empty positions and compensate for the ongoing loss of talent, particularly in data science, accounting, and IT departments. Many companies are still feeling the burn of the COVID-19 pandemic and its effect on job trends. Workers' tendency to quit their jobs in search of better opportunities persists, and tech workers have proved particularly difficult to hire. Hiring managers surveyed by Upwork said data science and analytics roles would be the hardest to hire for over the next six months (60%), followed by architecture and engineering (58%) and IT & networking (58%).


Serverless Is the New Timeshare

There’s one great use case I can think of: webhooks. Getting the duct tape code for webhooks is always a pain. They don’t trigger often, and dealing with that is a chore. Using a serverless function to just add stuff to the database and do the work can be pretty simple. Since a callback is hard to debug anyway, the terrible debugging experience in serverless isn’t a huge hindrance. But for every other use case, I’m absolutely baffled. People spend so much time checking and measuring throughput, yet just using one slightly larger server and only local calls will yield more throughput than you can possibly need, without all the vendor tie-ins that we fall into. Hosting on Linode, Digital Ocean, etc. would save so much money. On the time-to-market aspect, just using caching and quick local tools would be far easier than anything you can build in the cloud. Containers are good progress and they made this so much simpler, yet we dropped the ball on this and went all in on complexity with stuff like Kubernetes. Don’t get me wrong. K8s are great.
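To make the webhook case concrete, here is a minimal sketch of that kind of serverless function: an AWS Lambda handler in Python that simply drops the incoming payload into a DynamoDB table. The table name, key schema, and event fields are assumptions for illustration, not anything the author specifies:

    import json
    import uuid

    import boto3

    # Hypothetical DynamoDB table with "id" as its partition key.
    TABLE = boto3.resource("dynamodb").Table("webhook_events")

    def lambda_handler(event, context):
        """Persist the raw webhook payload and acknowledge quickly."""
        body = json.loads(event.get("body") or "{}")
        TABLE.put_item(Item={
            "id": str(uuid.uuid4()),
            "source": body.get("source", "unknown"),
            "payload": json.dumps(body),  # keep the raw event around for later debugging
        })
        return {"statusCode": 200, "body": json.dumps({"ok": True})}

Because the function only acknowledges and stores the event, the hard-to-debug business logic can run later, anywhere you like.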


The 6 most overhyped technologies in IT

These CIOs say that metaverse enthusiasts, including vendors who have a stake in its promotion, have created a sense that this technology will have us all living in a new digital realm. Most aren’t buying it. “Could it turn out to be great? Well, possibly. But so many other things have to change in order for that to work,” says Bob Johnson, CIO of The American University of Paris, who extended his comments to include the related technologies of extended reality (XR), virtual reality (VR), and augmented reality (AR). “They have some wonderful applications, but they don’t change the way we live.” ... CIOs also labeled blockchain as overhyped, noting that the technology has failed to be as transformative or even as useful as hoped nearly a decade into its use. “Initially, the name ‘blockchain’ sounded pretty cool and quickly became a buzzword that drew interest and piqued curiosities,” says Josh Hamit, senior vice president and CIO of Altra Federal Credit Union and a member of ISACA’s Emerging Trends Working Group. “However, in actual practice, it has proved more difficult for many organizations to identify tangible use cases for blockchain, or distributed ledger as it is also known.”


How fusing power and process automation delivers operational resilience

The integration of power and process is a catalyst for operational resilience and improved sustainability across the lifecycle of the plant. This integrated, digitalised approach drives Electrical, Instrumentation and Control (EI&C) CAPEX reductions of up to 20% and OPEX efficiencies, including decreases in unplanned downtime of up to 15%, in addition to improving bottom-line profitability by three points. End users see energy procurement cost reductions of 2-5% and carbon footprint reductions of 7-12% when implementing these strategies. It offers a comprehensive view of asset performance management, energy management, and the value chain from design through construction, commissioning, operations, and maintenance. When undergoing such an integration effort, implementing the right strategies can improve operational resilience through better anticipation of, prevention of, recovery from, and adaptability to market dynamics and events. Plant-wide data collection and reliable control and command exchange between systems, operators and the control room will empower the workforce with clear and verified decision-making.


Multi-stage crypto-mining malware hides in legitimate apps with month-long delay trigger

Once the user downloads and installs an app, the deployment of malicious payloads doesn't happen immediately, which is a strategy to avoid detection. First, the app installer, which is built with a free tool called Inno Setup, reaches out to the developer's website and downloads a password-protected RAR archive that contains the application files. These are deployed under the Program Files (x86)\Nitrokod\[application name] path. The app then checks for the presence of a component called update.exe. If it's not found, it deploys it under the Nitrokod folder and sets up a system scheduled task to execute it after every restart. The installer then collects some information about the victim's system and sends it to the developer's server. Up to this point, the installation is not very different from how a legitimate application would behave: collecting some system data for statistics purposes and deploying what looks like an automatic update component. However, after around four system restarts on four different days, update.exe downloads and deploys another component called chainlink1.07.exe.
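For defenders, the persistence mechanism described above (a scheduled task re-launching update.exe from the Nitrokod folder) is something that can be hunted for. Below is a rough, illustrative Python sketch that scans Windows' scheduled-task listing for that folder name; it is a generic example, not an official detection rule from the researchers:

    import csv
    import io
    import subprocess

    # Dump all scheduled tasks in verbose CSV form using the built-in Windows tool.
    listing = subprocess.run(
        ["schtasks", "/query", "/fo", "csv", "/v"],
        capture_output=True, text=True, check=True,
    ).stdout

    suspicious = [
        row for row in csv.reader(io.StringIO(listing))
        # Flag any task whose definition references the Nitrokod install path.
        if any("nitrokod" in field.lower() for field in row)
    ]

    print(f"{len(suspicious)} scheduled-task entries reference 'Nitrokod'")

Hits would still need manual triage, since a legitimate uninstall leftover could match as well.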


The new work–life balance

The pandemic seemed to render work–life balance a laughable concept. As white-collar workers set up workstations at home, there was no longer a separation of job and personal time or space. So we need something new, something more useful, to help us think about balance in our lives. Here’s an alternative model. ... There is no right mix, per se, and each individual’s outlook will change over time. When we are in our 20s, we can indulge in more of what we want to do. The same is true later in life, when personal interests can be prioritized. It’s those decades of our 30s, 40s, and 50s that can be particularly challenging—raising a family and building a career, which will include jobs that are stepping stones to more fulfilling roles. These chapters of life gave rise to the widely cited U-shaped happiness curve. To me, that three-part pie chart is useful in determining whether we feel a sense of balance in our lives. And it also helps explain some of the meta-narratives of the moment, including the “great resignation” and the persistent desire of employees to work from home. All that time alone during pandemic lockdowns gave people time to consider the meaning of life and prompted many to quit unrewarding jobs.


Edge computing: 4 considerations for success

Automation is usually accomplished through automation workflows close to the edge endpoints and a centralized control layer. Localized execution guards against high latency and connection disruptions, while centralized control provides integrated control of the entire edge environment. ... The edge can become a bit like the Wild West if you let it. Even with automation and management systems in place, it still takes an architectural commitment to maintain a high degree of consistency across the edge (and datacenter) environment. One rationale for a lack of consistency is that devices at the periphery are often smaller and less powerful than servers in a data center. The reasoning then follows that they need to run different software. But this isn’t necessarily the case – or at least, it isn’t the whole story. You can build system images from the small core Linux operating system you run elsewhere and customize it to add exactly what you need in terms of drivers, extensions, and workloads. Images can then be versioned, tested, signed, and deployed as a unit, so your ops team can know exactly what is running on the devices.


How Observability Can Help Manage Complex IT Networks

“Everything in computing is difficult for humans to see, simply because humans are so much slower than any computer,” Morgan says. “Almost anything we can do to provide visibility into what’s really happening inside the application can be a big help in understanding.” This means not just fixing things that break, but improving things that are working, or explaining them to users and new developers. He points to the oldest observability tool, ad-hoc logging -- still in use today -- but adds that tools like distributed tracing can provide a standard layer of visibility into the entire application without requiring application changes. This in turn reduces the burden on developers (less code to write) and on support staff (fewer distinct things to learn). “As an industry, we’ve created many tools for observability over the years, from print statements to distributed tracing,” Morgan says. “Network analytics bring a welcome uniformity to observability.” He adds that at a certain level, network traffic is the same no matter what the application is doing, so you can easily get equivalent transparency for every service in your application.
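To illustrate the spectrum Morgan describes, from the oldest tool (ad-hoc logging) to a standard tracing layer, here is a small Python sketch. It assumes the opentelemetry-api and opentelemetry-sdk packages are installed, and the service and span names are invented for the example:

    import logging

    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

    # The oldest observability tool: ad-hoc logging.
    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("checkout-service")

    # A standard layer: emit spans that a tracing backend can stitch together.
    trace.set_tracer_provider(TracerProvider())
    trace.get_tracer_provider().add_span_processor(
        SimpleSpanProcessor(ConsoleSpanExporter())
    )
    tracer = trace.get_tracer("checkout-service")

    def handle_order(order_id: str) -> None:
        with tracer.start_as_current_span("handle_order") as span:
            span.set_attribute("order.id", order_id)   # machine-searchable context
            log.info("processing order %s", order_id)  # human-readable breadcrumb

    handle_order("42")

In a real deployment the console exporter would be swapped for one that ships spans to a collector, which is where the uniform, service-by-service view comes from.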


As States Ban Ransom Payments, What Could Possibly Go Wrong?

Victims may not know exactly what all ransomware attackers have encrypted or stolen, and finding out may take substantial time and energy. Likewise, negotiators can sometimes reduce the ransom being demanded by a large factor. In some cases, attackers may also provide a decryptor without a victim having to pay. Perhaps state legislators are attempting to look tough by essentially telling ransomware gangs to look elsewhere. No doubt they also don't want the political baggage associated with spending taxpayer money to enrich criminals. "A ransomware payment to the evil 'insert one of four known protagonists'-affiliated cybercriminals for multimillion-dollar amounts is bad optics at the political level when infrastructure is crumbling, inflation is climbing and social services such as policing and justice, healthcare, and other government services are under immense strain and financial pressure," says Ian Thornton-Trump, CISO of Cyjax. Previously, he says, many victims could pay for cleanup - and sometimes the ransom payment - using their cyber insurance or by making a business-disruption claim.


Outdated infrastructure not up to today’s ransomware challenges

Challenges pertaining to outdated infrastructure could easily be compounded by the fact that many IT and security teams don’t seem to have a plan in place to mobilize if and when a cyber attack occurs. Nearly 60% of respondents expressed some level of concern about whether their IT and security teams would be able to mobilize efficiently to respond to an attack. These are just some of the findings from an April 2022 survey, conducted by Censuswide, of more than 2,000 IT and SecOps professionals (split nearly 50/50 between the two groups) in the United States, the United Kingdom, Australia and New Zealand. All respondents play a role in the decision-making process for IT or security within their organizations. “IT and security teams should raise the alarm bell if their organization continues to use antiquated technology to manage and secure their most critical digital asset – their data,” said Brian Spanswick, CISO, Cohesity. “Cyber criminals are actively preying on this outdated infrastructure as they know it was not built for today’s dispersed, multicloud environments, nor was it built to help companies protect and rapidly recover from sophisticated cyberattacks.”



Quote for the day:

"Speaking about it and doing it are not the same thing." -- Gordon Tredgold

Daily Tech Digest - August 29, 2022

6 key board questions CIOs must be prepared to answer

The board wants assurances that the CIO has command of tech investments tied to corporate strategy. “Demystify that connection,” Ferro says. “Show how those investments tie to the bigger picture and show immediate return as much as you can.” Global CIO and CDO Anupam Khare tries to educate the board of manufacturer Oshkosh Corp. in his presentations. “My slide deck is largely in the context of the business so you can see the benefit first and the technology later. That creates curiosity about how this technology creates value,” Khare says. “When we say, ‘This project or technology has created this operating income impact on the business,’ that’s the hook. Then I explain the driver for that impact, and that leads to a better understanding of how the technology works.” Board members may also come in with technology suggestions of their own that they hear about from competitors or from other boards they’re on. ... Avoid the urge to break out technical jargon to explain the merits of new cloud platforms, customer-facing apps, or Slack as a communication tool, and “answer that question from a business context, not from a technology context,” Holley says.


From applied AI to edge computing: 14 tech trends to watch

Mobility has arrived at a “great inflection” point — a shift towards autonomous, connected, electric and smart technologies. This shift aims to disrupt markets while improving efficiency and sustainability of land and air transportation of people and goods. ACES technologies for road mobility saw significant adoption during the past decade, and the pace could accelerate because of sustainability pressures, McKinsey said. Advanced air-mobility technologies, on the other hand, are either in pilot phase — for example, airborne-drone delivery — or remain in the early stages of development — for example, air taxis — and face some concerns about safety and other issues. Overall, mobility technologies, which attracted $236bn last year, aim to improve the efficiency and sustainability of land and air transportation of people and goods. ... It focuses on the use of goods and services that are produced with minimal environmental impact by using low carbon technologies and sustainable materials. At a macro level, sustainable consumption is critical to mitigating environmental risks, including climate change.


Why Memory Enclaves Are The Foundation Of Confidential Computing

Data encryption has been around for a long time. It was first made available for data at rest on storage devices like disk and flash drives as well as data in transit as it passed through the NIC and out across the network. But data in use – literally data in the memory of a system within which it is being processed – has not, until fairly recently, been protected by encryption. With the addition of memory encryption and enclaves, it is now possible to actually deliver a Confidential Computing platform with a trusted execution environment (TEE) that provides data confidentiality. This not only stops unauthorized entities, either people or applications, from viewing data while it is in use, in transit, or at rest. ... It effectively allows enterprises in regulated industries as well as government agencies and multi-tenant cloud service providers to better secure their environments. Importantly, Confidential Computing means that any organization running applications on the cloud can be sure that any other users of the cloud capacity and even the cloud service providers themselves cannot access the data or applications residing within a memory enclave.


Metasurfaces offer new possibilities for quantum research

Metasurfaces are ultrathin planar optical devices made up of arrays of nanoresonators. Their subwavelength thickness (a few hundred nanometers) renders them effectively two-dimensional. That makes them much easier to handle than traditional bulky optical devices. Even more importantly, because of this reduced thickness, momentum conservation for the photons is relaxed, since the photons have to travel through far less material than with traditional optical devices: according to the uncertainty principle, confinement in space leads to undefined momentum. This allows for multiple nonlinear and quantum processes to happen with comparable efficiencies and opens the door for the usage of many new materials that would not work in traditional optical elements. For this reason, and also because they are compact and more practical to handle than bulky optical elements, metasurfaces are coming into focus as sources of photon pairs for quantum experiments. In addition, metasurfaces could simultaneously transform photons in several degrees of freedom, such as polarization, frequency, and path.


Agile: Starting at the top

Having strong support was key to this change in beliefs among the leadership team. Aisha Mir, IT Agile Operations Director for Thales North America, has a track record of successful agile transformations under her belt and was eager to help the leadership team overcome any initial hurdles. “The best thing I saw out of previous transformations I’ve been a part of was the way that the team started working together and the way they were empowered. I really wanted that for our team,” says Mir. “In those first few sprints, we saw that there were ways for all of us to help each other, and that’s when the rest of the team began believing. I had seen that happen before – where the team really becomes one unit and they see what tasks are in front of them – and they scrum together to finish it.” While the support was essential, one motivating factor helped them work through any challenge in their way: How could they ask other parts of the IT organization to adopt agile methodologies if they couldn’t do it themselves? “When we started, we all had some level of skepticism but were willing to try it because we knew this was going to be the life our organization was going to live,” says Daniel Baldwin.


AutoML: The Promise vs. Reality According to Practitioners

The data collection, data tagging, and data wrangling of pre-processing are still tedious, manual processes. There are utilities that provide some time savings and aid in simple feature engineering, but overall, most practitioners do not make use of AutoML as they prepare data. In post-processing, AutoML offerings have some deployment capabilities. But deployment is famously a problematic interaction between MLOps and DevOps in need of automation. Take, for example, one of the most common post-processing tasks: generating reports and sharing results. While cloud-hosted AutoML tools are able to auto-generate reports and visualizations, our findings show that users still adopt manual approaches to modify the default reports. The second most common post-processing task is deploying models. Automated deployment was only available to users of hosted AutoML tools, and even then limitations existed around security and end-user experience. The failure of AutoML to be end-to-end can actually cut into the efficiency improvements.
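A toy sketch of that imbalance: the model search in the middle is automated (here scikit-learn's GridSearchCV stands in for a hosted AutoML product), while the data preparation before it and the reporting and deployment after it remain manual. The CSV path and column names are illustrative assumptions:

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV

    # Manual pre-processing: still on the practitioner.
    df = pd.read_csv("leads.csv")  # hypothetical dataset
    df = df.dropna(subset=["age", "income", "converted"])
    X, y = df[["age", "income"]], df["converted"]

    # Automated model/hyperparameter search: the part today's tools handle well.
    search = GridSearchCV(
        RandomForestClassifier(random_state=0),
        param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
        cv=5,
    )
    search.fit(X, y)
    print(search.best_params_, search.best_score_)

    # Post-processing (reports, deployment) is again largely manual.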


Best Practices for Building Serverless Microservices

There are two schools of thought when it comes to structuring your repositories for an application: monorepo vs multiple repos. A monorepo is a single repository that has logical separations for distinct services. In other words, all microservices would live in the same repo but would be separated by different folders. Benefits of a monorepo include easier discoverability and governance. Drawbacks include the size of the repository as the application scales, a large blast radius if the master branch is broken, and ambiguity of ownership. On the flip side, having a repository per microservice has its ups and downs. Benefits of multiple repos include distinct domain boundaries, clear code ownership, and succinct and minimal repo sizes. Drawbacks include the overhead of creating and maintaining multiple repositories and applying consistent governance rules across all of them. In the case of serverless, I opt for a repository per microservice. It draws clear lines for what the microservice is responsible for and keeps the code lightweight and focused.


5 Super Fast Ways To Improve Core Web Vitals

High-quality images consume more space. When the image size is big, your loading time will increase. If the loading time increases, the user experience will be affected. So, keeping the image size as small as possible is best. Compress the image size. If you have created your website using WordPress, you can use plugins like ShortPixel to compress the image size. If not, many online sites are available to compress image size. However, you might wonder: does compression affect the quality of the image? To some extent, yes, it will reduce the quality, but the loss is only visible when zooming in on the image. Moreover, use JPEG format for images and SVG format for logos and icons. It is even better if you can use WebP format. ... One of the important metrics of the core web vitals is Cumulative Layout Shift. Imagine that you're scrolling through a website on your phone. You think that you're all set to engage with it. Now, you see a piece of text with a hyperlink that has grabbed your interest, and you're about to click it. When you click it, all of a sudden, the text disappears, and there is an image in the place of the text.
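As a quick sketch of the compression step (assuming the Pillow library is installed; file names and the quality setting are just examples):

    from PIL import Image

    img = Image.open("hero.jpg")

    # Re-encode as lossy WebP: typically much smaller, with quality loss
    # that is only noticeable when zooming in.
    img.save("hero.webp", "WEBP", quality=80)

    # Recompressed JPEG fallback for browsers without WebP support.
    img.save("hero-small.jpg", "JPEG", quality=80, optimize=True)

For the layout-shift problem described above, a common fix is to reserve the media's space up front, for example by declaring explicit width and height on images and embeds, so late-loading content cannot push away the text a reader is about to tap.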


Cyber-Insurance Firms Limit Payouts, Risk Obsolescence

While the insurers' position is understandable, businesses — which have already seen their premiums skyrocket over the past three years — should question whether insurance still mitigates risk effectively, says Pankaj Goyal, senior vice president of data science and cyber insurance at Safe Security, a cyber-risk analysis firm. "Insurance works on trust, [so answer the question,] 'will an insurance policy keep me whole when a bad event happens?' " he says. "Today, the answer might be 'I don't know.' When customers lose trust, everyone loses, including the insurance companies." ... Indeed, the exclusion will likely result in fewer companies relying on cyber insurance as a way to mitigate catastrophic risk. Instead, companies need to make sure that their cybersecurity controls and measures can mitigate the cost of any catastrophic attack, says David Lindner, chief information security officer at Contrast Security, an application security firm. Creating data redundancies, such as backups, expanding visibility of network events, using a trusted forensics firm, and training all employees in cybersecurity can all help harden a business against cyberattacks and reduce damages.


Data security hinges on clear policies and automated enforcement

The key is to establish policy guardrails for internal use to minimize cyber risk and maximize the value of the data. Once policies are established, the next consideration is establishing continuous oversight. This component is difficult if the aim is to build human oversight teams, because combining people, processes, and technology is cumbersome, expensive, and not 100% reliable. Training people to manually combat all these issues is not only hard but requires a significant investment over time. As a result, organizations are looking to technology to provide long-term, scalable, and automated policies to govern data access and adhere to compliance and regulatory requirements. They are also leveraging these modern software approaches to ensure privacy without forcing analysts or data scientists to “take a number” and wait for IT when they need access to data for a specific project or even everyday business use. With a focus on establishing policies and deciding who gets to see/access what data and how it is used, organizations gain visibility into and control over appropriate data access without the risk of overexposure. 



Quote for the day:

"Leadership is a journey, not a destination. It is a marathon, not a sprint. It is a process, not an outcome." -- John Donahoe

Daily Tech Digest - August 28, 2022

How to build a winning analytics team

Analytics teams thrive in dynamic environments that reward curiosity, encourage innovation, and set high expectations. Building and reinforcing this type of culture can help put organizations on a path to earning impressive returns from analytics investments. An active analytics culture thrives when CXOs reward curiosity over perfection. Encourage analysts to challenge convention and ask questions as a method to improve quality and reduce risks. This thinking goes hand in hand with a test-and-learn mentality, where pushing boundaries through proactive experimentation helps identify what works, and optimize accordingly. It’s also important to create a culture where failure and success are celebrated equally. Giving airtime to what went wrong allows the team to more effectively learn from their mistakes and see that perfection is an unhealthy pipe dream. This encourages an environment that holds analysts accountable for delivering quality processes and results, further helping to mitigate risk and improve marketing programs.


How SSE Renewables uses Azure Digital Twins for more than machines

This approach will allow SSE to experiment with reducing risks to migrating birds. For example, they can determine an optimum blade speed that will allow flocks to pass safely while still generating power. By understanding the environment around the turbines, it will be possible to control them more effectively and with significantly less environmental impact. Simon Turner, chief technology officer for data and AI at Avanade, described this approach as “an autonomic business.” Here, data and AI work together to deliver a system that is effectively self-operating, one he described as using AI to “look after certain things that you understood that could guide the system to make decisions on your behalf.” Key to this approach is extending the idea of a digital twin with machine learning and large-scale data. ... As Turner notes, this approach can be extended to more than wind farms, using it to model any complex system where adding new elements could have a significant effect, such as understanding how water catchment areas work or how hydroelectric systems can be tuned to let salmon pass unharmed on their way to traditional breeding grounds, while still generating power.


McKinsey report: Two AI trends top 2022 outlook

Roger Roberts, partner at McKinsey and one of the report’s coauthors, said of applied AI, which is defined “quite broadly” in the report, “We see things moving from advanced analytics towards… putting machine learning to work on large-scale datasets in service of solving a persistent problem in a novel way.” That move is reflected in an explosion of publications around AI, not just because AI scientists are publishing more, but because people in a range of domains are using AI in their research and pushing the application of AI forward, he explained. ... According to the McKinsey report, industrializing machine learning (ML) “involves creating an interoperable stack of technical tools for automating ML and scaling up its use so that organizations can realize its full potential.” The report noted that McKinsey expects industrializing ML to spread as more companies seek to use AI for a growing number of applications. “It does encompass MLops, but it extends more fully to include the way to think of the technology stack that supports scaling, which can get down to innovations at the microprocessor level,” said Roberts.


CISA: Prepare now for quantum computers, not when hackers use them

The main negative implication of quantum computing concerns the cryptography of secrets, a fundamental element of information security. Cryptographic schemes that are today considered secure will be cracked in mere seconds by quantum computers, leaving people, companies, and entire countries powerless against the computing supremacy of their adversaries. “When quantum computers reach higher levels of computing power and speed, they will be capable of breaking public key cryptography, threatening the security of business transactions, secure communications, digital signatures, and customer information,” explains CISA. This could threaten data in transit relating to top-secret communications, banking operations, military operations, government meetings, critical industrial processes, and more. Yesterday, China's Baidu introduced “Qian Shi,” an industry-level quantum computer capable of achieving stable performance at 10 quantum bits (qubits) of power.


How Are Business Intelligence And Data Management Related?

Business intelligence (BI) describes the procedures and tools that assist in getting helpful, usable information and intelligence from data. A company’s data is accessed by business intelligence tools, which then display analytics and insights as reports, dashboards, graphs, summaries, and charts. Business intelligence has advanced significantly from its theoretical inception in the 1950s, and you must realize that it is not just a tool for big businesses. Most BI providers are tailoring their software to users’ needs because they recognize that our current era is considerably more oriented toward small structures like start-ups. SaaS, or software-as-a-service, vendors are particularly responsible for this shift. Another change is that BI is a more straightforward tool than it once was. It is still a professional tool; managing data is not simple, even with the most powerful technology. Nevertheless, with the emergence of the cloud and SaaS in the early 21st century, BI has developed into something more accessible than local software, which used to require installation on every computer in the organization and could represent a sizable expenditure.


Oxford scientist says greedy physicists have overhyped quantum computing

It’s unclear why Dr. Gourianov would leave big tech out of the argument entirely. There are dozens upon dozens of papers from Google and IBM alone demonstrating breakthrough after breakthrough in the field. Gourianov’s primary argument against quantum computing appears, inexplicably, to be that they won’t be very useful for cracking quantum-resistant encryption. With respect, that’s like saying we shouldn’t develop surgical scalpels because they’re practically useless against chain mail armor. Per Gourianov’s article: Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones. This appears to suggest that Gourianov believes at least some physicists have pulled a bait-and-switch on governments and investors by convincing everyone that we need quantum computers for security.


Computer vision is primed for business value

In healthcare, computer vision is used extensively in diagnostics, such as in AI-powered image and video interpretation. It is also used to monitor patients for safety, and to improve healthcare operations, says Gartner analyst Tuong Nguyen. “The potential for computer vision is enormous,” he says. “It’s basically helping machines make sense of the world. The applications are infinite — really, anything you need to see. The entire world.” According to the fourth annual Optum survey on AI in healthcare, released at the end of 2021, 98% of healthcare organizations either already have an AI strategy or are planning to implement one, and 99% of healthcare leaders believe AI can be trusted for use in health care. Medical image interpretation was one of the top three areas cited by survey respondents where AI can be used to improve patient outcomes. The other two areas, virtual patient care and medical diagnosis, are also ripe for computer vision. Take, for example, idiopathic pulmonary fibrosis, a deadly lung disease that affects hundreds of thousands of people worldwide.


AI Therapy: Digital Solution to Address Mental Health Issues

AI for health has been a long-discussed topic, specifically around therapy and bringing digital solutions to mental health issues. Some applications have already been developed, such as Genie in a Headset, which manages human emotional behavior in work environments. But bringing AI into therapy means building an AI that feels and is keen to improve mental health issues. The fundamental objective of AI therapy is to assist patients in fighting mental illnesses. Ideally, this technology would be able to distinguish each patient’s needs and personalize their mental health programs through an efficient data collection process. ... Psychological therapy is a tough job that requires extracting confidential information from patients that they hesitate to share. Like any other medical issue, it is essential to diagnose the problem before curing it, and it requires exquisite skill to make someone comfortable. An AI therapist can access your cellphone, laptop, personal data, emails, all-day movement, and routine, making it more efficient at understanding you and your problems. Knowing problems in depth gives an AI therapist an advantage over a human therapist.


What is the Microsoft Intelligent Data Platform?

The pieces that make up the Microsoft Intelligent Data Platform are services you may already be using because it includes all of Microsoft’s key data services, such as SQL Server 2022, Azure SQL, Cosmos DB, Azure Synapse, Microsoft Purview and more. But you’re probably not using them together as well as you could; the Intelligent Data Platform is here to make that easier. “These are the best-in-class services across what we consider the three core pillars of a data platform,” Mansour explained. According to Mansour, the Microsoft Intelligent Data Platform offers services for databases and operational data store, analytics, and data governance, providing authorized users with insight that will allow them to properly understand, manage and govern their business’s data. “Historically, customers have been thinking about each of those areas independent from one another, and what the Intelligent Data Platform does is bring all these pieces together,” said Mansour. Integrating databases, analytics and governance isn’t new either, but the point of presenting this as a platform is the emphasis on simplifying the experience of working with it. 


Threatening clouds: How can enterprises protect their public cloud data?

Public clouds don’t inherently pose security threats, said Gartner VP analyst Patrick Hevesi — in fact, hyperscale cloud providers usually have more security layers, people and processes in place than most organizations can afford in their own data centers. However, the biggest red flag for organizations when selecting a public cloud provider is a lack of visibility into its security measures, he said. Some of the biggest issues in recent memory: misconfigurations of cloud storage buckets, said Hevesi, which have opened files up to data exfiltration. Some cloud providers have also had outages due to misconfigurations of identity platforms, which prevented their cloud services from starting up properly and in turn affected tenants. Smaller cloud providers, meanwhile, have been taken offline by distributed denial-of-service (DDoS) attacks. This is when perpetrators make a machine or network resource unavailable to intended users by disrupting services — either short-term or long-term — of a host connected to a network.
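To make the storage-bucket misconfiguration point concrete, here is a small boto3 sketch that flags S3 buckets lacking a full public-access block. It assumes AWS credentials are already configured and is an illustration, not a complete cloud-posture check:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            cfg = s3.get_public_access_block(Bucket=name)
            blocked = all(cfg["PublicAccessBlockConfiguration"].values())
        except ClientError:
            # No public-access-block configuration at all is the riskiest state.
            blocked = False
        if not blocked:
            print(f"review bucket: {name} (public access not fully blocked)")

A real cloud security posture management tool would also check bucket policies, ACLs and encryption settings, but even this minimal audit catches the most common mistake described above.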



Quote for the day:

“Real integrity is doing the right thing, knowing that nobody’s going to know whether you did it or not.” -- Oprah Winfrey

Daily Tech Digest - August 27, 2022

Intel Hopes To Accelerate Data Center & Edge With A Slew Of Chips

McVeigh noted that Intel’s integrated accelerators will be complemented by the upcoming discrete GPUs. He called the Flex Series GPUs “HPC on the edge,” with their low power envelopes, and pointed to Ponte Vecchio – complete with 100 billion transistors in 47 chiplets that leverage both Intel 7 manufacturing processes and 5 nanometer and 7 nanometer processes from Taiwan Semiconductor Manufacturing Co – and then Rialto Bridge. Both Ponte Vecchio and Sapphire Rapids will be key components in Argonne National Laboratory’s Aurora exascale supercomputer, which is due to power on later this year and will deliver more than 2 exaflops of peak performance. ... “Another part of the value of the brand here is around the software unification across Xeon, where we leverage the massive amount of capabilities that are already established through decades throughout that ecosystem and bring that forward onto our GPU rapidly with oneAPI, really allowing for both the sharing of workloads across CPU and GPU effectively and to ramp the codes onto the GPU faster than if we were starting from scratch,” he said.


Performance isolation in a multi-tenant database environment

Our multi-tenant Postgres instances operate on bare metal servers in non-containerized environments. Each backend application service is considered a single tenant, and it may use one of multiple Postgres roles. Because each cluster serves multiple tenants, all tenants share and contend for available system resources such as CPU time, memory, and disk IO on each cluster machine, as well as finite database resources such as server-side Postgres connections and table locks. Each tenant has a unique workload that varies in system-level resource consumption, making it impossible to enforce throttling using a global value. This has become problematic in production, affecting neighboring tenants in two ways. Throughput: a tenant may issue a burst of transactions, starving shared resources from other tenants and degrading their performance. Latency: a single tenant may issue very long or expensive queries, often concurrently, such as large table scans for ETL extraction or queries with lengthy table locks. Both of these scenarios can result in degraded query execution for neighboring tenants. Their transactions may hang or take significantly longer to execute due to either reduced CPU share time or slower disk IO caused by many seeks from misbehaving tenant(s).
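One blunt but common way to cap how much a single misbehaving tenant can hurt its neighbors is to set per-role limits on the server rather than one global value: statement timeouts for runaway queries and connection caps per role. A minimal sketch using psycopg2 follows; the role names, limits, and DSN are hypothetical, and this is not necessarily how the article's authors implemented isolation:

    import psycopg2

    # Hypothetical per-tenant budgets (values chosen purely for illustration).
    TENANT_LIMITS = {
        "tenant_etl": {"timeout": "120s", "connections": 10},
        "tenant_web": {"timeout": "5s", "connections": 50},
    }

    conn = psycopg2.connect("dbname=app user=admin")  # assumed DSN
    conn.autocommit = True
    with conn.cursor() as cur:
        for role, limits in TENANT_LIMITS.items():
            # Role names and limits come from the trusted dict above, so plain
            # string formatting is acceptable for this sketch.
            # Cancel this role's queries once they exceed their time budget ...
            cur.execute(f"ALTER ROLE {role} SET statement_timeout = '{limits['timeout']}'")
            # ... and stop it from monopolizing server-side connections.
            cur.execute(f"ALTER ROLE {role} CONNECTION LIMIT {limits['connections']}")
    conn.close()

Per-role settings like these only bound the worst cases; finer-grained isolation (CPU share, disk IO) still has to come from outside Postgres.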


Quantum Encryption Is No More A Sci-Fi! Real-World Consequences Await

Quantum computing will enable enterprise customers to perform complex simulations in significantly less time than traditional software by using quantum computers. Quantum algorithms are very challenging to develop, implement, and test on current quantum computers. Quantum techniques are also being used to improve the randomness of computer-based random number generators. The world’s leading quantum scientists in the field of quantum information engineering are working to turn what was once the realm of science fiction into reality. Businesses need to deploy next-generation data security solutions with equally powerful protection based on the laws of quantum physics, literally fighting quantum computers with quantum encryption. Quantum computers today are no longer considered to be science fiction. The main difference is that quantum encryption uses quantum bits, or qubits, composed of optical photons, compared to the electrical binary digits, or bits, used in classical encryption. Qubits can also be inextricably linked together using a phenomenon called quantum entanglement.


What Is The Difference Between Computer Vision & Image processing?

We are constantly exposed to and engaged with various visually similar objects around us. By using machine learning techniques, the discipline of AI known as computer vision enables machines to see, comprehend, and interpret the visual environment around us. It uses machine learning approaches to extract useful information from digital photos, videos, or other visual inputs by identifying patterns. Although computer vision and image processing have a similar look and feel, they differ in a few ways. Computer vision aims to distinguish between, classify, and arrange images according to their distinguishing characteristics, such as size, color, etc. This is similar to how people perceive and interpret images. ... Digital image processing uses a digital computer to process digital and optical images. A computer views an image as a two-dimensional signal composed of pixels arranged in rows and columns. A digital image comprises a finite number of elements, each located in a specific place with a particular value. These elements are referred to as picture elements, image elements, or pixels.
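A tiny sketch of that "image as a two-dimensional array of pixels" view, using NumPy and Pillow (the file name is an assumption):

    import numpy as np
    from PIL import Image

    # Load an image as an array of pixels: rows x columns (x colour channels).
    img = np.asarray(Image.open("photo.jpg"))
    print(img.shape, img.dtype)  # e.g. (480, 640, 3) uint8

    # Classic image processing operates directly on those pixel values.
    gray = img.mean(axis=2).astype(np.uint8)       # naive grayscale
    binary = (gray > 128).astype(np.uint8) * 255   # simple threshold

    Image.fromarray(binary).save("photo_threshold.png")

Computer vision would typically start from the same arrays but pass them to a trained model to classify or recognize what the image contains.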


Lessons in mismanagement

In the decades since the movie’s release, the world has become a different place in some important ways. Women are now everywhere in the world of business, which has changed irrevocably as a result. Unemployment is quite low in the United States and, by Continental standards, in Europe. Recent downturns have been greeted by large-scale stimuli from central banks, which have blunted the impact of stock market slides and even a pandemic. But it would be foolish to think that the horrendous managers and desperate salesmen of Glengarry Glen Ross exist only as historical artifacts. Mismanagement and desperation go hand in hand and are most apparent during hard times, which always come around sooner or later. By immersing us in the commercial and workplace culture of the past, movies such as Glengarry can help us understand our own business culture. But they can also help prepare us for hard times to come—and remind us how not to manage, no matter what the circumstances. ... Everyone, in every organization, has to perform. 


How the energy sector can mitigate rising cyber threats

As energy sector organisations continue expanding their connectivity to improve efficiency, they must ensure that the perimeters of their security processes keep up. Without properly secured infrastructure, no digital transformation will ever be successful, and not only internal operations, but also the data of energy users are bound to become vulnerable. But by following the above recommendations, energy companies can go a long way in keeping their infrastructure protected in the long run. This endeavour can be strengthened further by partnering with cyber security specialists like Dragos, which provides an all-in-one platform that enables real-time visualisation, protection and response against ever present threats to the organisation. These capabilities, combined with threat intelligence insights and supporting services across the industrial control system (ICS) journey, are sure to provide peace of mind and added confidence in the organisation’s security strategy. For more information on Dragos’s research around cyber threat activity targeting the European energy sector, download the Dragos European Industrial Infrastructure Cyber Threat Perspective report here.


How to hire (and retain) Gen Z talent

The global pandemic has forever changed the way we work. The remote work model has been successful, and we’ve learned that productivity does not necessarily decrease when managers and their team members are not physically together. This has been a boon for Gen Z – a generation that grew up surrounded by technology. Creating an environment that gives IT employees the flexibility to conduct their work remotely has opened the door to a truly global workforce. Combined with the advances in digital technologies, we’ve seen a rapid and seamless transition in how employment is viewed. Digital transformation has leveled the playing field for many companies by changing requirements around where employees need to work. Innovative new technologies, from videoconferencing to IoT, have shifted the focus from an employee’s location to their ability. Because accessing information and managing vast computer networks can be done remotely, the location of workers has become a minor issue.


'Sliver' Emerges as Cobalt Strike Alternative for Malicious C2

Enterprise security teams, which over the years have honed their ability to detect the use of Cobalt Strike by adversaries, may also want to keep an eye out for "Sliver." It's an open source command-and-control (C2) framework that adversaries have increasingly begun integrating into their attack chains. "What we think is driving the trend is increased knowledge of Sliver within offensive security communities, coupled with the massive focus on Cobalt Strike [by defenders]," says Josh Hopkins, research lead at Team Cymru. "Defenders are now having more and more successes in detecting and mitigating against Cobalt Strike. So, the transition away from Cobalt Strike to frameworks like Sliver is to be expected," he says. Security researchers from Microsoft this week warned about observing nation-state actors, ransomware and extortion groups, and other threat actors using Sliver along with — or often as a replacement for — Cobalt Strike in various campaigns. Among them is DEV-0237, a financially motivated threat actor associated with the Ryuk, Conti, and Hive ransomware families; and several groups engaged in human-operated ransomware attacks, Microsoft said.


Data Management in the Era of Data Intensity

When your data is spread across multiple clouds and systems, it can introduce latency, performance, and quality problems. And bringing together data from different silos and getting those data sets to speak the same language is a time- and budget-intensive endeavor. Your existing data platforms also may prevent you from managing hybrid data processing, which, as Ventana Research explains, “enable[s] analysis of data in an operational data platform without impacting operational application performance or requiring data to be extracted to an external analytic data platform.” The firm adds: “Hybrid data processing functionality is becoming increasingly attractive to aid the development of intelligent applications infused with personalization and artificial intelligence-driven recommendations.” Such applications are clearly important because they can be key business differentiators and enable you to disrupt a sector. However, if you are grappling with siloed systems and data, and with legacy technology that cannot ingest high volumes of complex data quickly enough for you to act in the moment, you may believe it is impossible for your business to benefit from the data synergies that you and your customers might otherwise enjoy.


How to Achieve Data Quality in the Cloud

Everybody knows data quality is essential, and most companies spend significant money and resources trying to improve it. Despite these investments, companies still lose an estimated $9.7 million to $14.2 million annually because of poor-quality data. Traditional data quality programs do not work well for identifying data errors in cloud environments. Most organizations only look at the data risks they know, which is likely only the tip of the iceberg: data quality programs usually focus on completeness, integrity, duplicates and range checks, yet these checks represent only 30 to 40 percent of all data risks. Many data quality teams do not check for data drift, anomalies or inconsistencies across sources, which contribute to over 50 percent of data risks. Meanwhile, the number of data sources, processes and applications has exploded because of the rapid adoption of cloud technology, big data applications and analytics. These data assets and processes require careful data quality control to prevent errors in downstream processes, and a data engineering team can add hundreds of new data assets to the system in a short period.
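
To make the contrast concrete, here is a minimal sketch, not taken from the article, of the two families of checks it describes: the classic completeness, duplicate and range checks versus a simple drift check between two loads of the same feed. The record shape, column names and thresholds are illustrative assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Yesterday's and today's loads of a hypothetical orders feed.
var previous = new List<Order>
{
    new("A1", 100m, DateTime.UtcNow.AddDays(-1)),
    new("A2", 120m, DateTime.UtcNow.AddDays(-1)),
};
var current = new List<Order>
{
    new("B1", 95m,      DateTime.UtcNow),
    new("B1", 500_000m, DateTime.UtcNow),  // duplicate id, out-of-range amount
    new("",   110m,     DateTime.UtcNow),  // missing id
};

// Classic checks: completeness, duplicates, range.
int missingIds = current.Count(o => string.IsNullOrWhiteSpace(o.Id));
int duplicates = current.Count - current.Select(o => o.Id).Distinct().Count();
int outOfRange = current.Count(o => o.Amount < 0m || o.Amount > 100_000m);

// Drift check: flag a large shift in the mean amount between loads.
double meanBefore = (double)previous.Average(o => o.Amount);
double meanNow    = (double)current.Average(o => o.Amount);
bool drifted = Math.Abs(meanNow - meanBefore) / meanBefore > 0.25;

Console.WriteLine($"missing={missingIds} duplicates={duplicates} outOfRange={outOfRange} drift={drifted}");

record Order(string Id, decimal Amount, DateTime CreatedAt);
```

In practice these rules would run against warehouse tables rather than in-memory lists, but the categories map directly onto the checks the excerpt names, and the drift comparison is the kind of test many traditional programs omit.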



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - August 26, 2022

CISA: Just-Disclosed Palo Alto Networks Firewall Bug Under Active Exploit

Bud Broomhead, CEO at Viakoo, says bugs that can be marshaled into service to support DDoS attacks are in increasing demand among cybercriminals -- and are increasingly exploited. "The ability to use a Palo Alto Networks firewall to perform reflected and amplified attacks is part of an overall trend to use amplification to create massive DDoS attacks," he says. "Google's recent announcement of an attack which peaked at 46 million requests per second, and other record-breaking DDoS attacks will put more focus on systems that can be exploited to enable that level of amplification." The speed of weaponization also fits the trend of cyberattackers taking less and less time to put newly disclosed vulnerabilities to work — but this also points to an increased interest in lesser-severity bugs on the part of threat actors. "Too often, our researchers see organizations move to patch the highest-severity vulnerabilities first based on the CVSS," Terry Olaes, director of sales engineering at Skybox Security, wrote in an emailed statement.


Kestrel: The Microsoft web server you should be using

Kestrel is an interesting option for anyone building .NET web applications. It’s a relatively lightweight server compared to IIS, and as it’s cross-platform, it simplifies how you might choose a hosting platform. It’s also suitable as a development tool, running on desktop hardware for tests and experimentation. There’s support for HTTPS, HTTP/2, and a preview release of QUIC, so your code is future-proof and will run securely. The server installs as part of ASP.NET Core and is the default for sites that aren’t explicitly hosted by IIS. You don’t need to write any code to launch Kestrel, beyond using the familiar WebApplication.CreateBuilder method. Microsoft has designed Kestrel to operate with minimal configuration, using a settings file that’s created either when you use dotnet new to set up app scaffolding or when you create a new app in Visual Studio. Apps can configure Kestrel further using the APIs in WebApplication and WebApplicationBuilder, for example to add additional ports. As Kestrel doesn’t run until your ASP.NET Core code runs, this is a relatively easy way to make server configuration dynamic, with any change simply requiring a few lines of code.
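
As a rough sketch of that API surface (the ports and endpoint below are illustrative assumptions, not taken from the article), a minimal ASP.NET Core Program.cs can reconfigure Kestrel entirely through the builder:

```csharp
// Program.cs in a project created with `dotnet new web`
// (Microsoft.NET.Sdk.Web with implicit usings enabled).
var builder = WebApplication.CreateBuilder(args);

builder.WebHost.ConfigureKestrel(options =>
{
    options.ListenAnyIP(8080);                               // plain HTTP on an extra port
    options.ListenAnyIP(8443, listen => listen.UseHttps());  // HTTPS using the dev certificate
});

var app = builder.Build();
app.MapGet("/", () => "Hello from Kestrel");
app.Run();
```

Because the configuration lives in the application itself, changing ports or TLS behavior is an ordinary code change, which is what makes the dynamic-configuration point above so convenient.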


Private 5G networks bring benefits to IoT and edge

Private 5G's potential in enterprise use cases that involve IoT and edge computing is not without challenges that the industry must address; a production-level system requires many touchpoints. Private 5G networks must be planned, deployed, verified and managed by service providers, system integrators and IT teams. Edge computing is a combination of hardware and software. Each of these elements can fail, so they must be maintained and upgraded practically without any downtime, especially for real-time, mission-critical applications. Admins must manage edge deployments with containers or VM orchestration. Both public cloud vendors and managed open source vendors are addressing this space by providing a virtual edge computing framework for application developers. Public cloud vendors have also started to provide out-of-the-box edge infrastructure that runs the same software tools that run on their public cloud, which can make it easier for developers. For private 5G, IoT and edge to be successful, the industry must develop an extensive roadmap. Many of these solutions require long-term maintenance and upgrades.


Google is exiting the IoT services business. Microsoft is doing the opposite

Google will be shuttering its IoT Core service, the company disclosed last week. Its stated reason: Partners can better manage customers' IoT services and devices. While Microsoft also is relying heavily on partners as part of its IoT and edge-computing strategies, it is continuing to build up its stable of IoT services and more tightly integrate them with Azure. CEO Satya Nadella's "intelligent cloud/intelligent edge" pitch is morphing into more of an intelligent end-to-end distributed-computing play. ... Among Microsoft's current IoT offerings: Azure IoT Hub, a service for connecting, monitoring and managing IoT assets; Azure Digital Twins, which uses "spatial intelligence" to model physical environments; Azure IoT Edge, which brings analytics to edge-computing devices; Azure IoT Central; Windows for IoT, which enables users to build edge solutions using Microsoft tools. On the IoT OS front, Microsoft has Azure RTOS, its real-time IoT platform; Azure Sphere, its Linux-based microcontroller OS platform and services; Windows 11 IoT Enterprise and Windows 10 IoT Core -- a legacy IoT OS platform which Microsoft still supports but which hasn't been updated substantially since 2018.


Twitter's Ex-Security Chief Files Whistleblower Complaint

Zatko's complaint alleges that numerous security problems remained unresolved when he left. It also alleges that Twitter had been "penetrated by foreign intelligence agents," including Indian government agents as well as another, unnamed foreign intelligence agency. A federal jury recently found a former Twitter employee guilty of acting as an unregistered agent for Saudi Arabia while at the company. In his February final report to Twitter, Zatko alleged that "inaccurate and misleading" information concerning "Twitter's information security posture" had been transmitted to the company's risk committee, which risked the company making inaccurate reports to regulators, including the FTC. According to his report, the risk committee had been told that "nearly all Twitter endpoints (laptops) have security software installed." But he said the report failed to mention that of about 10,000 systems, 40% were not in compliance with "basic security settings," and 30% "do not have automatic updates enabled."


Announcing built-in container support for the .NET SDK

Containers are an excellent way to bundle and ship applications. A popular way to build container images is through a Dockerfile – a special file that describes how to create and configure a container image. ... This Dockerfile works very well, but there are a few caveats to it that aren’t immediately apparent, which arise from the concept of a Docker build context. The build context is the set of files that are accessible inside a Dockerfile, and it is often (though not always) the same directory as the Dockerfile. If you have a Dockerfile located beside your project file, but your project file is underneath a solution root, it’s very easy for your Docker build context to not include configuration files like Directory.Packages.props or NuGet.config that would be included in a regular dotnet build. You would have this same situation with any hierarchical configuration model, like EditorConfig or repository-local git configurations. This mismatch between the explicitly defined Docker build context and the .NET build process was one of the driving motivators for this feature.
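
As a rough illustration of that caveat (the layout, paths and project name here are hypothetical, not from the announcement), running docker build from the solution root, e.g. docker build -f src/MyApp/Dockerfile -t myapp ., keeps central files such as Directory.Packages.props and NuGet.config inside the build context so the restore can see them:

```dockerfile
# Assumed layout: Directory.Packages.props and NuGet.config at the solution
# root, with the project under src/MyApp. Copy the central files explicitly
# before publishing so the container build matches a regular dotnet build.
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
COPY Directory.Packages.props NuGet.config ./
COPY src/MyApp/ src/MyApp/
RUN dotnet publish src/MyApp/MyApp.csproj -c Release -o /app

FROM mcr.microsoft.com/dotnet/aspnet:6.0
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "MyApp.dll"]
```

The SDK's built-in container support sidesteps this kind of bookkeeping: because the image is produced by dotnet publish itself, the container build sees the same files an ordinary build does.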


The Quantum Computing Threat: Risks and Responses

Asymmetric cryptographic systems are most at risk, implying that today’s public key infrastructure, which forms the basis of almost all of our security infrastructure, would be compromised. That being said, the level of risk may be different depending on the data to be protected – for instance, a life insurance policy that will be valid for many years to come, or a smart city that is built for our next generation. Similarly, the financial system, both centralized and decentralized, may have different vulnerabilities. For this reason, post-quantum security should be addressed as part of an organization’s overall cybersecurity strategy. It is of such importance that both the C-suite and the board should pay attention. While blockchain-based infrastructures are still considered safe, being largely hash-based, transactions are digitally signed using traditional encryption technologies such as elliptic curve and therefore could be quantum-vulnerable at the end points. Blockchain with quantum-safe features will no doubt gain more traction as NFTs, metaverse and crypto-assets continue to mature.


‘Post-Quantum’ Cryptography Scheme Is Cracked on a Laptop

It’s impossible to guarantee that a system is unconditionally secure. Instead, cryptographers rely on enough time passing and enough people trying to break the problem to feel confident. “That does not mean that you won’t wake up tomorrow and find that somebody has found a new algorithm to do it,” said Jeffrey Hoffstein, a mathematician at Brown University. Hence the importance of competitions like NIST’s. In the previous round of the NIST competition, Ward Beullens, a cryptographer at IBM, devised an attack that broke a scheme called Rainbow in a weekend. Like Castryck and Decru, he was only able to stage his attack after he viewed the underlying mathematical problem from a different angle. And like the attack on SIDH, this one broke a system that relied on different mathematics than most proposed post-quantum protocols. “The recent attacks were a watershed moment,” said Thomas Prest, a cryptographer at the startup PQShield. They highlight how difficult post-quantum cryptography is, and how much analysis might be needed to study the security of various systems.


Intel Adds New Circuit to Chips to Ward Off Motherboard Exploits

Under normal operations, once the microcontrollers activate, the security engine loads its firmware. In this motherboard hack, attackers attempt to trigger an error condition by lowering the voltage. The resulting glitch gives attackers the opportunity to load malicious firmware, which provides full access to information such as biometric data stored in trusted platform module circuits. The tunable replica circuit protects systems against such attacks. Nemiroff describes the circuit as a countermeasure to prevent the hardware attack by matching the time and corresponding voltage at which circuits on a motherboard are activated. If the values don't match, the circuit detects an attack and generates an error, which will cause the chip's security layer to activate a failsafe and go through a reset. "The only reason that could be different is because someone had slowed down the data line so much that it was an attack," Nemiroff says. Such attacks are challenging to execute because attackers need to get access to the motherboard and attach components, such as voltage regulators, to execute the hack.


Why Migrating a Database to the Cloud is Like a Heart Transplant

Your migration project’s enemies are surprises. There are numerous differences between databases, from number conversions and date/time handling to language interfaces, missing constructs, rollback behavior, and many others. Proper planning will look at all the technical differences and plan for them. Database migration projects also require time and effort, according to Ramakrishnan, and if they are rushed the results will not be what anyone wants. He recommended that project leaders create a single-page cheat sheet to break down the scope and complexity of the migration to help energize the team. It should include the project’s goals, the number of users impacted, the reports that will be affected by the change, the number of apps it touches, and more. Before embarking on the project, organizations should ask the following question: “How much will it cost to recoup the investment in the new database migration?” Organizations need to check that the economics are sound, and that means also analyzing the opportunity cost for not completing the migration.



Quote for the day:

"Do not follow where the path may lead. Go instead where there is no path and leave a trail." -- Muriel Strode