Daily Tech Digest - March 09, 2022

Small Biz Takes Digital Highway

Data protection and cybersecurity issues will also take centre stage once small and medium businesses adopt digitisation at a larger scale. Subbiah says cyber attacks and complying with laws and policies will require companies to build mechanisms that entail considerable, if not hefty, costs. In fact, small and medium businesses already seem to be bearing the brunt of cyber attacks. A study by Cisco published in September 2021, which sampled about 1,014 local businesses, showed that about 74% of small and medium businesses had faced a cyber incident in the past 12 months. “At the end of the day, digital is here to stay, and nobody can ignore that. I am sure the service segment will evolve to offer solutions at affordable prices,” says Subbiah. Costs, after all, are a big factor for smaller companies, though companies are more than willing to spend on adding technology capabilities due to high return on investment. ... The bulk of the investment goes into cloud, automation and modern infrastructure, say analysts. “Specifically, within cloud, SaaS adoption is seeing acceleration as it entails lower costs and entry barriers,” says Abhinav Johri, director and practice head, digital consulting, EY.


2.5 million-plus cybersecurity jobs are open—women can fill them

Encouraging and nurturing the careers of women in cybersecurity is important for a number of reasons: Our cyber adversaries come from diverse backgrounds, which means that our defender community must be equally diverse in order to understand and succeed against them. We are facing a massive talent shortage in cybersecurity of more than 2.5 million job openings. This is putting a strain on security teams and organizations of every size. We can vastly decrease the deficit by deliberately expanding our hiring and mentorship of underrepresented groups who can bring so much to the table. Innovation is everything! And what’s more conducive to innovation than bringing together new perspectives, ideas, and experiences to solve today’s challenges? Cybersecurity depends on it because cybercrime tactics keep evolving. In fact, an MIT Technology Review article referred to cybersecurity versus cybercrime as “an innovation war.” Studies show that diversity of thought and leadership is just good for business.


IoT comes of age

Cities are near and dear to my heart as a former municipal CIO [chief information officer]. One of the challenges that we’ve seen in a number of large cities around the world is the amount of traffic congestion in the center of cities. A number of different cities have applied congestion pricing. They are tracking when vehicles are in the center of the city and charging for the times when congestion is highest. That doesn’t necessarily make the driver happy, but we have seen material changes in traffic patterns within those cities that have invested in congestion pricing. ... What we saw happen all too often was IoT being treated as a technology project, often run by the CIO or by a small business unit or factory plant all by themselves. And so the technology has changed, but the actual way of work has not. When we look at some of the lighthouse factories that Michael referenced earlier from the World Economic Forum, we see that they treat the integration of IoT as a holistic operating model transformation. When they look at how systems and processes are going to change on the factory floor, for example, they think about how they may need to motivate individuals working within that system differently. 


Critical flaws in remote management agent impact thousands of medical devices

Forescout has identified over 150 potentially vulnerable devices using Axeda from over 100 different manufacturers. Over half of the devices are used in healthcare, specifically lab equipment, surgical equipment, infusion, radiotherapy, imaging and more. Others were found in the financial services, retail, manufacturing and other industries and include ATMs, vending machines, cash management systems, label printers, barcode scanning systems, SCADA systems, asset monitoring and tracking solutions, IoT gateways and machines such as industrial cutters. The seven vulnerabilities, which Forescout has dubbed Access:7, include three critical ones that can result in remote code execution. One vulnerability (CVE-2022-25251) stems from unauthenticated commands present in the Axeda xGate.exe agent that allow an attacker to retrieve information about a device and change the agent's configuration. By changing the configuration, an attacker could point the agent to a server they control and hijack the functionality.


How to approach cloud compliance monitoring

One common strategy is to use the data collected by cloud and network monitoring tools to create a centralized view of compliance status across all these domains. This approach aligns well with current cloud and network monitoring practices. To start a cloud compliance monitoring strategy, divide the tasks identified above. Some are design-time considerations. Here, an application will meet or fall short of compliance standards based on how developers build it. Others are run-time considerations, meaning the application requires surveillance during operations to validate compliance. The specific tools and procedures an organization applies to its cloud applications depend on how compliance requirements map to these categories. Enforce design-time compliance standards in the development pipeline, and validate them through logging and version monitoring. The former requires a systematic way to initiate, execute, review, test and deploy cloud software. Teams must identify tools that enforce and document the requirements of each applicable standard. 
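The design-time versus run-time split above can be sketched as two rule sets evaluated at different stages: one against the build artifacts in the pipeline, one against a snapshot of the running system. This is a minimal illustration; the control names, manifest fields, and rules are invented for the example, not taken from any specific compliance standard.

```python
# Illustrative sketch: splitting compliance controls into design-time checks
# (enforced in the CI/CD pipeline) and run-time checks (validated during
# operations). Control names and rules are hypothetical examples.

DESIGN_TIME_CONTROLS = {
    # control id -> predicate over a (hypothetical) build manifest
    "encrypt-at-rest": lambda m: m.get("storage_encryption") == "enabled",
    "pinned-dependencies": lambda m: all("==" in d for d in m.get("dependencies", [])),
}

RUN_TIME_CONTROLS = {
    # control id -> predicate over a (hypothetical) runtime snapshot
    "tls-only": lambda s: not s.get("open_plaintext_ports", []),
    "audit-logging": lambda s: s.get("audit_log_enabled", False),
}

def evaluate(controls, snapshot):
    """Return a dict mapping each control id to pass/fail."""
    return {name: bool(check(snapshot)) for name, check in controls.items()}

build_manifest = {"storage_encryption": "enabled", "dependencies": ["requests==2.31.0"]}
runtime_state = {"open_plaintext_ports": [80], "audit_log_enabled": True}

design_report = evaluate(DESIGN_TIME_CONTROLS, build_manifest)
runtime_report = evaluate(RUN_TIME_CONTROLS, runtime_state)
print(design_report)
print(runtime_report)
```

In practice the design-time report would gate a deployment, while the run-time report would feed the centralized compliance view the article describes.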


Ukraine Fighting First-Ever 'Hybrid War' - Cyber Official

Ukraine continues to fight not just on the ground and in the air, but also online. "This is happening for the first time in history and I believe that cyber war can only be ended with the end of conventional war, and we will do everything we can to bring this moment closer," SSSCIP's Zhora said at a Friday press conference, the BBC reported. Zhora said Ukrainian cyber defenders continue to repel attacks on the country's online services and infrastructure, and said that "they are not afraid of Russian" attacks focused on such critical infrastructure as power plants or nuclear facilities, the BBC reported. Internet access remains shaky across Ukraine, due in part to continued bombing, says Britain's Ministry of Defense. "Ukrainian internet access is … highly likely being disrupted as a result of collateral damage from Russian strikes on infrastructure," it says. "Over the past week, internet outages have been reported in Mariupol, Sumy, Kyiv and Kharkiv." ... "Russia is probably targeting Ukraine's communications infrastructure in order to reduce Ukrainian citizens' access to reliable news and information," it adds.


7 reasons to embrace Web3 — and 7 reasons not to

Just because Bitcoin wastes so much energy doesn’t mean that Web3 will need to do the same. There are many protocols that offer some genuine assurance of correctness without requiring a bazillion transistors to be constantly solving some mathematical puzzle. Proof of stake, for example, is a neutral, decentralized protocol. It may not be perfect, but maybe we can get by with an adequate consensus model for a number of parts of Web3? Many people might be just as happy with blockchain managed by a coalition of trusted parties. It may not be theoretically free of domination, but if the coalition is big enough and the process is open, it could be embraced at a much lower cost in energy, silicon, and time. ... Our society is increasingly driven by data. Anything we can do to increase the accuracy of the data will help everyone who uses the information to make decisions. One of the side effects of adding more robust digital signatures and protocols to every interaction means that there will be more structure. ... Web3 is bound to have more accurate information and that will lift every part of the web that depends upon it.


The Uncertain Future of IT Automation

As Automox predicted at the end of last year, IT and security transformation continue as organizations everywhere try to find a new normal following the disruptions of the pandemic, and IT automation will have to adjust. This has been challenging for many organizations — and more importantly, people, as discussed above — but there are silver linings too. The pandemic has pushed new innovation across many areas, with exciting new tools and practices on the horizon for IT and security teams. One innovation that is particularly interesting is cybersecurity mesh architectures. Gartner has claimed that “organizations adopting a cybersecurity mesh architecture will reduce the financial impact of security incidents by an average of 90 percent” by 2024. A cybersecurity mesh architecture leverages various parts of the enterprise to integrate widely distributed, disparate security services. This is key to managing and accounting for a workforce that has never been more remote and globally distributed.


Predicting the future of AI and analytics in endpoint security

What’s troubling about Unit 42’s findings for endpoints is that 40% of enterprises are still using spreadsheets to track digital certificates manually, and 57% of enterprises don’t have an accurate inventory of SSH keys. These two factors contribute to the widening gap in endpoint security that bad actors are highly skilled at exploiting. It’s common to find organizations that aren’t tracking up to 40% of their endpoints, according to a recent interview with Jim Wachhaus, attack surface protection evangelist at CyCognito. Jim told VentureBeat that it’s common to find organizations generating thousands of unknown endpoints a year. Supporting Jim’s findings are CISOs who tell VentureBeat that keeping track of every endpoint defies what can be done through manual processes today as their IT staffs are already stretched thin. Add to that how CIOs and CISOs are battling a chronic labor shortage as their best employees are offered 40% or more of their base salary and up to $10,000 signing bonuses to jump to a new company, and the severity of the situation becomes clear. In addition, 56% of executives say their cybersecurity analysts are overwhelmed, according to BCG.


New attack bypasses hardware defenses for Spectre flaw in Intel and ARM CPUs

To mitigate the risk, software vendors such as Google and the Linux kernel developers came up with software-based solutions such as retpoline. While these were effective, they introduced a significant performance hit, so CPU vendors later developed hardware-based defenses. Intel's is called eIBRS and ARM's is called CSV2. The solutions are complex, but "the gist of them is that the predictor 'somehow' keeps track of the privilege level (user/kernel) in which a target is executed," the VUSec researchers explain. "And, as you may expect, if the target belongs to a lower privilege level, kernel execution won’t use it." The problem, however, is that the CPU's predictor relies on a global history to select the target entries to speculatively execute and, as the VUSec researchers proved, this global history can be poisoned. In other words, while the original Spectre v2 allowed attackers to actually inject target code locations and then trick the kernel into executing that code, the new Spectre-BHI/BHB attack can only force the kernel to mispredict and execute interesting code gadgets or snippets that already exist in the history and were executed in the past, but which might leak data.



Quote for the day:

"A pat on the back is only a few vertebrae removed from a kick in the pants, but is miles ahead in results." -- W. Wilcox

Daily Tech Digest - March 08, 2022

Towards Artificial General Intelligence

Of course, there is no way a machine can feel and experience thoughts like a human, but it can compute and relate concepts, and encode human-like experience (e.g., a snake is dangerous and scary, therefore it must be avoided). So, what might be the solution in developing such relational networks, which could bring about a general form of AI called artificial general intelligence (AGI) that could 'think' like a human, in the way proposed at Dartmouth College in 1956? Simply more parameters in a neural network? Recent work conducted in my own lab with colleagues in Belgium has suggested that a new approach of functional contextualism (which differs from current forms of cognitivism — e.g., of memory, attention, and reasoning through logic) may be the solution to progress AI into the generalized form of AGI, where the system learns and understands concepts and how these relate to other concepts (through something called relational frames), and the context in which cues within the environment influence functions and the meaning or uses of such concepts. 


The Shape of Things to Come: GraphQL and the Web of APIs

The inflection point for GraphQL, however, is still a ways off. While there are some major companies using GraphQL, such as Shopify, REST is still the most-used API format in many other companies — including prominent API-based public companies Stripe and Twilio. I asked Jhingran whether he sees those types of companies pivoting to GraphQL over time. He first noted that he doesn’t see GraphQL usurping REST. “We typically find that the REST layer that enterprises have built, [they] have embedded business logic into it. And the GraphQL there, in general, will be a composition layer — as opposed to incorporating deep business logic. And therefore, both will actually co-exist.” However, Jhingran does think that more and more companies will start using GraphQL for their external services, a trend that will happen in stages. “Backend teams are becoming comfortable with GraphQL for the apps that are built by the team,” he said, meaning applications developed internally. Backend developers will take more time to get comfortable using GraphQL APIs from third-party companies, although Jhingran pointed to GitHub and Shopify’s GraphQL APIs as early examples.
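The "composition layer" idea Jhingran describes can be sketched as a resolver that aggregates several existing REST calls into a single response shaped for the client. The REST endpoints below are in-memory stubs and the field names are invented; a real implementation would typically use a GraphQL server library such as graphene or Ariadne in front of the actual services.

```python
# Conceptual sketch of GraphQL as a composition layer over existing REST
# services. The REST endpoints are stubbed in-memory for illustration.

def rest_get_user(user_id):
    # Stand-in for GET /users/{id} on an existing REST service.
    return {"id": user_id, "name": "Ada"}

def rest_get_orders(user_id):
    # Stand-in for GET /users/{id}/orders on the same service.
    return [{"order_id": 1, "total": 42.0}]

def resolve_user_with_orders(user_id):
    """GraphQL-style resolver: compose two REST calls into one response,
    returning only the fields the client asked for."""
    user = rest_get_user(user_id)
    return {
        "name": user["name"],
        "orders": [{"total": o["total"]} for o in rest_get_orders(user_id)],
    }

print(resolve_user_with_orders(7))
```

Note how the business logic stays in the REST layer; the resolver only selects and combines, which is the co-existence pattern described in the quote.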


Stanford cryptography researchers are building Espresso, a privacy-focused blockchain

Espresso Systems, the company behind the blockchain project, is led by Fisch, chief operating officer Charles Lu and chief scientist Benedikt Bünz, collaborators at Stanford who have each worked on other high-profile web3 projects, including the anonymity-focused Monero blockchain and BitTorrent co-founder Bram Cohen’s Chia. They’ve teamed up with chief strategy officer Jill Gunter, a former crypto investor at Slow Ventures who is the fourth Espresso Systems co-founder, to take their blockchain and associated products to market. To achieve greater throughput, Espresso uses ZK-Rollups, a solution based on zero-knowledge proofs that allow transactions to be processed off-chain. ZK-Rollups consolidate multiple transactions into a single, easily verifiable proof, thus reducing the bandwidth and computational load on the consensus protocol. The method has already gained popularity on the Ethereum blockchain through scaling solution providers like StarkWare and zkSync, according to Fisch. At the core of Espresso’s strategy, though, is a focus on privacy and decentralization. 
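The "consolidate multiple transactions" idea behind rollups can be illustrated with a Merkle tree: a batch of any size collapses to one fixed-size commitment that the base chain can reference. This sketch deliberately omits the zero-knowledge part — a real ZK-Rollup posts a succinct validity proof of the batched state transition, not just a root — so treat it only as a picture of how batching reduces on-chain load.

```python
import hashlib

# Conceptual sketch: a batch of transactions collapses to a single 32-byte
# Merkle root. Real ZK-Rollups additionally post a zero-knowledge validity
# proof; this only shows the batching/commitment idea.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(transactions):
    level = [h(tx.encode()) for tx in transactions]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

batch = ["alice->bob:5", "bob->carol:2", "carol->dan:1"]
commitment = merkle_root(batch)
print(len(commitment))   # 32 bytes, regardless of batch size
```

Because the commitment is constant-size, the consensus protocol verifies one value (plus, in the real system, one proof) instead of every transaction, which is where the bandwidth and computational savings come from.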


5 Tips on Managing a Remote-first Development Team

Working from home makes it harder to remain connected with team members and stakeholders as it reduces not only the frequency of our communication but also its quality. We tend to rely more on email and instant messaging, both purely written media (if you exclude the occasional GIF). How can we, as managers and leaders, ensure that we are fostering healthy and effective communication in our teams and with our stakeholders? Meet 1:1 often and effectively. Unfortunately, one of the first casualties of working remotely tends to be 1:1 meetings. Most managers, particularly early career ones, have difficulties leading good 1:1s. Engineers seem to be particularly averse to bad meetings, seeing them as distractions or as a waste of time. This also applies to stakeholders, making addressing concerns, conveying important updates or solving project constraints more difficult. The importance of 1:1 meetings cannot be overstated. In its famous Project Oxygen, Google found that managers with higher feedback scores also tended to have more frequent and higher quality 1:1 meetings with their teams.


Blockchain and GDPR (General Data Protection Regulation)

The most obvious method to sidestep the GDPR is simply not to put any individual data on the blockchain relating to any private citizen or resident of the EU. However, this drastically reduces the usefulness of blockchains for any public application, such as health record tracking, social media, reputation reporting systems associated with online sales, and identity systems such as an international passport. The GDPR does not specify if subsequent corrections to the data are acceptable, if the original incorrect data is still present in earlier blocks on the blockchain. ... A further possibility is to ensure that all private data stored on the blockchain is encrypted. In such a situation, the company responsible for data care can provide evidence of the deletion of the data by ensuring that the decryption key is destroyed. Another approach may be to shift the responsibility for protecting the private key to the individual whose data is being stored on the blockchain. 
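The "destroy the decryption key" approach above is often called crypto-shredding: the ciphertext stays immutably on-chain, but once the key is gone the data is effectively erased. The sketch below uses a hand-rolled SHA-256 XOR keystream purely as a teaching device; a production system should use a vetted authenticated cipher (e.g. AES-GCM) from a maintained library, and the record fields are invented.

```python
import hashlib
import secrets

# Illustrative sketch of crypto-shredding: personal data on an immutable
# ledger is stored encrypted, and "deletion" is achieved by destroying the
# key. Toy cipher for illustration only; do not use for real data.

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR with a keyed stream; applying it twice with the same key decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)
record = b"name=Jane Doe;dob=1990-01-01"     # hypothetical personal data
ciphertext = encrypt(key, record)            # this is what goes on-chain

assert encrypt(key, ciphertext) == record    # key holder can still decrypt
key = None  # "erasure": destroy the key; the on-chain ciphertext is now unreadable
```

The open legal question, as the passage notes, is whether destroying the key satisfies the GDPR's erasure requirement when the ciphertext itself remains permanently readable on the chain.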


IT talent: 3 tips to kickstart employee career development

Training typically focuses on hard skills, which is not surprising for a technical role, but it is critical to understand the importance of soft skill training for IT professionals. Soft skills are particularly essential amidst digital transformation efforts, which cannot be done in a silo and require strong communication among many departments. I recently had the opportunity to participate in an eye-opening leadership course on driving strategic growth, which reinforced for me how important it is to take the time to build and strengthen soft skills. This course ran over three months and consisted of lectures, formal learning, and time to practice. Ultimately, we returned to the larger group to share learnings from our practice time. This learning process reminded me how important it is to build learning programs around soft skills. While teaching soft skills can require time and effort, incorporating them into skills training can positively impact your IT employees’ experience and set them on a path of growth and development.


Indian govt kicks off the consultation process to build a fairness assessment framework for AI/ML systems

“We have been studying various aspects of AI/ ML where some standardisation or testing and certification framework could be established. Moreover, we have studied the works of various researchers where biases in various AI/ ML systems deployed by leading corporates and governments are deliberated. Biases in AI/ ML Systems are a real threat, and ensuring fairness in such applications is very important to build public trust in AI/ ML Systems. Accordingly, we have initiated discussions for evolving a framework for fairness certification of such systems,” said Avinash Agarwal, DDG, Telecommunication Engineering Centre. TEC aims to set up standard operating procedures (SOP) to assess the fairness of various AI/ ML systems and create a benchmark. Systems that conform to the specifications will be given a fairness certification, ensuring product credibility and public trust in AI/ ML. “To achieve this, we will follow a consultative process for framing standards, specifications and test schedules. Then, we plan to prepare a draft document based on the various inputs received and release it for public consultations.”


SOARs vs. No-Code Security Automation: The Case for Both

This is not to say that you’re required to ditch your SOAR and replace it with a lightweight security automation platform like Torq. Many businesses that have dedicated cybersecurity teams may opt to continue to use their SOARs as the place where they detect and manage the most complex threats, such as active, targeted attacks by professional threat actors. But for managing more mundane risks — like blocking phishing emails, securing sensitive data or detecting malicious users — lightweight no-code security automation is a more practical solution. It’s much easier to deploy, and it empowers all stakeholders to support security operations, even at organizations that have minimal cybersecurity resources. By extension, no-code security automation is the key to thriving in the face of today’s pervasive threats. When you operate in a world that sees 26,000 DDoS attacks and 4,000 ransomware attacks each day, and where threat actors are constantly probing your systems for an open door, you need more agility and automated remediation than a SOAR alone can deliver.


Closing the data quality gap

Data agility has become a central pillar of building a supple business, one that can move and pivot quickly as new information arises. In the simplest terms, data agility is the distance between the data that informs a decision and the decision itself. This means that poor quality data will lead to poor decision making. Pairing trustworthy contact data with an agile data management programme enables organisations to make their data actionable, allowing for better and faster decisions when pursuing new and existing opportunities. That’s why 94% of business leaders believe having agility in both business and data practices is important in responding to the pandemic. Achieving greater agility requires business leaders to rethink their use of technology and be more open to integrating it into their businesses. Half the human brain is devoted to processing visual images and processes data at 60 bits per second. That might go some way to explain why four out of 10 leaders say they are looking for easy-to-use solutions; in turn this helps enable data and business users alike to visualise, read, write, and argue with data insights.


Digesting Blockchain

The servers are located in different computers around the network, therefore being distributed (peer-to-peer or P2P). Once a transaction is made, the new information is replicated and received by the nodes within the P2P network and added to the corresponding block open at that time. The block will contain the transaction information and each transaction will be assigned a "hash" once it has been validated by the network nodes — the cryptographic hash is like the digital fingerprint of the transaction and is represented by a sequence of characters. A hash is a function that converts one value into another, and the output always has a fixed length. The information about the transaction recorded in the block can include details regarding who, what, how, how much, or when the transaction happened. The information contained in the block cannot be changed, nor can its hash. Accordingly, if the information in a block is changed, the whole sequence of blocks will become invalid.
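The chaining property described above — alter one block and every later block becomes invalid — can be shown in a few lines: each block stores the hash of the previous block, so a validator can recompute the chain and detect tampering. The transaction strings and block layout here are invented for the demonstration.

```python
import hashlib

# Minimal sketch of hash chaining: each block commits to the previous
# block's hash, so altering any transaction invalidates all later blocks.

def block_hash(prev_hash: str, transactions: list) -> str:
    payload = prev_hash + "|" + ";".join(transactions)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(batches):
    chain, prev = [], "0" * 64                 # genesis prev-hash
    for txs in batches:
        blk = {"prev": prev, "txs": list(txs), "hash": block_hash(prev, txs)}
        chain.append(blk)
        prev = blk["hash"]
    return chain

def is_valid(chain):
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or blk["hash"] != block_hash(prev, blk["txs"]):
            return False
        prev = blk["hash"]
    return True

chain = build_chain([["a pays b 5"], ["b pays c 2"], ["c pays d 1"]])
assert is_valid(chain)
chain[0]["txs"][0] = "a pays b 500"   # tamper with an early transaction
assert not is_valid(chain)            # the whole sequence is now invalid
```

Real blockchains add consensus, Merkle trees, and proof-of-work or proof-of-stake on top, but the invalidation argument in the passage reduces to exactly this recomputation check.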



Quote for the day:

"The test we must set for ourselves is not to march alone but to march in such a way that others will wish to join us." -- Hubert Humphrey

Daily Tech Digest - March 07, 2022

Graphcore Supercharges IPU With WoW Processor

Graphcore affirmed that its Bow IPU chip delivers 40% higher performance and 16% better power efficiency for real-world applications when compared to previous versions or models. Furthermore, the British semiconductor firm stated that its Bow Pod flagship products could deliver more than 89 petaFLOPS of AI computing. Additionally, the superscale Bow Pod can scale up to 350 petaFLOPS. Graphcore’s plan to develop an ultra-intelligence AI supercomputer is one of the fascinating announcements in the tech world. Graphcore emphasized how “approximately 100 billion neurons and more than 100 trillion parameters in a biological-neural-network system that delivers a level of computing yet to be matched by any silicon computers.” It also mentioned that it is developing an AI computer that will exceed the parametric capacity of the brain. The computer’s name is ‘Good,’ which is named after the pioneer of computer science, Jack Good. The pivotal achievements of Jack Good during the Second World War are worth reading about. 


5 Best Practices For Code Review

Seek advice or help from fellow developers, as everyone’s contribution is equally important. Experienced developers can identify mistakes within seconds and rectify them, while younger minds often come up with simpler ways to implement a task. So ask your juniors too, as they have the curiosity to learn more; in striving for a better answer they find other approaches, which benefits everyone in two ways: a) they gain deeper knowledge; b) the solution can be more precise. ... A function is a piece of code that performs a single task and can be called whenever required. Avoid repetition of code: if you find yourself repeating the same code for different tasks, extract it into a function to reduce duplication. This practice keeps the codebase maintainable. For example, if you’re building a website, several components will share basic functionality. If a block of code is repeated many times, move that block into a function or module that can be invoked (reused) wherever and whenever required. This also reduces the complexity and length of the codebase.
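The extract-into-a-function advice above can be made concrete with a small example. The handler and field names are invented for illustration; the point is that the shared rule lives in one place and is reused instead of being copy-pasted into each caller.

```python
# Sketch of "extract repeated code into a function": the validation rule is
# defined once and reused by every handler that needs it.

def normalize_email(raw: str) -> str:
    """Single source of truth for email cleanup, called wherever needed."""
    email = raw.strip().lower()
    if "@" not in email:
        raise ValueError(f"invalid email: {raw!r}")
    return email

# Before the refactor, the strip/lower/"@" check would be duplicated here...
def register_user(email: str) -> dict:
    return {"action": "register", "email": normalize_email(email)}

# ...and again here, with the risk of the copies drifting apart over time.
def subscribe_newsletter(email: str) -> dict:
    return {"action": "subscribe", "email": normalize_email(email)}

print(register_user("  Alice@Example.COM "))
```

A reviewer spotting the same three lines in two handlers should ask for exactly this refactor: one fix to the rule then propagates everywhere it is used.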


Rush to cloud computing is outpacing organizations' ability to adapt

Educating the business is a vital piece of an effective strategy. The Harvard Business Review report describes how Chegg, an educational technology and information publisher, has been rearchitecting its cloud approach over the past year to create smaller, more flexible cloud accounts for use by its engineering teams. "We've been in cloud for so long, we've learned a lot of what's working and what isn't working," John Heasman, chief information security officer, is quoted in the study. "We ended up in a position where we needed to take a step back and look at our architecture to align with best practices in cloud infrastructure and improve our processes overall." Heasman and his team concentrated on educating the company's leaders on the ways its cloud strategy will result in new services. "It's not just a case of saying, 'Here's a new account. It's yours,'" Heasman says. "It required a lot of planning to ensure the right level of oversight while still enabling our team to get the full benefit of cloud-native technology."


How Attack Surface Management Preempts Cyberattacks

ASM is a technology that either mines Internet datasets and certificate databases or emulates attackers running reconnaissance techniques. Both approaches aim at performing a comprehensive analysis of your organization's assets uncovered during the discovery process. Both approaches include scanning your domains, sub-domains, IPs, ports, shadow IT, etc., for internet-facing assets before analyzing them to detect vulnerabilities and security gaps. Advanced ASM includes actionable mitigation recommendations for each uncovered security gap, recommendations ranging from cleaning up unused and unnecessary assets to reduce the attack surface to warning individuals that their email address is readily available and might be leveraged for phishing attacks. ASM includes reporting on Open-Source Intelligence (OSINT) that could be used in a social engineering attack or a phishing campaign, such as personal information publicly available on social media or even on material such as videos, webinars, public speeches, and conferences.
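The last step described above — turning discovered assets into actionable mitigation recommendations — can be sketched as a rule engine over an asset inventory. The inventory, fields, and rules below are invented for illustration; real ASM products apply far richer analysis to the assets found during discovery.

```python
# Hedged sketch: map assets uncovered by discovery to mitigation advice.
# Asset schema and rules are hypothetical examples, not a real product's.

RULES = [
    # (predicate over an asset, recommendation)
    (lambda a: a.get("service") == "telnet",
     "Disable telnet; use SSH instead."),
    (lambda a: a.get("unused"),
     "Decommission unused asset to reduce the attack surface."),
    (lambda a: a.get("email_exposed"),
     "Warn owner: address is publicly visible and may be leveraged for phishing."),
]

def recommend(assets):
    """Return (host, advice) pairs for every rule an asset triggers."""
    findings = []
    for asset in assets:
        for predicate, advice in RULES:
            if predicate(asset):
                findings.append((asset["host"], advice))
    return findings

inventory = [
    {"host": "10.0.0.5", "service": "telnet"},
    {"host": "legacy.example.com", "service": "http", "unused": True},
]
for host, advice in recommend(inventory):
    print(f"{host}: {advice}")
```

The same pattern extends naturally to the OSINT reporting the passage mentions: each piece of publicly exposed information becomes another asset attribute that rules can match on.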


How artificial intelligence is influencing the arms race in cybersecurity

There are two main ways A.I. is bolstering cybersecurity. First, A.I. can help automate many tasks that a human analyst would often handle manually. These include automatically detecting unknown workstations, servers, code repositories, and other hardware and software on a network. It can also determine how best to allocate security defenses. These are data-intensive tasks, and A.I. has the potential to sift through terabytes of data much more efficiently and effectively than a human could ever do. Second, A.I. can help detect patterns within large quantities of data that human analysts can’t see. For example, A.I. could detect the key linguistic patterns of hackers posting emerging threats on the dark web and alert analysts. More specifically, A.I.-enabled analytics can help discern the jargon and code words hackers develop to refer to their new tools, techniques, and procedures. One example is using the name Mirai to mean botnet. Hackers developed the term to hide the botnet topic from law enforcement and cyberthreat intelligence professionals.
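The jargon-detection idea above — spotting a term like "Mirai" emerging as a code word — can be sketched as a frequency comparison: flag terms whose rate in recent posts spikes relative to a historical baseline. The corpora and thresholds below are invented toy data; production systems use far more sophisticated NLP than raw token counts.

```python
from collections import Counter

# Toy sketch of detecting emerging code words: flag terms that appear often
# in recent posts but rarely in the historical baseline. Corpora are invented.

def emerging_terms(baseline_posts, recent_posts, ratio=3.0, min_count=2):
    base = Counter(w for p in baseline_posts for w in p.lower().split())
    recent = Counter(w for p in recent_posts for w in p.lower().split())
    flagged = []
    for term, count in recent.items():
        # +1 smoothing avoids division by zero for never-seen terms
        if count >= min_count and count / (base[term] + 1) >= ratio:
            flagged.append(term)
    return sorted(flagged)

baseline = ["selling access", "new exploit for sale"]
recent = ["mirai source drop", "who wants mirai", "mirai config help"]
print(emerging_terms(baseline, recent))
```

An analyst would then review flagged terms in context, which is the human-in-the-loop alerting the passage describes.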


Three reasons why your API security is failing

Many organisations have adopted DevOps practices, realising efficiencies in the development cycle. It’s natural that they would want to remove similar barriers with security. In the recent Salt Labs State of API Security report, 40% of respondents said developers or DevOps teams hold primary responsibility for securing APIs, but 95% of respondents experienced an API security incident in the past year, highlighting that the burden cannot fall solely on developer shoulders. Developers make applications work, but attackers make them perform in unintended ways. It’s difficult for developers to shift into an attacker’s mindset. Despite the methods available to identify potential vulnerabilities, it’s rare that all aspects of code are tested. Furthermore, as it is so difficult to keep up with today’s ultra-fast code, developers typically only test primary apps or specific areas of functionality and most scanning tools depend on best practices and signatures to identify vulnerabilities. Yet, these approaches are ineffective at identifying unique logic vulnerabilities.


Kremlin’s Aggression Divides Digital Ecosystems Along Tech Trenches

Sanctions to restrict international financial transactions and other commerce with Russia have already been put to work. Now that country faces the loss of certain technology services and resources as more tech companies seek to decouple themselves from the aggressor state. As the grim war on the ground rages on, new lines of demarcation emerge in response across the digital world. The future of greater connectivity may look drastically different than expected, says Raj Shah, head of tech, media, and telecoms for North America at digital consulting firm Publicis Sapient. The first globalization was supposed to be a singular, interconnected world, he says, but China emerged as a challenger to the United States and other economically allied nation states. Now Russia’s actions may further fracture the dynamics of the digital landscape. “There does appear to be this fragmentation that’s going to start to happen,” Shah says. ... There may be some interchanges of information in buffer zones, he says, where some technology and commerce from opposing geopolitical spheres can intersect, but there will also be cordoned-off spaces.


Update: Samsung Confirms Source Code Stolen in Breach

The ransomware group released a teaser on its Telegram channel before posting the data saying, “get ready, Samsung data coming today.” Then the gang posted confidential Samsung source code data in a compressed file, available via torrent and split into three parts, which together include almost 190GB of data. Lapsus$ published a description of the leak, which it says includes: source code for every Trusted Applet installed on all Samsung devices' TrustZone, with specific code for every type of TEE OS (QSEE, TEEGris, etc). Trusted Applets are used for sensitive operations such as full access control and encryption. The group says it also includes DRM modules and Keymaster/Gatekeeper. Algorithms for all biometric unlock operations include: “Source code that communicates directly with sensor (down to the lowest level), we're talking individual RX/TX bit streams here and boot loader source code for all recent Samsung devices, including Knox data and code for authentication," the gang says.


Three unusual questions to make job candidates think

A job interview is intended to be a kind of crystal ball, one that gives the employer a sense of what somebody is going to be like not only on the first day but also after six months, when the honeymoon period is over and they’re facing a few challenges. I can’t think of a better question for ascertaining how an employee will behave over time. Of course, the thought of asking such a personal question may understandably make interviewers uncomfortable (although I’ve been assured by many HR professionals that the question is fair game). If so, simply focus on the positive by asking, “What qualities do you like the most in your parents?” Here’s the question that Clara Lippert Glenn, then CEO of the energy-industry training company Oxford Princeton Programme, told me she poses to candidates: “If you woke up tomorrow morning, and there were no humans left on the earth—just animals—what kind of animal are you?” I’ve used this question many times as an icebreaker with small groups, adding a second beat: asking people to explain why they chose that particular animal.


Agile transformation: 3 ways to achieve success

Too many organizations leave much of their knowledge untapped due to outdated, bottlenecked processes. In fact, 80 percent of undocumented human knowledge goes untapped. By mining every organizational knowledge resource, employees can work together beyond the scope of their own team, personal network, and organizational silos, eliminating remote worker disconnects, data gaps, and inaccurate or outdated information. Moreover, employees become empowered to build on their knowledge set, increase productivity, enjoy a greater stake in their work, and reach their potential without organizational hierarchies, geography, cultures, or languages impeding their work. The result is higher cultural inclusivity and increased organizational agility. Investing in tools, procedures, and cultural shifts that close the knowledge gaps, break down silos, and foster greater collaboration provides a return on investment in agility, retention, and productivity. Additionally, higher concentrations of knowledge collaboration unleash enormous amounts of tacit knowledge that would otherwise remain hidden and unutilized.



Quote for the day:

"Take time to deliberate; but when the time for action arrives, stop thinking and go in." -- Andrew Jackson

Daily Tech Digest - March 06, 2022

Decentralized identity using blockchain

Consider an online shopping scenario in which the required data transits from the wallet associated with the decentralized identity. The wallet in this scenario contains the user's verified identity, address, and financial data. The user logs in to the website by submitting the required information from the identity wallet and is authenticated without sharing the actual data. The same applies to the checkout process: a user can place an order with the address and payment source already verified in their identity wallet. Consequently, a user can go through a smooth and secure online shopping experience without sharing an address or financial data with the ecommerce website owner. ... Blockchain technology uses a consensus approach to prove data authenticity through various nodes and acts as the source of trust for verifying user identity. Along with the data, each block also contains a hash that changes if someone tampers with the data. These blocks form a highly encrypted list of transactions or entries shared across all the nodes distributed throughout the network.
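The tamper-evidence property described above, where each block carries a hash and any later alteration breaks the chain, can be illustrated with a toy sketch. This is plain SHA-256 hash chaining, not a real blockchain or consensus protocol, and all names here are illustrative:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's data together with the previous block's hash,
    # so changing any earlier entry invalidates every later hash.
    payload = json.dumps({"data": block["data"], "prev": block["prev"]},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(chain: list, data: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64  # genesis marker
    block = {"data": data, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != expected_prev or block["hash"] != block_hash(block):
            return False
    return True

chain = []
append(chain, "identity: verified address")
append(chain, "identity: verified payment source")
print(verify(chain))           # True
chain[0]["data"] = "tampered"  # alter an earlier entry...
print(verify(chain))           # False: the stored hash no longer matches
```

A real network adds consensus across many nodes on top of this basic hash linkage, so no single participant can rewrite the chain unnoticed.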


Breach Notification: Poor Transparency Complicates Response

Unfortunately, data breach experts continue to see increasing transparency shortfalls, both from organizations that fall victim and from regulators. In 2020, for example, 209 consumer breach notifications lacked important details, while in 2021, 607 breaches lacked such details. So says the Identity Theft Resource Center, a nonprofit organization based in San Diego, California, that provides no-cost assistance to U.S. identity theft victims to help resolve their cases (see: Data Breach Trends: Global Count of Known Victims Increases). "The lack of actionable information in breach notices prevents consumers from effectively judging the risks they face of identity misuse and taking the appropriate actions to protect themselves," ITRC says in its latest Annual Data Breach Report, looking at 2021 trends. "A decrease in timely notices posted by states, including one state that updated breach notices in December 2021 for the first time since the fall of 2020, also prevents consumers from taking action to protect themselves and organizations that assist identity crime victims from offering timely, effective advice."


Entrepreneurship for Engineers: How to Build a Community

You probably already know that understanding and being able to articulate your product’s value proposition is critical to successful sales and marketing, but your community needs to add value, too, above and beyond the value that the product/project provides. “No one wakes up in the morning and thinks, ‘I’m going to go and answer questions on the internet,’” Bacon said. People need to get something out of participating in the community that they can’t get anywhere else. “People love the community aspects,” said Ketan Umare, co-founder and CEO of Union.ai, the company behind Flyte, an open source workflow automation platform for data and machine learning processes, describing his experience building a community with a value proposition above and beyond the project’s value. “We guarantee you that in the community, there is somebody to listen to your problems,” Umare said. “It creates this feeling that you are not alone.”


What SREs Can Learn From Capt. Sully: When To Follow Playbooks

What’s interesting about Sully’s story is that he didn’t do exactly what pilots (or engineers) are trained to do. He didn’t stick completely to the playbook a pilot is supposed to follow during engine failure, which stipulates that the plane should land at the nearest airport. Instead, he made the decision to crash-land in the Hudson River. The fact that Sully did this without any loss of human life turned him into a hero. In fact, Sully the movie almost villainizes the National Transportation Safety Board (NTSB) for what the film presents as an unfair investigation of Sully for not sticking to the playbook. Yet, as the podcasters noted, the difference between heroism and villainy for Sully may just have boiled down to luck. They pointed out that in similar incidents in which staff deviated from playbooks, like the Costa Concordia sinking in 2012, they ended up facing stiff penalties. In the Costa Concordia case, the captain of the ship was jailed, despite the fact that his decision not to stick rigidly to the playbook most likely reduced the total loss of human life.


The truth about VDI and cloud computing

Performance is the core problem. Not all home-based Internet connections support high speeds and low latency. Indeed, even if you pay for the faster stuff, a few days of detailed monitoring will show that latency and speed are pretty bursty overall. VDI, depending on what you’re leveraging, does keep data and applications centrally located and thus hopefully secure. But both application images and data must be constantly transmitted to the employees’ devices, and interactions transmitted back to the virtual servers; VDI sessions are very chatty. This is unlike applications that run locally with locally stored data, where the response is nearly instantaneous. Most of us are used to that kind of performance. Latency, even if it’s not noticeable to most remote workers, can add up to productivity losses that run into many millions of dollars a year. Many of the savvier remote workers have worked around the performance issues by moving some of the data to local storage on their devices (such as with email), thus creating a potential security problem if the device is hacked or stolen.


Ukraine: How to protect yourself against cyberattacks

Experts say they are currently more concerned with institutional rather than personal cyber hacks. But attacks on individual accounts owned by private citizens, who work for institutions that handle sensitive information, are still a risk. "People who are not wary are often the weakest link and the foot in the door for cybercriminals looking to stage a larger attack on critical infrastructure," Rachel Schutte, an IT and cybersecurity manager based in Germany, told DW. This was the case for European government personnel involved in assisting refugees fleeing Ukraine. They received phishing emails — or messages aimed at collecting sensitive information — from a Ukrainian armed service member’s compromised account, she said. In response to increased instances of cyberattacks aimed at employees of high-profile organizations, Deutsche Welle has also asked employees to ramp up security on personal social media accounts. ... Cloud-based services distribute distinct functions across data centers in multiple locations, fueling a race towards interconnected networks. 


Finance firms scrape alternative data from unexpected sources

In light of the "Great Resignation" and unprecedented job mobility in part sparked by the pandemic, such data about job happiness is "top of mind for investors today," Lopata said. Another timely use for alternative data is tracking how inflation in the U.S. is disrupting markets. Thinknum is following used car sales on CarMax and Carvana, two of the big auto sales apps. "We're tracking all that data in real time down to the VIN, so that allows you to understand whether prices are peaking," Lopata said. "Beyond just tracking the peaks … we're tracking when the peak ends." "We're able to identify that in January '22, we finally started to see some decrease in pricing," she added. Other current market trends for which Thinknum is digging up alternative data include changes in the food delivery services business and cryptocurrency price fluctuations, where the vendor has discovered that GitHub, the provider of internet hosting for software development, is a prime source of data.


US Officials Push Collaboration, AML Controls for Crypto

According to Conklin, the Treasury Department has for a decade targeted the assets of Russian elites - dating back to Russia's annexation of Crimea in 2014. "So we do know a little bit about how this regime likes to evade sanctions and move money, and we have a significant toolkit at our disposal now to tackle that," he said. "The regime does like to layer its assets and move money. They have a long and extensive playbook to launder money, and at the center of their playbook is their web of international corporate registration and the use of foreign companies and foreign persons. They're also really adept at conversion to other assets, including gold and foreign currencies." And so, asked whether crypto will be a part of its workaround, Conklin said: "Certainly, there's going to be an element. That's part of the playbook, but it frankly isn't at the top of their list." He also referenced Treasury's sanctioning of the Russian crypto exchange Suex in September 2021 as an example of "how sanctions can work in the crypto ecosystem."


Gartner: Public sector must target disjointed IT strategy

Mickoleit recommended that public sector IT chiefs “zoom out” to enable them to look at how technology investments can be aligned with policy objectives. As an example of joining up IT with policy, he said it is impossible to provide high-quality public sector services without the concept of digital identity, which needs to link across different tech infrastructure and public sector bodies. Another aspect of the pandemic was that having “good enough” processes is not sufficient, said Mickoleit. “Just working isn’t enough. There were huge scaling issues, families and businesses in need.” He warned that such a situation is not sustainable when there is a disruption. “There is a need for efficiencies in government,” he added. This means IT leaders need to focus on reducing the number of process steps to support case work and deliver a service to a citizen, said Mickoleit. “There is an ideal opportunity to combine AI and automation for better support,” he pointed out.


How to Become a Data Governance Lead

A significant problem facing businesses implementing a Data Governance program is the realization that raw data is often not analysis-ready. The data may be badly organized, unstructured, or stored in separate databases. The data has to be cleaned and standardized before the Data Governance program can move forward. Developing a Data Governance program might require a fair amount of manual labor, but after the data has been standardized, incoming data can be sent automatically to the appropriate location, and in the correct format. Data silos are a slightly different problem for Data Governance programs. Data can be stored in silos and treated as though certain teams or individuals own it — and they sometimes don’t like to share. Additionally, different departments may use entirely different systems, making standardization especially difficult. These same departments may have no real understanding of their data’s value. Data Governance will support a framework allowing access to their data, breaking down the silos.



Quote for the day:

"A leader's dynamic does not come from special powers. It comes from a strong belief in a purpose and a willingness to express that conviction." -- Kouzes & Posner

Daily Tech Digest - March 05, 2022

UK data watchdog urges vigilance amid heightened cyber threat

Asked about the potential for a Russia-Ukraine cyber conflict spreading to the UK, Edwards said: “We have picked up on that heightened threat environment and we think it’s really important to take the opportunity to remind businesses of the importance of security over the data that they hold. This is a different era from blacking out the windows and keeping the lights off. The threats are going to come in through your inbox.” Edwards said that outside the Ukraine conflict and the warnings it had brought of a heightened security threat, the Information Commissioner’s Office had seen a “steady and significant” increase in cyber-attacks against UK businesses over the past two years. Between July and December last year the ICO recorded 1,345 “cybersecurity incidents”, including ransomware attacks, where assailants demand payment in cryptocurrency to decrypt a target’s computers, and phishing attacks, where the victim is tricked, often via email, into downloading malware or handing over their login details. 


10 enterprise AI trends for 2022

After companies are successful at initial proofs of concept, they often build AI centers of excellence to operationalize the technology and build talent, expertise, and best practices. But once a company reaches a level of critical mass, then it makes sense to break up some of these centers of excellence and federate AI, moving experts directly into the business units where they are needed most. “For those companies that are less mature, there is value in having a center of excellence that is housing talent and learning across the institution,” says McKinsey’s Singla. “Without that, companies usually don’t have the ability to scale. Talented people want to be with other like-minded people. And less experienced people benefit from being in a center of excellence because they can grow or learn.” Distributing them too early would dilute their impact and reduce a company’s ability to iterate and duplicate successful projects across multiple business lines. “But as you get to a layer of maturity and scale, longer-term, the benefit of technologists having both a deep AI expertise and domain expertise is a real home run,” he says.

We Aren't Sure If (Or When) Artificial Intelligence Will Surpass the Human Mind

If and when AI reaches the point where it can continually improve itself, the fate of our species could depend on the actions of this superintelligent machine, warns Nick Bostrom, a University of Oxford philosopher, in his book Superintelligence: Paths, Dangers, Strategies. Yet that fate might not necessarily be a dismal one. The experts also point out that superintelligent AI could offer a solution to many of our problems. If we can’t figure out how to tackle climate change, eradicate poverty and ensure world peace, perhaps AI can. “This remarkable technology has the potential to help everybody live healthy, wealthy lives so humanity can flourish like never before,” says Tegmark, who is also the founder of the Future of Life Institute, an organization that aims to ensure these positive outcomes. Yet, he adds, it “might wipe out humanity if its goals aren’t aligned with ours.” Or as Bostrom put it in Superintelligence, when it comes to confronting an intelligence explosion, “We humans are like small children playing with a bomb.”


Researchers Devise Attack for Stealing Data During Homomorphic Encryption

The attack technique that the researchers at NC State developed involves a vulnerability in a Microsoft implementation of fully homomorphic encryption called Microsoft Simple Encrypted Arithmetic Library (SEAL). Microsoft SEAL is a collection of encryption libraries for performing computing operations on encrypted data. The vulnerability, which the researchers have described as a "power-based side-channel leakage" is present in the SEAL homomorphic encryption library through version 3.6 of the technology, according to the researchers. It enables attackers to use a single power measurement from the device doing the encryption operations to extract data in plaintext while the data is being homomorphically encrypted. The vulnerability allows attackers to listen to the machine doing the encryption and infer if a 0 bit is being processed or a 1 bit, Aysu says. "It's a few lines in the software code that give out the data being executed on the device," he says. "This information allows us to use some fancy equations and figure out the secret messages being encrypted in a homomorphic encryption scheme."


How Kubernetes and Database Operators Drive the Data Revolution

Big, centralized application backends just could not provide the flexibility required to scale from thousands to millions of requests per second. Most of us can probably remember at least a couple of cases when “monolithic” web applications experienced severe performance issues after going viral. The solution to this problem came from adopting an approach in which organizations split these monoliths into smaller “micro” services running in Docker containers that can be horizontally scaled independently of each other and much more quickly than monoliths. With each microservice adding to the demand on development operations, however, this strategy wouldn’t have been so successful without container orchestration frameworks like Kubernetes. Introduced publicly in 2014, Kubernetes, which grew out of Google's internal Borg system, quickly proved itself a top choice for automating deployment workflows and today is one of the industry standards for modern development operations. And as an open source, cloud-native project, Kubernetes continues to evolve and improve.


The best is yet to come: What’s next for blockchain and the creator economy

The rise in creative energy has inspired the developer community as well. New niche streaming platforms have grown up, helped by the emergence of low-cost decentralized infrastructure that allows application builders to encode video, store data and handle identity without having to pay expensive centralized cloud providers for such services. These centralized providers will increasingly find themselves on the defensive. Two attention-grabbing incidents in 2021 are illustrative: Hackers attacked Twitch and released private information about its code and its users to the world. And, Facebook suffered colossal reputational damage from a lengthy outage and whistleblower claims that its management has repeatedly chosen to prioritize profit over safety. Big Tech’s woes and pandemic-related restrictions have sped up fundamental changes already underway in how the world produces, consumes and uses video content — changes likely to propel growth in the creator economy well into the future. 


Harvard census identifies most commonly used open source packages

The census is broken down into eight ranked lists. Four include version numbers and four are version agnostic. Packages that use the default JavaScript npm package manager have been split out from non-npm packages. There are also separate lists for packages that are directly called by developers versus those that are indirectly called as dependencies, bringing attention to the kinds of deeper dependencies that are more difficult for developers to observe within their environments. These lists “represent our best estimate of which FOSS packages are the most widely used by different applications, given the limits of time and the broad, but not exhaustive, data we have aggregated,” the report notes. While the census does not attempt to identify the riskiest OSS projects, it does note that “measuring risk profiles is a separable task, and it’s easier to do it once the most widely used software is identified.” That work will require cross-industry effort and will depend on the individual risk profile of the consuming organization.


Multi-Chain Evolution: Why Blockchain’s Future is Interoperable

One project with interoperability at its heart is Spherium. Not only has the startup developed a cross-chain bridge, but its incubation program HyperLaunch facilitates the seamless entry of innovative blockchain projects into the cross-chain ecosystem. Spherium also has its own cross-chain DEX, which supports trading of tokens between EVM and non-EVM compatible chains. To date, Spherium has partnered with leading NFT, DeFi, P2E, and general blockchain projects. The alliance seeks to “expand the multi-chain experience for creators and buyers to bridge major blockchains and add certain defi functionalities into their platform.” Through HyperLaunch, Spherium offers projects the opportunity to integrate a bridge solution into their core functionality and enable the deployment of their native tokens on different networks. With the audited Spherium bridge, token swaps can currently take less than a minute, and the bridge is free to use for projects in the HyperLaunch program. Geared towards defi, NFT and meta-world platforms, the incubation program offers more than just bridge access; staking and dual-farming solutions are also provided, along with extensive technical support.



How AI and Analytics Help Organizations Deliver a Better Customer Experience

Technology that allows companies to analyze and deliver data where it’s needed and at the right time is crucial to producing a better customer experience. To achieve this, companies need a data architecture that provides a highly granular view of their customers. The architecture must be flexible and adaptable, so new data can be incorporated and changes can be made without causing major technological upheavals. The problems we face chasing omniscience about customers are constantly evolving. The tools and solutions evolve to keep pace. The data architecture we implement must handle these changes in a forgiving manner. Additionally, the architecture must make data available to developers, analysts, and other staff to use whenever they need it in a way that protects the integrity and security of the individual data event. A data fabric brings together data of all forms — from the edge to the cloud — and provides services for discovery, governance, quality, and transformation. This architecture truly puts data to use at scale to improve customer experiences.



Quote for the day:

"Don't let a bad day make you feel like you have a bad life." -- Joubert Botha

Daily Tech Digest - March 04, 2022

How the C-Suite can set the organisation up for metaverse success

In order to reap the benefits that the metaverse offers, the most important thing leaders need to do currently is to build capability and thinking within their organisation. We are currently in the exploratory phase of the metaverse, so there needs to be encouragement for employees to develop a greater awareness and adopt a more inquisitive mindset. Hiring new talent is going to be imperative to acquire new skills and expand that knowledge across teams to enable an incubator mentality. The metaverse will allow employees to solve old problems in new ways, but leaders will need to give creatives the space and independence to trial different ideas in order to find the one that best suits the business strategy and generates value. To strike this balance, I would recommend creating a separate team outside the IT department, yet within the matrix of the business. This team doesn’t have to be in-house – it could be outsourced. But it must be plugged into your marketing, sales and product development teams so that it has an understanding of the evolving business strategy.


All You Need to Know About User Session Security

Session security is an important consideration in the design of any system that requires communication between a server and a client. Improper security can leave user accounts vulnerable to unauthorized access. OWASP (the Open Web Application Security Project, a leading authority on application security) considers the improper implementation of authorization/authentication the second biggest risk to application security. Several notable hacks illustrate this point: The Docker Hub database hack earlier this year resulted in stolen GitHub access tokens; GitLab had a vulnerability where all its users' auth tokens were exposed in URLs, had no expiry time, and were susceptible to brute-force attacks due to their short length; a software bug made it possible to steal access tokens, affecting 90 million Facebook accounts. It is tricky, time-consuming, and expensive to correctly implement user session management. According to an a16z operating partner (a top-tier VC) and former Box CSO, "authentication and authorization is the number one spending cost for organizations when it comes to their security budget."


Edgeless Systems Brings Confidential Computing to Kubernetes

Edgeless argues that by using Constellation to leverage confidential computing you can isolate and runtime-encrypt entire Kubernetes deployments. This means, the company states, Constellation enables you to use the public cloud like your private cloud. “Confidential Computing will usher in a new era of securing data in the cloud. With our unique expertise, we are making this new technology accessible to enterprises at scale,” said Felix Schuster, a confidential computing pioneer and Edgeless Systems co-founder and CEO. “Processing data that is always encrypted — not only at rest and in transit but also while in use — is a difficult task.” But Constellation makes it happen. This addresses the old, familiar worry: is your data actually safe on a third-party public cloud? Edgeless claims that it now is, since your data remains securely encrypted even within the cloud’s RAM. As you’d guess, securing and using encrypted data isn’t easy. Constellation addresses this by taking care of such complexities as the verification, or attestation, of confidential virtual machines (VMs) and workloads, secure connections, key management, and data encryption.

 

Digital transformation: 3 IT culture traps that can hold you back

To accelerate your organization’s digital transformation, first ensure that your business and IT teams have open conversations about business issues and how IT can solve them. If an IT team feels that it’s being dumped on rather than treated as a strategic partner, an “us vs. them” culture can easily develop. The inverse is also true: IT teams cannot simply drop new technology solutions on unsuspecting business teams, believing they’re solving issues that the latter may not even realize exist. Teams should also have clearly defined roles, and they need to align on a common language. For instance, terms such as “production” and “test” mean different things for each team. When both teams try to understand each other, they can find synergy – which is the first step toward success (and creating robust lines of communication should be priority one). Achieving synergy shows that digital transformation is not merely a process of bringing in new technology to create change; it’s about redefining how the organization operates and communicates.


Decarbonising datacentres: Turning the hot air about heat reuse into real-life use cases

As with any new technology, there are a number of challenges that need to be overcome first to take full advantage of the opportunity and promise of heat reuse. One of them is the proximity to the heat demand. For example, the closer a datacentre is to homes or businesses in need of heat, the better. This could be achieved with government support and policy-making in the UK, as many European Union member countries have already done. Where datacentres cannot be situated near existing heat demand, applications such as greenhouses or sustainable farms could be purposely located there to make use of this low-carbon, low-cost energy, which is a path Norwegian colocation firm Green Mountain is pursuing. The captured heat also has to have a price in order to secure the return on investment, and to let the competition drive the change. There are a number of facilities around the world already demonstrating the feasibility of installation, environmental benefits and enormous savings resulting from recovering waste heat.


Can we and should we have fully open APIs?

To consider the limitations of open APIs, it is essential to also take into account the issues with APIs themselves. According to Salt Edge’s survey of over 2,000 API integrations with banks from 31 European countries, 38% of bank APIs don’t meet EU or UK regulatory standards. It also revealed that 43% of banks did not support automated registration for access to the relevant APIs, 22% had faulty documentation, and 28% had downtime during integration. A closed API is not openly accessible and typically resides in highly secure settings. With open APIs, by contrast, a recent report from Transparency Market Research showed that, in terms of security, the uncertainty and vulnerability of third-party apps is one of the major restraining factors affecting the market. According to experts, a good open API requires basic considerations such as the choice of data format (JSON, XML, text, YAML, etc.), the protocol (HTTP, HTTPS), and the version of the API. It is also important to consider security, be it API authentication or HTTPS.
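The basic design considerations listed above (data format, protocol, and versioning) can be sketched in a few lines. The base URL, version scheme, and resource name below are made up purely for illustration:

```python
import json

# Hypothetical API conventions: HTTPS-only base URL and a version
# pinned in the path, so clients aren't broken by incompatible changes.
API_BASE = "https://api.example.com"
API_VERSION = "v2"

def endpoint(resource: str) -> str:
    """Build a versioned endpoint URL for the given resource."""
    return f"{API_BASE}/{API_VERSION}/{resource}"

# JSON as the chosen data format: serialize a response body, read it back.
body = json.dumps({"id": 42, "name": "example"})
parsed = json.loads(body)

print(endpoint("users"))  # https://api.example.com/v2/users
print(parsed["id"])       # 42
```

Pinning the version in the URL is only one common convention; headers or query parameters are alternatives, and the right choice depends on the API's compatibility guarantees.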


How to Navigate the Complexity of SaaS Management

SaaS offers tremendous value to organizations, but businesses need an easier path to reining in SaaS complexity. What’s the way forward? The answer is a comprehensive approach to SaaS management that solves IT, security, risk and finance teams’ challenges by giving them a single source of truth for every SaaS application. A modern approach to SaaS management enables stakeholders to: discover both known and unknown SaaS applications, providing complete and actionable visibility into all data types and interconnectivity flows; uncover and mitigate the various security risks that put sensitive customer and business data at risk; deliver the insights on user access and app utilization needed for better IT management and cost optimization across all SaaS apps; and streamline SaaS compliance reporting. A comprehensive SaaS management solution that addresses risk management and business value for SaaS in one place, for all stakeholders, is imperative for business success.


Using Emergence and Scaffolding to Address Complexity in Product Development

Some would argue that known knowns should be simple to deal with and easily elaborated, but even here caution is needed. I recall working on the development of a new card settlement system where we needed to handle blacklisted cards. The assumption was that a card would either be blacklisted or not, but we were advised that the current system could return ‘yes’, ‘no’, or ‘maybe’, and no one could explain the latter. We had made the mistake of assuming this was clear and obvious, when it was really a complicated issue, and resolving it was both time-consuming and costly. We have a large body of experience addressing the second type of need, known unknowns, and you could argue that agile practices accommodate the articulation of these needs; related practices such as innovation games help here too. This is broadly the case, and iterative development is helpful because it allows us to articulate these elements and incorporate them.
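The blacklist anecdote above boils down to modelling a three-valued answer instead of assuming a boolean. A small sketch, with hypothetical names, of how making all three states explicit would have surfaced the surprise early:

```python
from enum import Enum

class BlacklistStatus(Enum):
    YES = "yes"
    NO = "no"
    MAYBE = "maybe"   # the undocumented third state the legacy system returned

def decide(status: BlacklistStatus) -> str:
    # An explicit branch per state forces the team to confront "maybe"
    # instead of silently coercing the response to a boolean.
    if status is BlacklistStatus.YES:
        return "decline"
    if status is BlacklistStatus.NO:
        return "approve"
    return "refer for manual review"

print(decide(BlacklistStatus.MAYBE))   # refer for manual review
```

Had the original assumption been encoded this way, the ‘maybe’ case would have failed loudly at design time rather than emerging as a costly surprise during integration.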


Are Blockchains Databases?

We’re going to see database and blockchain technologies continue to grow towards one another. On the blockchain side, the reasons are obvious: database feature sets and the operational and cost expectations that businesses hold for these use cases have emerged over many decades for good reason, and those reasons aren’t going away. Commercial blockchain solutions will have to meet those business expectations or die trying. The pressure on databases to adopt blockchain capabilities is a little more subtle. A bit of this is healthy competition — what customer doesn’t want to be able to time travel through older versions of data, sleep better knowing their data is tamperproof, or eliminate the overhead of application logs by integrating lineage directly into the data model itself? But beyond that, database users are also operating in increasingly regulated environments: GDPR and CCPA/Prop 24 have made it necessary to control PII not just within a company, but across companies. Financial and other regulations require tracking anything that involves money or credit more carefully than ever before, including being able to audit what was changed, when, and by whom.
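The tamper-evidence property described above is, at its core, a hash-chained append-only log — the central idea blockchains bring to databases. A minimal sketch (names and structure are illustrative, not any product's API):

```python
import hashlib
import json

def append(log: list, record: dict) -> None:
    """Append a record, chaining it to the previous entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    log.append({"record": record, "prev": prev,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(log: list) -> bool:
    """Recompute every hash; any edit to history breaks the chain."""
    prev = "0" * 64
    for entry in log:
        body = json.dumps({"record": entry["record"], "prev": prev},
                          sort_keys=True)
        if entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"op": "credit", "amount": 100})
append(ledger, {"op": "debit", "amount": 40})
print(verify(ledger))                    # True: chain intact
ledger[0]["record"]["amount"] = 999      # tamper with history
print(verify(ledger))                    # False: tampering detected
```

Because each entry's hash covers the previous entry's hash, rewriting any historical record invalidates every entry after it — which is exactly the audit guarantee ("what was changed, when") regulators are now demanding of conventional databases.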


Regulator Announces Border Gateway Protocol Security Review

So it's welcome news that, after years of warnings from security experts that BGP remains poorly secured and poses a risk, the U.S. government is finally taking a closer look. The Federal Communications Commission on Monday announced that it has begun a security review of BGP and will soon open a 30-day period for comment. The timing of the FCC's call for comments is auspicious, not least from a national security perspective, given that Russia reportedly used BGP hijacking against Ukraine last week. "There's nothing like a war to get them listening," Woodward says. Ukraine's computer emergency response team, CERT-UA, reported seeing BGP hijacking attacks less than 24 hours before Russian troops invaded on Feb. 24, at the same time Ukrainian government and banking systems were being hit by distributed denial-of-service attacks. "Around the same time of the DDoS attacks … CERT-UA asserted that there was a BGP hijacking attack against a Ukrainian bank," Cisco Talos says. 



Quote for the day:

"Effective team leaders realize they neither know all the answers, nor can they succeed without the other members of the team." -- Katzenbach & Smith