Daily Tech Digest - February 12, 2022

Speeding Time to Value: The ‘Just in Time’ Data Analytics Stack

JIT analytics signals an end to the era of data consolidation, in which data management was based on organizations moving data into warehouses, data lakes and data lakehouses. By leveraging a variety of approaches, including data materialization, query tool abstractions and data virtualization, this new analytics era eliminates the need to move data into one place for analysis. JIT represents a substantial evolution in fundamental data strategy that not only results in better, timelier analytics but also protects the enterprise from numerous shortfalls. Costs (related to data pipelines, manual code generation, etc.) are reduced, conserving resources. Moreover, organizations shed a considerable amount of regulatory and data-privacy risk by not constantly copying data from one setting to another, a practice that can expose them to costly noncompliance penalties. ... Best of all, the business logic — schema, definitions, end-user understanding of data’s relation to business objectives — is no longer locked in silos but is carried into the data layer for greater transparency and usefulness.


Why DeFi could signal the beginning of the end of traditional Fintech

This is to be expected with current processes that rely on analogue or legacy methods of ID verification. This is where the interoperability of technologies like blockchain comes in: providing unimpeachable data to prove identity and provenance, and in doing so both removing the need for slow processes and accelerating the rate of transactions. Interoperability means identities are portable across ecosystems. For instance, in gaming, progress and identities can move across different gaming dApps. In finance, that would mean a single, immutable identity used for everything, with less requirement to prove a customer is who they say they are, as their interoperable identity is trusted by all providers. Lending, payments, proof of identity, trading: all will become decentralised, and much faster. Then there are NFTs, which transform ownership of assets with irrefutable proof while ushering in a new asset class: digital assets. For example, we could well see the rise of tokenised invoices. Invoice beneficiaries could sell part or all of an invoice on an NFT marketplace within seconds.


Cybercriminals Swarm Windows Utility Regsvr32 to Spread Malware

“Threat actors can use Regsvr32 for loading COM scriptlets to execute DLLs,” they explained in a Wednesday write-up. “This method does not make changes to the Registry as the COM object is not actually registered, but [rather] is executed. This technique [allows] threat actors to bypass application whitelisting during the execution phase of the attack kill chain.” ... As a class, .OCX files contain ActiveX controls, which are code blocks that Microsoft developed to enable applications to perform specific functions, such as displaying a calendar. “The Uptycs Threat Research team has observed more than 500 malware samples using Regsvr32.exe to register [malicious] .OCX files,” researchers warned. “During our analysis of these malware samples, we have identified that some of the malware samples belonged to Qbot and Lokibot attempting to execute .OCX files…97 percent of these samples belonged to malicious Microsoft Office documents such as Excel spreadsheet files.” Most of the Microsoft Excel files observed in the attacks carry the .XLSM or .XLSB suffixes, they added, which are types that contain embedded macros.


Sardine’s algorithm helps crypto and fintech companies detect fraud

Sardine, a compliance platform for fintech companies, primarily serves neobanks, NFT marketplaces, crypto exchanges and crypto on-ramps. Its 50+ customer roster includes Brex, FTX, Luno and Moonpay, the company says. “The use case for all our customers is essentially that they’re all using us to prevent fraud when money is being loaded into a wallet,” CEO and co-founder Soups Ranjan told TechCrunch in an interview. When customers move money to their wallet via credit card, debit card or ACH transfer, Sardine assigns a risk score to the card or bank account being used through an algorithm and assumes fraud liability for the transaction. The company has long offered these two features, but it announced today that it is unveiling instant ACH transfer capabilities, allowing customers to bypass the traditional three- to seven-day waiting period to access their funds. Sardine will do this by pre-funding the consumer’s crypto purchase in their wallet and taking on the fraud, regulatory and compliance risk associated with the waiting period.


Is DeFi signalling the great financial evolution?

DeFi presents a revolutionary new frontier, opening fresh opportunities to improve the global finance markets. From a utility perspective, the technology alone offers improved speed and security for facilitating financial transactions internationally. Moreover, this use case is one that popular cryptocurrency projects such as Stellar (XLMUSD) constantly pursue and improve. Additionally, DeFi sets a fascinating new precedent for enabling access to financial services. While bank accounts are the expected norm within most developed societies, the World Bank estimates that up to 1.7 billion people in the world do not have access to an account of their own. As a result, DeFi is championed by those in favour of enabling access among users who don’t transact via traditional bank accounts, now commonly referred to as the ‘unbanked’. As the concept continues to evolve, the question of how DeFi can be regulated will no doubt remain a pressing topic globally.


Cyprus: The Concept Of Control In DeFi Arrangements

With regard to DeFi arrangements specifically, the FATF Guidelines recognize that a DeFi application, that is, the underlying software, is not a CASP in accordance with FATF standards, as these do not apply to software or technology. Despite this, it is possible for persons such as creators, owners and operators of such applications to qualify as a CASP if they are found to be providing or actively facilitating CASP services. Whether they are classified as the obliged entity for AML/CFT law purposes will depend on whether they maintain control or sufficient influence over the arrangement. Owners or operators are often distinguished by their relationship to the activities undertaken by the arrangement. Control or sufficient influence may be exercised in a number of ways, for example, setting parameters, holding an administrative key, retaining access to the platform, or collecting fees or realizing profit. It is emphasized that a person may be found to be exercising sufficient control or influence over a DeFi arrangement even in instances where other parties are involved in the provision of the service, or where part of the process is automated.


The AI edge chip market is on fire, kindled by 'staggering' VC funding

Edge AI has become a blanket term that refers to almost everything that is not in a data center, though it may include servers on the fringes of data centers. It ranges from smartphones down to embedded devices that draw microwatts of power running tinyML frameworks such as Google’s TensorFlow Lite for Microcontrollers. The middle part of that range, where chips consume from a few watts of power up to 75 watts and usually take the form of a pluggable PCIe or M.2 card, is an especially crowded part of the market, said Demler. "PCIe cards are the hot segment of the market, for AI for industrial, for robotics, for traffic monitoring," he explained. "You've seen companies such as Blaize, Flex Logix -- lots of these companies are going after that segment." But the really low-power end is also quite active. "I'd say the tinyML segment is just as hot. ..." Most of the devices are dedicated to the "inference" stage of AI, in which a model makes predictions based on new data. Inference happens after a neural network has been trained, meaning that its tunable parameters have been developed fully enough to reliably form predictions and the program can be put into service.


If Scrum isn’t agile, a SaaS solution could be

There are certain situations where a software development organization simply can’t be agile. The first is where there is a fixed deadline. If there is a deadline, you almost certainly can’t make the adjustments that may be necessary to finish the project well. Changes to what you are coding almost always mean a change to the date. What happens if, when you do the sprint review with the customer, the customer asks for changes? You can’t roll back a whole sprint, redo the work in a different way, and then still ship according to a contracted date. If you can’t adapt to changing requirements, then you aren’t being agile. The second is when there is a fixed cost. This works very similarly to what I described in the previous paragraph — if you are only getting paid a set amount to do a project, you can’t afford to redo portions of the work based on customer feedback. You would very likely end up losing money on the project. Bottom line: If you have a fixed deadline or a fixed cost, you can’t strictly follow Scrum, much less the Agile Principles.


Can an AI be properly considered an inventor?

Thaler, an advocate of recognizing these devices as inventors, clearly believes the time has come, stating, “It’s been more of a philosophical battle, convincing humanity that my creative neural architectures are compelling models of cognition, creativity, sentience, and consciousness. … The recently established fact that DABUS has created patent-worthy inventions is further evidence that the system ‘walks and talks’ just like a conscious human brain.” (We should bear in mind, however, that an “author” in copyright is not an identical legal construction to that of an “inventor” in the domain of patents, though they are closely related concepts.) We also need to consider that the South African patent system does not involve an examination of the substance of an application; unlike many countries, it leaves both first consideration and final resolution of patent validity to the courts, and so the patent grant was in some sense automatic rather than policy-driven. Importantly, the U.S., the U.K., and the European patent offices (all of which do preliminary consideration of patentability) rejected this same patent application on the basis of its ineligibility.


MoleRats APT Flaunts New Trojan in Latest Cyberespionage Campaign

The campaign uses various phishing lures and includes tactics not only to avoid detection but also to ensure that its core malware payload attacks only specific targets, Proofpoint researchers wrote in the report. Some of the attacks observed by the team also delivered a secondary payload, a trojan dubbed BrittleBush, they said. NimbleMamba, the campaign’s core payload, is delivered as an obfuscated .NET executable built with third-party obfuscators; researchers believe the intelligence-gathering trojan is a replacement for LastConn, malware previously used by TA402. “NimbleMamba has the traditional capabilities of an intelligence-gathering trojan and is likely designed to be the initial access,” researchers wrote. “Functionalities include capturing screenshots and obtaining process information from the computer. Additionally, it can detect user interaction, such as looking for mouse movement.” MoleRats is part of the Gaza Cybergang, an Arabic-speaking, politically motivated collective of interrelated threat groups actively targeting the Middle East and North Africa.



Quote for the day:

"Leadership is a process of mutual stimulation which by the interplay of individual differences controls human energy in the pursuit of a common goal." -- P. Pigors

Daily Tech Digest - February 11, 2022

France’s privacy watchdog latest to find Google Analytics breaches GDPR

The decision on this complaint has clear implications for any website based in France that’s currently using Google Analytics — or, indeed, any other tools that transfer personal data to the U.S. without adequate supplementary measures — at least in the near term. For one thing, the CNIL’s decision notes it has made “other” compliance orders to website operators using Google Analytics (again without naming any sites). And given the joint work by EU regulators on these 101 strategic complaints, the ramifications are likely to scale EU-wide. The CNIL also warns that its investigation — along with the parallel probes being undertaken by fellow EU regulators — extends to “other tools used by sites that result in the transfer of data of European Internet users to the United States”, adding: “Corrective measures in this respect may be adopted in the near future.” So all U.S.-based tools that transfer personal data are facing regulatory risk. We’ve asked the CNIL which other tools it’s looking at and will update this report with any response.


Security Risks Facing Web3 Developers

One of the challenges to securing dApps in the new Web3 world is engaging security professionals in a meaningful way. A number of the cybersecurity experts I follow on Twitter have been dismissive of Web3 and blockchain technologies as fads at best and scams at worst. I asked Spanier what it will take to get more of these folks to engage with Web3. “For security professionals, here’s some advice to figure out if blockchain security interests you,” he replied. “Treat your initial plunge as an exploratory journey. Look at different security issues that have manifested themselves in the past, be they with smart contracts or core blockchains. These projects are mostly open, so you can look at their GitHub issues and patches. Review vulnerability write-ups and deconstructions of previous attacks. Projects affected by a compromise will typically post detailed write-ups. This would be a good start.” There’s a lesson for developers here too. Because so much of what’s being developed for Web3 is done in a very public way, there’s an opportunity to avoid the mistakes of others. As you develop, consider making a review of others’ mistakes part of your release process.
    

How to maximise value from IT vendor collaborations

It goes without saying that not everyone can be a maestro of everything. This is hugely applicable to IT businesses, which tackle complex tasks across several technologies and infrastructures on a daily basis. It becomes clear that IT companies running their strategies solo are not taking advantage of the possible strengths of great IT vendor collaborations, which bring together experts with different skills across a plethora of applications. In other words, it is likely that an IT partner with the appropriate talent and skillset would be capable of selecting the right solution for the client, rather than a primary vendor that does not specialise in the client’s specific needs. In order to develop and nurture productive partnerships, IT businesses must know when to partner, who to partner with, and how to add value to a partnership. A partnership is a value-added relationship that develops over time on a foundation of trust. It takes equal endeavour from both sides to evolve into an IT partnership that enables both parties to share their ideologies, work cultures, expertise, and strategies.


Embracing Agile Values as a Tech and People Lead

A key agile principle to me is “Embrace Change”, the subtitle of Extreme Programming Explained by Kent Beck. Change is continuous in our world and also at work. Accepting this fact makes it easier to let go of a decision that was taken once under different circumstances, and find a new solution. To change something is also easier if there is already momentum from another change. So I like to understand where the momentum is and then facilitate its flow. We had a large organizational change at the beginning of 2020. Some teams were newly created and everyone at MOIA was allowed to self-select into one of around 15 teams. That was very exciting. Some team formations went really well. Others didn’t. There were two frontend developers who had self-selected into a team that had less frontend work to do than expected. These two tried to make it work by taking over more responsibility in other fields, thus supporting their team, but after a year they were frustrated and felt stuck. Recognizing the right moment that they needed support from the outside to change their team assignment was very important.


AI Biometric Authentication for Enterprise Security

Biometric authentication technology has been an important industry trend for years, especially in 2021 due to the latest AI innovations available on the market. According to IBM, 20% of breaches are caused by compromised credentials. Worse, it can take an average of 287 days to identify and respond to a data breach. AI-based security is increasing in usage and will be necessary to remain competitive in any industry. IBM reports that as of 2021, 25% of businesses have completed deployment of AI-based security, while 40% are partially deployed. The remaining 35% have not begun this process, and if your business falls into this category you may be placing your clients at great risk of dangerous data breaches. By IBM’s 2021 figures, investing in AI-based security can save a business up to $3.81 million per breach. Being able to use artificial intelligence to identify and automatically respond to data breaches is incredibly important for protecting the data and privacy of a company and its customers. AI biometric authentication provides yet another safeguard against a data breach that is essential for businesses of any scale.


Graphene biosensor will drive new innovations in brain-controlled robotics

Recently, researchers have developed EEG sensors made from graphene, which offers excellent conductivity and biocompatibility. Graphene-based biosensors, however, often have low durability, corroding upon contact with sweat, and exhibit high skin-contact impedance that hampers the detection of signals from the brain. A novel graphene-based biosensor developed at the University of Technology Sydney aims to overcome these limitations, detecting EEG signals with high sensitivity and reliability – even in highly saline environments. The sensor, described in the Journal of Neural Engineering, is made from epitaxial graphene (EG) grown on a silicon carbide (SiC)-on-silicon substrate. This structure unites graphene’s favourable properties with the physical robustness and chemical inertness of SiC. “We’ve been able to combine the best of graphene, which is very biocompatible, very conductive, with the best of silicon technology, which makes our biosensor very resilient and robust to use,” says senior author Francesca Iacopi in a press statement.


Is artificial intelligence better than humans at making money in financial markets?

Our immediate observation was that most of the experiments ran multiple versions (in extreme cases, up to hundreds) of their investment model in parallel. In almost all the cases, the authors presented their highest-performing model as the primary product of their experiment – meaning the best result was cherry-picked and all the sub-optimal results were ignored. This approach would not work in real-world investment management, where any given strategy can be executed only once, and its result is unambiguous profit or loss – there is no undoing of results. ... Despite all their imperfections, empirical evidence strongly suggests humans are currently ahead of AI. This may be partly because of the efficient mental shortcuts humans take when we have to make rapid decisions under uncertainty. In the future, this may change, but we still need evidence before switching to AI. And in the immediate future, we believe that, instead of pitting humans against AI, we should combine the two. This would mean embedding AI in decision-support and analytical tools, but leaving the ultimate investment decision to a human team.
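The cherry-picking effect the researchers describe is easy to demonstrate. In the sketch below (all numbers invented), 100 "strategies" are pure noise with a true expected return of zero, yet reporting only the best of them makes the experiment look like a success:

```typescript
// Small deterministic PRNG (mulberry32) so the sketch is reproducible.
function mulberry32(seed: number): () => number {
  return () => {
    seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

const rand = mulberry32(42);

// One zero-skill strategy's simulated annual return: mean 0, s.d. ~10%,
// drawn via the Box-Muller transform.
function noiseReturn(): number {
  const u = Math.max(rand(), 1e-12);
  const v = rand();
  return 0.1 * Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// Run 100 strategies "in parallel" and cherry-pick the winner.
const runs = Array.from({ length: 100 }, noiseReturn);
const best = Math.max(...runs);
const mean = runs.reduce((a, b) => a + b, 0) / runs.length;

// The mean hovers near zero, but the single best run typically shows a
// large positive "return" despite having no skill behind it.
console.log(`best of 100: ${(best * 100).toFixed(1)}%`);
console.log(`mean of 100: ${(mean * 100).toFixed(1)}%`);
```

Publishing only `best`, as many of the surveyed papers effectively did, reports the luck of the draw; validating the selected strategy on data it has never seen is the minimal corrective.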


Future for Careers in Automation Looking Bright

Trivedi points out there are many places where people can begin their journey, starting in core IT or IT engineering before moving into automation engineering/SRE (Site Reliability Engineering). “Another good path is to start off in the technical support organization and learn more about the technical side of the product,” he says. “Then, you can take on small projects that help automate support issues, and you can use this experience to move into SRE. Many people also start off in a systems administration role before moving into an engineering role.” Nirmal says entry-level professionals with an understanding of and passion for automation technologies have “endless opportunities” to embark on a career path that provides tremendous value to a company's digital transformation and future growth. A key change in how organizations are approaching automation is through expanded use of AI and machine learning. This means IT workers must have knowledge of how AI advances automation and allows for more informed decisions that improve outcomes.


Pay attention to your attention

An overly intense focus on a goal can lead to what cognitive psychologists call goal neglect. That may seem counterintuitive to the average goal-oriented MBA or entrepreneur, but take, for example, the dynamic at work in micromanagement. Often, when leaders micromanage employees, an intense focus on task performance distracts those leaders from the larger goals of the company. They obsess over the trees and neglect the forest—and drive employees crazy while they’re at it. Where you direct your focus is a function of the brain’s attention system. This system has three subsystems, which Amishi Jha, a professor and the director of contemplative neuroscience for the Mindfulness Research and Practice Initiative at the University of Miami, describes as the flashlight (or orienting system), which enables you to selectively direct and concentrate your attention; the floodlight (or alerting system), which enables you to take in the larger picture; and the juggler (or executive function), which enables you to align your actions to your aims. “What happens with goal neglect is that the flashlight is pointed very intently, but the floodlight is not quite working,” she told me in a recent Zoom interview.


Governance: Your Data Mesh Self-Service Depends on It

Federated governance is a balancing act. While a producer of a data product should have full autonomy to build, populate and publish in any way they see fit, they must also ensure that it is in a form that is easy and reasonable for consumers to access and use. There are many parallels that can be drawn between the microservices domain and the data mesh domain: Both empower users to select the tools and technology that is best suited for their use cases while simultaneously offering resistance to technological sprawl, confusing implementations and difficulty in usage. For example, a microservice platform may restrict the languages that developers may use to a specific subset. In the data mesh, a similar analogy would be to restrict the format of data products such that only one or two mechanisms are the usable standards. In both cases, the goal isn’t to make life more difficult for the creators, but rather to limit the technological sprawl and implementation complexity, particularly if existing technologies and standards are more than sufficient to meet the product needs. 



Quote for the day:

"The actions of a responsible executive are contagious." -- Joe D. Batton

Daily Tech Digest - February 10, 2022

What is Relational Machine Learning?

Much of the recent deep learning research was then about discovering models and learning representations capturing data in various forms of sets and graphs. However, it is only rarely acknowledged that these structured learning representations have long been studied in Relational Machine Learning. A relation, as you might recall, is a subset of a cartesian product defined over some sets of objects. Every set is thus simply a degenerate case of some (unary) relation. Every graph can then be seen as an instantiation of a binary relation over the same set of objects (nodes). Tabular data with more than two columns (objects) then corresponds to a relation of higher arity, also known as a hypergraph. Add multiple such relations (tables) over the objects, and you have a relational database. Much of the real-world data is then stored in such relational databases, which you have certainly encountered before. Now imagine that your learning samples are not prepared nicely as rows in a single table, but spread across multiple interlinked tables of the database, where different samples consist of different types and numbers of objects, with each object being characterized by a different set of attributes.
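The progression described above (set, graph, table, database) can be made concrete in a few lines. The names and data in this sketch are invented purely for illustration:

```typescript
type Person = string;

// A set is a degenerate (unary) relation: a subset of the object set.
const employees: Set<Person> = new Set(["ada", "grace", "alan"]);

// A binary relation over Person x Person is a directed graph:
// each pair is one edge of a "managed-by" graph.
const managedBy: Array<[Person, Person]> = [
  ["grace", "ada"],
  ["alan", "ada"],
];

// A 3-column table is a ternary relation, i.e. a hyperedge linking
// three objects at once: (Person, project, hours-per-week).
const worksOn: Array<[Person, string, number]> = [
  ["grace", "compiler", 30],
  ["alan", "crypto", 25],
];

// Several relations over shared objects form a relational database,
// and one learning sample can span multiple interlinked tuples, e.g.
// every fact touching a single person:
function sampleFor(p: Person) {
  return {
    isEmployee: employees.has(p),
    manager: managedBy.filter(([who]) => who === p),
    projects: worksOn.filter(([who]) => who === p),
  };
}

const graceSample = sampleFor("grace");
```

Note that `graceSample` is exactly the kind of variable-sized, multi-relation learning sample the passage describes, as opposed to a fixed-width row in a single table.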


Blockchain technology in financial services

Decentralised financial operations have benefitted greatly from networks built on public blockchain. Hosting transaction data on these networks allows for transparency and visibility for all users involved. However, Conor Svensson, founder and CEO of Web3 Labs, believes that regulators need to bring this infrastructure further up their agendas for financial service bodies to drive true value. Svensson explained: “We have seen financial institutions offer institutional cryptocurrency products to meet the demands of institutional investors. Whilst on the surface, this may not sound like a key value driver, other than providing additional revenue sources for the firms making such access possible, I believe it is a key step in facilitating mass adoption of some of the key innovations taking place on public blockchain networks such as Ethereum. “With the innovations during the past two years in DeFi, investors are able to get yields that far surpass what is currently available in traditional bank savings accounts. The savvy users of DeFi can see returns on crypto assets that are pegged to the US dollar achieve yields of 7%+ per annum, far exceeding what people can get in bank accounts.


The power of AI can be unleashed with a focus on ethics

Consumers’ concerns about AI should not be ignored – and they are not insurmountable. As businesses invest in AI, they need to adopt a wider perspective that is not focused purely on the technology itself. There are three strands to this approach for government, regulators and businesses to tackle, each of which can address ethical questions and improve trust and confidence. The first of these is building trust into the foundations of AI through a human-led approach, starting with the design of AI algorithms. Input data needs to be bias-free, while the people behind AI tools need to have the right training to ensure bias-free outputs. Above all, there should be transparency and accountability in how decisions are made by AI. Citing technological limitations as a reason for a lack of transparency or not owning accountability for AI’s decision-making won’t be accepted by consumers – or regulators. To bridge the trust gap, collaboration is key and AI stakeholders must work as an ecosystem to mitigate risks. Businesses must ensure AI is a board-level agenda item and a core part of their overall strategy, and there must be a proactive dialogue between government, suppliers, consumers and regulators.


Alibaba Open-Sources AutoML Algorithm KNAS

Researchers from Alibaba Group and Peking University have open-sourced Kernel Neural Architecture Search (KNAS), an efficient automated machine learning (AutoML) algorithm that can evaluate proposed architectures without training. KNAS uses a gradient kernel as a proxy for model quality, and uses an order of magnitude less compute power than baseline methods. The algorithm and a set of experiments were described in a paper published in Proceedings of Machine Learning Research. Unlike many AutoML algorithms which require that proposed models undergo the full training process, KNAS can predict which model architectures will perform well without actually training them, thereby potentially saving many hours of compute time. When evaluated on the NAS-Bench-201 computer vision benchmark, KNAS achieved a 25x speedup while producing results with "competitive" accuracies compared to other AutoML methods. When evaluated on text classification tasks, KNAS produced models that achieved better accuracy than a pre-trained RoBERTa baseline model.


Charge DeFi: How Algorithms Create Stability in a Decentralized Way

Charge DeFi is a combination of an algorithmic crypto token and rebase mechanics. A stablecoin is a cryptocurrency whose value is pegged to a single unit of a fiat currency, usually 1 USD. Normally this works by means of “tethering”, in which a company acquires an equivalent amount of, say, USD and promises to back each unit of stablecoin 1:1. However, inherent in this mechanism is the requirement to trust the guarantor, which requires constant and often expensive monitoring. An algorithmic crypto token takes stability to the next level. Instead of a fixed peg, an algorithm is used to steer the price of the token based on pre-set conditions, which can be written into a smart contract and launched in a fully decentralized way. Consequently, there is no input by any third party thereafter, with the algorithm executing according to demand, supply and market movements. ... One of the core features of this new ecosystem is the rebase mechanic implemented in the contracts. Rebase mechanics implement price-elastic tokens that adjust the circulating supply to influence the token price.
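As a purely hypothetical illustration of one rebase step (the proportional rule and all numbers are invented, not Charge DeFi's actual contract logic), supply is scaled with the price's deviation from the peg, so every holder's balance changes by the same factor:

```typescript
const TARGET_PRICE = 1.0; // the USD peg

// Scale total supply in proportion to the price's deviation from the
// peg. Every holder's balance changes by the same factor, which is
// what makes the token "price-elastic".
function rebase(totalSupply: number, marketPrice: number): number {
  const deviation = (marketPrice - TARGET_PRICE) / TARGET_PRICE;
  // Price above peg -> expand supply (dilute); below peg -> contract.
  return totalSupply * (1 + deviation);
}

const expanded = rebase(1_000_000, 1.05); // price 5% high: ~1,050,000 tokens
const contracted = rebase(1_000_000, 0.9); // price 10% low: ~900,000 tokens
```

In a real deployment this logic would live in a smart contract triggered on-chain at set intervals; the point of the sketch is only that supply, not a reserve of collateral, is the control variable.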


The innovation behind AI at Scale

One important approach to advancing model capabilities is training expert models on subtasks using an ensemble method named Mixture of Experts (MoE). The MoE architecture also keeps computation sublinear with respect to model parameters, which provides a promising path to improving model quality by scaling out to trillions of parameters without increasing training cost. We have also added MoE models to our pretrained model family, accelerated with DeepSpeed. These large-scale pre-trained models become platforms and can be adapted to specific domains or tasks by using domain-specific data in a privacy-compliant manner. We refer to this collection of base and domain-adapted models as “AI models as a platform”; they can be used directly to build new experiences with zero-shot/few-shot learning, or used to build more task-specific models by fine-tuning with a labeled dataset for the task. In a similar fashion, you can domain-adapt or fine-tune these models with your own enterprise data privately within the scope of your tenant and use them in your enterprise applications to learn representations and concepts unique to your business and products.


IT leadership: 8 tips to improve resiliency

There’s no rest for IT, though. The year ahead promises to put technology organizations to the test as they enable enterprises to meet rapidly changing customer demands, create better employee experiences, mitigate complex security challenges, effectively and ethically integrate emerging artificial intelligence (AI) tools, consider the impact of climate change, and ensure resilient business systems in an uncertain global environment. The only way for CIOs to tackle these challenges – and avoid burnout – is to improve their own resilience as leaders and as individuals. “When we’re exhausted and we’re burned out, we cannot think big,” says Adam Markel, author of the forthcoming book, Change Proof: Leveraging the Power of Uncertainty to Build Long-Term Resilience. “And [IT leaders] have to constantly be bigger than the problems that are presented. They have to be able to think outside of the bounds of whatever challenge or problem is coming at them.” To develop resilience, it’s important to understand what it is – and isn’t. “Resilience is actually about how we recharge, not about how we endure,” says Markel. 


TypeScript and the Power of a Statically-Typed Language

TypeScript came about within Microsoft in 2012 as a way to help its C++ and C# developers write large-scale web applications, recalled Hoban, who was then part of the developer tools group at Microsoft. The natural choice was JavaScript, the dominant language for the web. But JavaScript didn’t offer much tooling to help developers organize, assist and check their code writing. Various early solutions from Microsoft and other web-scale companies were mostly efforts to compile traditional languages such as C++ into JavaScript, though this left developers at “an arm’s length from the platform,” Hoban said. They wanted to develop a language that had the “ability to express types into the syntax as a lightweight extension to JavaScript, but … still offer as much sort of inference as possible, so that you need to use those annotations sort of as infrequently as possible,” Hoban said. The answer came in making a superset of JavaScript, fully compatible with the language, that could be transpiled back into plain JavaScript for full browser compatibility. To add rigor, it came with additional type checking. 
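Hoban's point about inference can be seen in a few lines of TypeScript (a hypothetical example, not from the article): one annotation on the parameter is enough for the compiler to infer everything else.

```typescript
// One lightweight annotation; TypeScript infers the rest.
function total(prices: number[]) {
  // Inferred: the arrow function's parameters are numbers, `sum` is a
  // number, and total's return type is number. No extra annotations needed.
  const sum = prices.reduce((acc, p) => acc + p, 0);
  return sum;
}

// Object shapes are inferred too; no interface declaration is required here.
const order = { id: 42, items: [3, 7] };
const due = total(order.items);

// total(order.id);  // would be a compile-time error: number is not number[]
```

Strip the single `: number[]` annotation and the snippet is plain JavaScript, which is the "lightweight extension" idea in practice.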


Turn the “great resignation” into the “great renegotiation”

Workers are leaving their employers in droves, seeking greater fulfillment and better pay, among other opportunities. This great resignation, though, could instead be a great renegotiation. Leaders have a chance now to redesign their organizations in a way that’s more exciting and fulfilling for employees. Doing so requires businesses to rethink their fundamental ways of operating. It means putting everything on the table, including roles, schedules, key performance indicators, individual performance metrics, and more. It takes time, energy, and work. But the incentive is clear. ... Internal negotiations are often much more difficult than external ones, INSEAD professor Horacio Falcão and financial services professional Alena Komaromi explain in the INSEAD Knowledge blog. People who work together often assume they’ll have similar aims, so they’re more likely to be underprepared and fail to consider one another’s conflicting interests. Internal power dynamics also often hamper negotiations. It helps to train employees on how to negotiate.


Cross-Architecture Capabilities: Thinking With GPUs

Outside of specialized applications, most programmers have traditionally kept their focus on CPUs. Until recently, CPUs were the only substantial processing power in most machines. However, with powerful GPUs now a standard component in a wide variety of devices, the CPU is no longer the only game in town. Developers can leverage the additional processing power of GPUs and other processors in conjunction with traditional CPUs. To do so, they need to adopt parallel programming techniques to create applications with cross-architecture capabilities. Intel’s oneAPI provides a powerful suite of tools for creating software that takes full advantage of cross-architecture resources. oneAPI enables developers to think about the possibilities of enhancing their applications with some extra GPU processing power. One of oneAPI’s critical features is code reusability. Using Intel’s Data Parallel C++ (DPC++), a C++-based implementation of the SYCL standard for data parallelism, developers can write code that works with CPUs, GPUs, and FPGAs. 



Quote for the day:

"Hiding from yourself is the surest path to self hatred, self pity and a whole lot of missed potential." -- Jon Westernberg

Daily Tech Digest - February 09, 2022

AI and Machine Learning Salaries Drop

According to Dice’s numbers, competition for machine learning, natural language processing, and AI experts softened in 2021, with average salaries dropping 2.1 percent, 7.8 percent, and 8.9 percent respectively. This comes on the heels of repeated—and sometimes dramatic—increases in recent years. Average U.S. salaries for software engineers with expertise in machine learning, for example, jumped 22 percent in 2019 over 2018, then went up another 3.1 percent in 2020. “There are a variety of factors likely contributing to [these] decreases,” Dice chief marketing officer Michelle Marian told IEEE Spectrum, “with one important consideration being that more technologists are learning and mastering these skill sets. The increases in the talent pool over time can result in employers needing to pay at least slightly less, given that the skill sets are easier to find. We have seen this occur with a range of certifications and other highly specialized technology skills.” ... Demand seemed to be slowing for cybersecurity analysts and data engineers, however. Up 16.3 percent in 2020 from 2019, the average cybersecurity analyst salary fell 0.8 percent in 2021 to $102,253, and data engineers saw an average salary drop of 1.1 percent, to $117,295, after a 4.7 percent increase in 2020.


A New Trick Lets Artificial Intelligence See in 3D

The new approach involves using a neural network to capture and generate 3D imagery from a few 2D snapshots, a technique dubbed “neural rendering.” It arose from the merging of ideas circulating in computer graphics and AI, but interest exploded in April 2020 when researchers at UC Berkeley and Google showed that a neural network could capture a scene photorealistically in 3D simply by viewing several 2D images of it. That algorithm exploits the way light travels through the air and performs computations that calculate the density and color of points in 3D space. This makes it possible to convert 2D images into a photorealistic 3D representation that can be viewed from any possible point. Its core is the same sort of neural network as the 2012 image-recognition algorithm, which analyzes the pixels in a 2D image. The new algorithms convert 2D pixels into the 3D equivalent, known as voxels. Videos of the trick, which the researchers called Neural Radiance Fields, or NeRF, wowed the research community. “I’ve been doing computer vision for 20 years, but when I saw this video, I was like ‘Wow, this is just incredible,’” says Frank Dellaert, a professor at Georgia Tech.
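The computation described above is volume rendering: the network predicts a density and a color at sample points along each camera ray, and the pixel color is a transmittance-weighted sum of those colors. A simplified sketch of the per-ray accumulation, following the general shape of the published NeRF formulation (names here are illustrative):

```typescript
// Simplified volume rendering along one camera ray.
// alpha_i = 1 - exp(-sigma_i * delta); each sample contributes its color
// weighted by how much light still reaches it (the transmittance).
function renderRay(sigmas: number[], colors: number[][], delta: number): number[] {
  let transmittance = 1; // fraction of light reaching the current sample
  const pixel = [0, 0, 0];
  for (let i = 0; i < sigmas.length; i++) {
    const alpha = 1 - Math.exp(-sigmas[i] * delta);
    const weight = transmittance * alpha;
    for (let ch = 0; ch < 3; ch++) pixel[ch] += weight * colors[i][ch];
    transmittance *= 1 - alpha; // later samples see less light
  }
  return pixel; // accumulated RGB for this ray
}
```

In the real system the densities and colors come from the neural network evaluated at 3D positions and viewing directions; training adjusts the network so rendered rays match the input 2D photographs.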


4 Exciting DeFi Projects Worth Watching In 2022

Parallel Finance is another promising DeFi 2.0 project that aims to add liquidity and increase the scope of DeFi across the Polkadot blockchain. This decentralized money market protocol offers many products, including lending, staking, and borrowing within the Polkadot ecosystem. The core products offered by Parallel Finance include an AMM, the first money market for Polkadot and Kusama ecosystems, insured staking, margin staking, leverage staking, and algorithmic staking. Parallel Finance’s unique lending design can disrupt the DeFi 1.0 narrative, adding higher yields and increased opportunities for participants. ... By combining three different decentralized Polkadot-powered protocols into a unified solution, Parallel Finance unlocks innovative yield opportunities and enables participants to lend and stake simultaneously. To date, Parallel Finance has the highest TVL in the Polkadot ecosystem after winning the fourth Kusama parachain slot by locking up more than $239 million worth of KSM tokens. Besides this, Parallel Finance is backed by industry leaders like Lightspeed, Polychain Capital, Alameda Research, and Blockchain Capital.


How Attack Surface Management Preempts Cyberattacks

ASM is a technology that either mines internet datasets and certificate databases or emulates attackers running reconnaissance techniques. Both approaches aim to perform a comprehensive analysis of your organization's assets uncovered during the discovery process, scanning your domains, sub-domains, IPs, ports, shadow IT, etc., for internet-facing assets before analyzing them to detect vulnerabilities and security gaps. Advanced ASM includes actionable mitigation recommendations for each uncovered security gap, ranging from cleaning up unused and unnecessary assets to reduce the attack surface to warning individuals that their email address is readily available and might be leveraged for phishing attacks. ASM also includes reporting on Open-Source Intelligence (OSINT) that could be used in a social engineering attack or a phishing campaign, such as personal information publicly available on social media or in material such as videos, webinars, public speeches, and conferences.


6 reasons we need to start thinking about quantum computing now

Companies are exploring applications that could help with portfolio optimization, secure communications, transaction settlements and ultrafast trading platforms. ... With quantum computing, they crunched the data in less than three minutes, compared with 30 hours on a traditional computer. ... Today’s customers expect high degrees of personalization, and quantum computing can help deliver on that expectation — again through its unique power to analyze huge numbers of potential combinations of options or solutions. “There’s clearly a trend toward the idea of ‘bank of me,’ ” Flinter says. “Customers are looking for highly personalized solutions, and I think quantum becomes a part of that solution.” ... In the past few years, quantum computing has moved from the research lab into engineering, and it can quickly solve important, complex problems as it simultaneously considers all possible solutions. Though the threat is likely years away, bad actors using quantum computing’s power may be able to more easily decrypt messages sent using existing public-key cryptography and ultimately compromise the security of key systems that are not quantum-proofed.


Quantum Complexity Tamed by Machine Learning

A major goal of DFT research is to find more accurate approximations of that universal functional. John Perdew, a physicist at Temple University and a leading functional developer, has long spearheaded this work. He describes the path toward the universal functional as like climbing the rungs on a ladder. On each rung, physicists add new ingredients to the functional. The simplest ingredient is just the thickness of the electron stew in each location. On the next rung, the functional also considers how quickly the thickness changes from place to place, giving researchers a broader view and making the functional more precise. A key part of Perdew’s strategy is to use physical reasoning to identify certain mathematical properties that good approximations must obey, known as “exact constraints.” Higher rungs satisfy more of these constraints, and researchers have to search harder to find equations that obey them all. Perdew’s group started tackling third-rung functionals, which blend six ingredients, in 1999, and in 2015, he released a state-of-the-art functional called SCAN. It was his eighth attempt, and the first to obey all 17 known constraints relevant on the third rung.


To save independent journalism, media must embrace web3 innovation

There’s been a lot of talk about journalism and artificial intelligence (AI). Importantly, news organizations have been concretely working on a wide variety of AI solutions, from content recommendation and moderation to advanced analytics, deep-fake detection and optimizing subscription pipelines. But have we seen anything truly disruptive and transformative yet? Something that will really move the needle for news organizations in the longer run, giving them a business advantage in the digital information ecosystem? Adoption of AI technologies is still slow in the field partly due to the high cost of development. Additionally, news media needs more use cases and business cases for AI that have been born out of pure journalistic thinking and need. What about web3 and metaverse, and their combination? So far, news organizations haven’t actively explored or developed solutions for these emerging digital ecosystems. Yes, there is a lot of hype and misconception related to both, but it doesn’t mean that web3 and metaverse won’t become “the next big thing.”


Detect active network reconnaissance with Microsoft Defender for Endpoint

Spending quality time on reconnaissance is guaranteed to pay dividends at subsequent stages of the cybersecurity kill chain, and it is a crucial indicator from which to detect malicious activity at the earliest possible stages of an attack. Unfortunately, it is also a focus area that security professionals overlook in favor of more exotic phases of an attack such as weaponization, delivery, and exploitation. This is because reconnaissance can be notoriously difficult to detect: it is often passive, executed in the public domain, and external to the victim’s network. Job postings, domain registrars, and financial reporting are great resources for gaining context on a target, but due to their nature, it can be a challenge for even seasoned security operations teams to detect and investigate this type of attack. Fortunately, there is a certain point where passive activities have diminishing returns, as the data points produced can be too broad to act upon. This can drive an attacker to consider active reconnaissance, which depends on the endpoint or human interaction.


5 Reasons You Don’t Always Need a JavaScript Framework

With the release of JavaScript ES6, the necessity for frameworks isn’t nearly as great as it once was. And with the inclusion of modules and classes in ES6, most of the popular frameworks could be thought of as obsolete. Remember, frameworks are used to abstract some of the more complex aspects of development. With the release of ES6, a number of new features (such as default parameters, template literals, multi-line strings, destructuring assignment, enhanced object literals, and arrow functions, along with modules and classes) reduce the need to even bother with frameworks for many use cases. Prior to ES6, working with things like classes was a major challenge, as earlier iterations of the language had no dedicated class syntax and classes had to be emulated with prototypes. These new features of JavaScript go a long way toward making frameworks a thing of the past, because they bring certain functionalities to the language that were previously unavailable. Instead of having to employ those third-party tools, most everything will be pre-packaged within the language itself.
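Several of the features named above are easy to show side by side; the snippet below is plain modern JavaScript (and therefore also valid TypeScript), with no framework involved.

```typescript
// Default parameters, arrow functions, and template literals:
const greet = (name = "world") => `Hello, ${name}!`;

// Destructuring assignment pulls fields out of an object in one step:
const config = { host: "localhost", port: 8080 };
const { host, port } = config;
const url = `${host}:${port}`;

// Native class syntax, previously emulated with prototypes and closures:
class Counter {
  count: number;
  constructor(start = 0) {
    this.count = start;
  }
  increment() {
    return ++this.count;
  }
}
```

Each of these would once have been a reason to reach for a utility library or framework; now the language covers them directly.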


Web3: Empty Buzzword Or The Next Evolution Of The Web?

Web3 is the vision to flip this paradigm on its head. The end goal is a re-architecture of the internet enabling users to own their data in decentralized environments through the use of open source protocols and applications. This model may eliminate the need for centralized gatekeepers, or at least reduce reliance on individual companies. If this vision is properly manifested, data would be ownable and easily transferable by users, and users would be compensated directly for contributing to the network. Web3 aims to make users “uncancellable” as it strips power away from centralized service providers that can unilaterally deplatform users and disconnect them from their audiences. This empowers users to more freely express themselves, produce content and have interactions without fear of personal or monetary consequences, increasing value for all participants within a network. This can even have political ramifications as it may encourage activists and dissidents to pursue their goals more openly and vigorously.



Quote for the day:

"When you expect the best from people, you will often see more in them than they see in themselves." -- Mark Miller

Daily Tech Digest - February 08, 2022

Top 10 Anticipated Web 3.0 Trends For 2022

Web 3.0 will ensure that peer-to-peer rules are properly known and learned through blockchain, bringing cryptography and consensus algorithms together to underpin decentralization methods and to serve as an alternative to the standard databases in use today. Decentralization ensures the user’s sole ownership of their data: only that user will have access to whatever data is uploaded, altered, saved and utilized. No third party is involved (the government, for example), nor can anyone dictate when and how the data is used. ... Social media is developing its platforms on the decentralized technology of Web 3.0. This means that centralized features will no longer (or only partly) be available on social media platforms in the near future. Blockchain ledgers will be used to construct the new social media industry. Web 3.0 addresses problems such as privacy breaches, mismanaged data, and unauthentic and irrelevant information that were part of the previous generation of the internet. It offers a safe and secure place for users to participate, and decentralization ensures protection and security for every piece of data added to the internet.


DAOs are meant to be completely autonomous and decentralized, but are they?

If DAOs are to remain true to their nature, where the community is able to make decisions equally, decentralization needs to happen in stages; however, a certain level of control is required so that common prosperity is maintained across the organization. While involved communities should be given the power to make proposals and decisions, gatekeepers or councils may be required to effectively maintain the core values of the company. Most successful DAOs, including Uniswap, MakerDAO, PieDAO, Decred and more, have different systems of gatekeeping where proposals go through various stages before being accepted. For example, Uniswap’s governance protocol has multiple stages of execution before any proposal is accepted. Its last stage is a group of elected users that have the power to halt the implementation of any proposal it deems malicious or unnecessary. On the other hand, MakerDAO has a more open community where people don’t need to hold its token to participate in off-chain voting. Yet its proposals undergo strict scrutiny.


Database Management Trends in 2022

Augmented Data Management uses machine learning and artificial intelligence to automate Data Management tasks, such as spotting anomalies within large amounts of data and resolving Data Quality issues. The AI models are specifically designed to perform Data Management tasks, taking less time and making fewer errors. Todd Ramlin, a manager of Cable Compare, in describing the benefits of augmented Data Management, said, “Historically, data scientists and engineers have spent the majority of their time manually accessing, preparing, and managing data, but Augmented Data Management is changing that. ADM uses artificial intelligence and machine learning to automate manual tasks in Data Management. It simplifies, optimizes, and automates operations in Data Quality, Metadata Management, Master Data Management, and Database Management systems. AI/ML can offer smart recommendations based on pre-learned models of solutions to specific data tasks. The automation of manual tasks will lead to increased productivity and better data outcomes.”


How open source is shaping data storage management

While open source data storage software is cost-effective, there is a big difference between downloading a project for free to try out on a developer machine and using it to power mission-critical applications with stringent requirements such as stability, high availability and security. Ghariwala notes that enterprises will need strong technical resources to architect a solution that supports their mission-critical application requirements, as well as dedicated resources to triage production issues. ... The second challenge that enterprises may face relates to flexibility, which is not guaranteed when using open source technologies. Ghariwala says the problem generally arises when vendors only support their own technologies with their commercial open source solutions, creating lock-in and limiting an organisation’s ability to choose the right solution for their needs. Danny Elmarji, vice-president for presales at Dell Technologies in Asia-Pacific and Japan, notes that some Dell customers are starting to define and use their own software storage that runs on Dell’s hardware and compute, leveraging open-source contributions.


What Is Object Storage?

The database retains a unique identifier for each object. The 64-bit Object ID (OID) indicates the location of the object on a single storage medium or among a cluster of storage devices. Unlike block storage, which allocates storage in predefined blocks of equal length, the lengths of objects can vary. As noted, the relatively simple system of keeping track of objects makes it possible to extend a single object storage system across multiple storage resources. A file storage system, on the other hand, has a defined limit on the number of files it can manage. While some NAS file systems may be quite large, they generally can’t expand to the degree that object storage can. Another distinguishing characteristic of object storage is the way it handles metadata related to each stored object. A file system -- like the Windows file directory on a PC or a shared NAS system -- includes some basic metadata related to each file it manages, such as file name, file size, date created, date modified and possibly the application it’s associated with.
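The distinctions above can be made concrete with a toy in-memory model (all names are hypothetical, and a real OID is a 64-bit identifier rather than a string counter): objects live in a flat namespace keyed by OID, can be any length, and carry arbitrary per-object metadata rather than a fixed set of file attributes.

```typescript
// Toy object store: flat OID namespace, variable-length data, rich metadata.

interface StoredObject {
  data: Uint8Array;                  // objects are not fixed-length blocks
  metadata: Record<string, string>;  // arbitrary, extensible metadata
}

class ObjectStore {
  private objects = new Map<string, StoredObject>();
  private nextId = 1; // stands in for a 64-bit OID generator

  put(data: Uint8Array, metadata: Record<string, string> = {}): string {
    const oid = (this.nextId++).toString(16).padStart(16, "0");
    this.objects.set(oid, { data, metadata });
    return oid; // the caller retrieves by OID, not by directory path
  }

  get(oid: string): StoredObject | undefined {
    return this.objects.get(oid);
  }
}
```

Because lookups go through a single OID index rather than a directory tree, the same model extends naturally across many storage devices, which is the scalability property the article describes.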


What Architects Need to Know

Dealing with Business Concepts – while this one should be a no-brainer, it is met with open scorn in many places, where business skills are reserved for only the highest-level architects. These concepts include Business Models, Customer Journeys with Personas, Capabilities with Objectives, Value Methods, and Investment Planning with some Roadmapping. ...  Technology Design and Delivery – this is a deep and interesting dialog in industry: how much business AND how much technology? If a product owner wants to become an architect, what technology should they learn? How deep do they go? At a minimum: Design including Patterns, the primary Requirements/Decisions/Quality Attributes relationships, Architecture Analysis, Deliverables, Products/Projects, Services, and Quality Assurance. ... Dealing with Stakeholders – often overlooked, always under-trained, and never given enough time or techniques, dealing with stakeholders is the hardest part of the job. Humans are mercurial, the lines of decision traceability and influence are blurred, and the lifecycle management of companies is effectively chaos, with lots of petty power plays and even more at stake in financing and final outcomes.


BigQuery vs Snowflake: The Definitive Guide

Snowflake offers auto-scaling and auto-suspend features that let clusters stop or start during busy or idle periods. With Snowflake, users cannot resize nodes, but they can resize clusters in a single click. Additionally, Snowflake enables you to auto-scale up to 10 warehouses, with a limit of 20 queued DML statements per table. On a similar note, BigQuery automatically provisions additional compute resources as needed and takes care of everything behind the scenes. ... Both platforms let you scale up and down automatically based on demand. Additionally, Snowflake gives you the ability to isolate workloads across businesses in different warehouses so that different teams can operate independently with no concurrency issues. ... Snowflake automatically provides encryption for data at rest. However, it does not provide granular permissions for columns, though it does provide permissions for schemas, tables, views, procedures, and other objects. Conversely, BigQuery provides security at the column level as well as permissions on datasets, individual tables, views, and table access controls.


4 metaverse tools that tackle workplace collaboration

By now, most of us have come to realize that the next normal won’t look much like it used to. The pandemic has taught us that turbulent and unpredictable times require flexibility and an open mind. Meanwhile, technology companies have been delivering highly competitive technologies to win both mind and market share. ... Facebook is so committed to the metaverse that it even changed the company’s name to Meta. Meta is also looking at ways to bring the metaverse to the workplace: Its Horizon Workrooms enables users to wear a virtual reality (VR) headset to feel like they’re attending an in-office meeting. Meanwhile, Microsoft is also working on bringing the metaverse to work. In 2022, Microsoft Teams users will be able to replace their video streams with 3D avatars of themselves. On the plus side, this lets people maintain a physical presence even when they’re not feeling particularly camera-ready. But at the same time, replacing ourselves with idealized avatar caricatures may further exacerbate the mental health impact of seeing our natural faces ‒ and all of our flaws ‒ filtered away. 


Five blockchain use cases: from property to sustainability

“Blockchain could significantly enhance upstream, midstream and downstream operations throughout the oil and gas sector. It has the potential to make a great deal of the sector’s bureaucracy significantly more efficient, for example making it easier and quicker to confirm when third-party suppliers complete tasks so that funds can be released in a far more timely way. It can also be used to monetise reserves in a way that has not previously been possible, tokenising confirmed but not yet exploited deposits to help investors, exploration and production firms, and refining and processing operations, manage their activities and balance sheets. “The deeper we look at the potential of the blockchain in the oil and gas sector, the wider the range of opportunities from digitalising global oilfield datasets becomes. Distributed ledger technology allows for permanent transparency on a trust-protocol that integrates cloud-based servers. The approach that we are taking requires graphic processing units and high-performance computers. ...”

Communicating the Importance of Engineering Work in Business Terms

Whenever you can, bring data to the discussion. This data should be metrics related to business outcomes. Measure things like bug rates, average time it takes to deliver a feature, employee satisfaction, and customer satisfaction. A great set of metrics to pull out are the ones that come from Accelerate. You may need to show how metrics such as lead time, deploy frequency, mean time to recovery, and change failure rate directly predict improved business outcomes. But as much as possible, use metrics that speak to the problems and concerns that are top of mind for your partner. For example, let's say you are seeing that engineers are really struggling to understand a particular component - it is complex, poorly tested and poorly documented. What's the business impact of this? Likely, it takes longer to ship a feature because of a long cycle of testing and debugging, and even when it is shipped, it's probably going to have more bugs. So maybe look at the time to build a feature that touches this component versus ones that don't, and see if you can show a significant difference.
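A minimal sketch of how two of the Accelerate (DORA) metrics mentioned above could be computed from a deployment log; the record shape here is invented for illustration.

```typescript
// Hypothetical deployment record; a real pipeline would pull these fields
// from version control and the deploy/incident tooling.
interface Deploy {
  committedAt: number; // ms epoch of the change's first commit
  deployedAt: number;  // ms epoch of the production deploy
  failed: boolean;     // did this deploy cause a failure in production?
}

const HOUR = 3600000;

// Lead time for changes: average commit-to-deploy time, in hours.
function leadTimeHours(deploys: Deploy[]): number {
  const total = deploys.reduce((sum, d) => sum + (d.deployedAt - d.committedAt), 0);
  return total / deploys.length / HOUR;
}

// Change failure rate: share of deploys that caused a production failure.
function changeFailureRate(deploys: Deploy[]): number {
  return deploys.filter((d) => d.failed).length / deploys.length;
}
```

Tracked over time and segmented (for example, features touching the problem component versus those that don't), numbers like these turn an engineering complaint into the kind of business-outcome evidence the article recommends.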




Quote for the day:

"Good management is the art of making problems so interesting and their solutions so constructive that everyone wants to get to work and deal with them." -- Paul Hawken

Daily Tech Digest - February 07, 2022

Despite recent progress, AI-powered chatbots still have a long way to go

Even state-of-the-art systems struggle to have a human-like conversation without tripping up, clearly. But as these systems improve, questions are arising about what the experience should ultimately look like. Values, dialects, and social norms vary across cultures, ethnicities, races, and even sexual identities, presenting a major challenge in designing a chatbot that works well for all potential users. An ostensibly “safe,” polite, and agreeable chatbot might be perceived as overly accommodating to one person but exclusionary to another. Another unsolved problem is how chatbots should treat controversial topics like religion, illegal activities, conspiracy theories, and politics — or whether they should opine about these at all. A recent paper coauthored by researchers at Meta explores the potential harm that might arise from chatbots that give poor advice, particularly in the medical or psychological realms. In a prime example, OpenAI’s GPT-3 language model can be prompted to tell a person to commit suicide.


Four emerging data integration trends to assess

Modern data integration technologies focus on advanced automation, connected data intelligence and persona-based interactive tooling, helping organisations to accelerate various use cases and other data integration requirements. Distributed hybrid and multicloud data is creating new integration challenges. Data lives everywhere, so centralising it into data lakes or data hubs to support business insights is no longer practical, especially with the explosion of data at the edge. Forrester expects the adoption of data integration systems to proliferate in the coming years as organisations look for supporting insights across multicloud, hybrid cloud and edge environments. Artificial intelligence (AI) is driving the next level of data integration solutions. New and innovative AI features are helping enterprises to automate data integration functions, including data ingestion, classification, processing, security and transformation. Although AI capabilities within data integration are still emerging, areas that technology architecture and delivery leaders can leverage today include the ability to discover connected data, classify and categorise sensitive data, identify duplicates and orchestrate silos.


Engineering EDA and microservices applications for performance

Microservices application architecture is taking root across the enterprise ecosystem. Organizing and efficiently operating microservices in multicloud environments and making their data available in near-real time are some of the key challenges enterprise architects confront with this design. Thanks to developments in event-driven architecture (EDA) platforms (such as Apache Kafka) and data-management techniques (such as data mesh and data fabrics), designing microservices-based applications has become much easier. However, to ensure that microservices-based applications perform at requisite levels, you must consider critical non-functional requirements (NFRs) in the design. NFRs address how a system operates, rather than how the system functions (the functional requirements). The most important NFRs involve performance, resiliency, availability, scalability, and security. This article describes designing for performance, which entails low-latency processing of events and high throughput. Future articles will address the other NFRs.
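As a minimal illustration of the decoupling at the heart of EDA (a toy in-memory bus, not Kafka; the names are invented): producers publish to a topic and never call consumers directly, which is what lets each microservice scale and fail independently.

```typescript
// Toy publish/subscribe event bus.
type Handler<E> = (event: E) => void;

class EventBus<E> {
  private handlers = new Map<string, Handler<E>[]>();

  subscribe(topic: string, handler: Handler<E>): void {
    const list = this.handlers.get(topic) ?? [];
    list.push(handler);
    this.handlers.set(topic, list);
  }

  publish(topic: string, event: E): number {
    const list = this.handlers.get(topic) ?? [];
    // A real broker delivers asynchronously and can batch events;
    // batching raises throughput at some cost to per-event latency,
    // which is exactly the NFR trade-off discussed above.
    for (const handler of list) handler(event);
    return list.length; // how many consumers received the event
  }
}
```

A platform such as Apache Kafka adds what this sketch omits: durable, partitioned logs, consumer groups for horizontal scaling, and delivery guarantees, all of which feed directly into the performance and resiliency NFRs.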


12 CISO resolutions for 2022

“It’s important for our security teams to have visibility into all aspects of cloud applications, on-prem applications, network, services, systems, databases, accounts, third-party providers, etc. to help fortify our cybersecurity defenses,” Karki explains. “Having a complete, accurate and appropriately prioritized inventory of all our hardware, software, and supply chain assets enables our security teams to take a systematic approach to knowing what needs to be safeguarded, what controls to implement to protect, defend, and respond against any adverse events, and how to identify and produce metrics that tell the full story about our current security posture.” ... Although the complexity of that mesh has been growing for years, Van Horenbeeck says events during the past two years such as SolarWinds and Log4j have reinforced for him the criticality of understanding all the moving parts that make up his company’s technology ecosystem. To that end, Van Horenbeeck has invested in technology to gain a fuller understanding of his own company’s IT environment. 


New kids on the blockchain - or more of the same old?

Blockchain and Distributed Ledger Technology (DLT) have been on a downward swing in the hype cycle. The lack of clarity about why some data needs to be on a decentralised network at all remains, as does the suspicion that other ventures may just be offloading energy costs onto customers – no minor concern as prices soar. Meanwhile some recent NFT releases have made non-fungible tokens seem like a satire on the digital economy – a Situationist joke. But one area where blockchain may have useful applications is establishing a secure digital identity, according to a techUK seminar this week. The Zoom event brought together four DLT luminaries, from finance, government, agriculture, and digital identity itself. The intention was to challenge misconceptions and set out a viable route to market, according to Laura Foster, techUK's Programme Manager for Technology and Innovation. She then passed the chair to potentially the most interesting speaker, Genevieve Leveille, founder and CEO of AgriLedger – a distributed-ledger app for the farming supply chain – who chaired the session ably.


Enterprise architecture in the agile era: Less policing, more coaching

One principle of agile EA is not to boil the ocean by collecting every bit of information about an organisation before providing insights or recommendations, says Gordon Barnett, principal analyst at Forrester. To speed the process, agile EA practitioners aim for a “minimum viable architecture” or “just enough architecture” to meet an urgent business problem, making frequent changes to the EA process as needed. But, Barnett warns, the key is to choose the right elements to include so that such a minimal architecture doesn’t limit its future usefulness. For organisations that are heavily reliant on SaaS applications and the cloud, “a minimum viable architecture helps hold together that distributed ecosystem” with technology standards and more collaborative governance models, even if it doesn’t provide a central repository of the distributed assets that now support the business, Gartner’s Blosch adds. At SYNLABS, Jones began by concentrating on “the key pieces of information we needed to understand the business in terms of the application portfolio” and narrowed his search to, at most, “20 pieces of information about an application.”


Four Principles Every Organization Implementing Intelligent Automation Must Live By

Intelligent automation is a subset of artificial intelligence (AI). It is the computerization of processes traditionally carried out by people. Moving beyond current automation technologies (such as robotic process automation), intelligent automation replicates more complex processes — especially those that involve human decision-making. It gives organizations the opportunity to increase efficiency, improve customer experience and generate new revenues through automated digital products and services. But organizations that take to the sky with intelligent automation programs — without properly understanding the success factors — risk dropping quickly back down to Earth. As François Candelon, Rodolphe Charme di Carlo, Midas De Bondt and Theodoros Evgeniou wrote in Harvard Business Review, "For most of the past decade, public concerns about digital technology have focused on the potential abuse of personal data." But now, "attention is shifting to how data is used by the software."


Architecting for Resilience Panel

When you're new to the technology, there's obviously a strong pull towards questions like: how do we connect with our supply chain and CI/CD pipelines, and what gets deployed there? Where is the source of truth for resiliency configuration? Is it in my Git repos? Is it in a separate system? Where should I change what, and how? A lot of the challenges are around setting up those organizational processes in terms of who changes what, where, and how that gets approved. Ultimately, then it gets to Nora's world, which is: if things do go wrong, who's accountable? How do I recover? Who's alerted, and how quickly? One simple thing to nail home: every service should have a team that owns it. It should have an owner. It's a very simple concept, and you'd be surprised how often it isn't implemented. I've heard stories where something went down, people went down tracing it, and they ended up at a service of, who wrote this? This dude left three years ago.
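The panel's point that "every service should have an owner" can be enforced mechanically rather than left to memory. A minimal sketch follows; the catalog structure and the `name`/`owner` field names are hypothetical, standing in for whatever service registry an organization actually keeps:

```python
def find_unowned_services(catalog: list[dict]) -> list[str]:
    """Return the names of services missing an accountable owner.

    `catalog` is a hypothetical service registry: each entry is a dict
    with a "name" and, ideally, a non-empty "owner" field.
    """
    return [
        svc["name"]
        for svc in catalog
        if not svc.get("owner")  # missing, None, or empty string
    ]

catalog = [
    {"name": "payments", "owner": "team-billing"},
    {"name": "legacy-report", "owner": ""},  # the "this dude left" case
    {"name": "audit-log"},                   # no owner recorded at all
]

print(find_unowned_services(catalog))  # ['legacy-report', 'audit-log']
```

A check like this can run in CI or as a scheduled audit, so ownership gaps surface before an incident forces the question.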


Network from home: how data privacy and security responsibilities must be shared

Having entire organisations working remotely is a relatively new phenomenon, but traditional security advice remains very relevant. Remote workers should consider the technologies available to support their remote security needs, such as a password manager, which lets you use a different password for every account and rotate them often without having to remember them all – far safer than reusing a single password that, once compromised, would expose every account relying on it. Equally, employees can upgrade their applications and tools to improve their privacy posture. Search engines and browsers such as DuckDuckGo, Brave Browser and Ecosia give you more control over your privacy exposure and can help minimise the risk of attack and personal information loss. Network firewalls are another tool to consider: they can bring a home network closer to an office-grade environment by monitoring network traffic, blocking malicious websites and allowing you to moderate how others access resources.
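The "different password for every account" advice is exactly what a password manager automates. As a small illustration of the underlying idea (the account names are made up), Python's standard `secrets` module can generate a unique strong password per account:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a cryptographically random password, as a password
    manager does per site so that no two accounts share credentials."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account, roughly what a manager's vault stores:
vault = {site: generate_password() for site in ["bank", "email", "vpn"]}
```

The point of the sketch is the vault pattern: credentials are independent per site, so a breach of one service reveals nothing about the others.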


The new rules of succession planning

The problem with identifying top candidates often lies in how a short list is generated. Traditionally, the focus is on who the leader is without significant weight put on what skills he or she needs to deliver on the company’s strategy. If succession discussions are to be transformed into more of an upstream process for the board—and members are to have a clear understanding of what the company needs before discussing the best candidates—then the process must account for three distinct and entirely predictable challenges. Because they are predictable, these challenges can be anticipated and overcome. First, start with the what and not the who. Doing so will lay out a more realistic and substantive framework. Second, from this vantage point, try to explicitly minimize the noise in the boardroom. Ensure that the directors are using shared, contextual definitions of core jargon, such as strategy, agility, transformation, and execution. Third, root the follow-on analyses of the candidates in that shared understanding, and base any assessments on a factual evaluation of their track records and demonstrated potential in order to minimize the bias of the decision-makers themselves.



Quote for the day:

"Leadership is the art of giving people a platform for spreading ideas that work" -- Seth Godin