Daily Tech Digest - January 12, 2022

NIST Updates Cybersecurity Engineering Guidelines

NIST’s publication is a resource for computer engineers and other professionals on the programming side of cybersecurity efforts. “This publication addresses the engineering-driven perspective and actions necessary to develop more defensible and survivable systems, inclusive of the machine, physical, and human components that compose those systems and the capabilities and services delivered by those systems,” the document reads. Spanning over 200 pages, the publication takes a holistic approach to systems engineering. NIST researchers give an overview of the objectives and concepts of modern security systems, primarily regarding the protection of a system's digital assets. One of the key updates NIST authors made in the latest version of the publication was a fresh emphasis on security assurances. In software systems engineering, assurance is represented by the evidence that a given system’s security procedures are robust enough to mitigate asset loss and prevent cyber attacks. Ron Ross, an NIST fellow and one of the authors of the document, told Nextgov that system assurances act as justifications that a security system can operate effectively.


9 ways that cybersecurity may change in 2022

On the plus side, digital wallets can ensure the identity of the user in business or financial transactions, reduce fraud and identity theft, and shrink the cost and overhead for organizations that typically create physical methods of authentication. On the minus side, a person can be at risk if their mobile device is lost or stolen, a device without power due to an exhausted battery is of little use when trying to present your digital ID, and any digital verification that requires connectivity will fail if there's no cellular or Wi-Fi available. ... Shadow or zombie APIs pose a security risk, as they're typically hidden, unknown and unprotected by traditional security measures. More than 90% of attacks in 2022 will focus on APIs, according to Durand. And for organizations without the right type of API controls and security practices, these shadow APIs will become the weak link. ... Information technology and operational technology will collide as IT teams assume responsibility for the security of physical devices. This trend will require interoperability between IT and OT, leading to a convergence of technology to determine who can physically get in a building and who can access key applications.


First for software, agile is a boon to manufacturing

Overall, applying agile methodologies should be a priority for every manufacturer. For aerospace and defense companies, whose complex projects have typically followed the long time horizons of waterfall development, agile design and development are needed to propel the industry into the age of urban air mobility and the future of space exploration. ... Over the past decade, agile software development has focused on DevOps—”development and operations”— which creates the interdisciplinary teams and culture for application development. Likewise, design companies and product manufacturers have taken the lessons of agile and reintegrated them into the manufacturing life cycle. As a result, manufacturing now consists of small teams iterating on products, feeding real-world lessons back into the supply chain, and using software tools to speed collaboration. In the aerospace and defense industry, well known for the complexity of its products and systems, agile is delivering benefits.


Observability, AI And Context: Protecting APIs From Today's (And Tomorrow's) Attacks

Today's digital economy is built on a foundation of APIs that enable critical communications, making it possible to deliver a richer set of services faster to users. Unfortunately, today's security solutions focus on an outmoded way of thinking. Most organizations currently deploy security solutions and practices that revolve around network security, intrusion detection and mitigating application vulnerabilities. However, for modern API-driven applications that have become the de facto deployment model for applications that operate in the cloud, these traditional security practices simply do not scale to meet the challenges of today's organizations. Due to the incredible complexity of APIs, as well as the breadth and depth of their deployment across organizations, security and IT teams need to tackle this problem in a structured process that takes into account API application security best practices and procedures that constantly evaluate an organization's APIs, the level of their security posture, and the ability to automate remediation when those APIs are attacked.


2022 will be the year we all start talking about online data collection

From uncovering trends to conducting market research, there are countless reasons why businesses collect publicly available web data from their competitors. Though the competitors in question often also engage in data collection themselves, most will regularly block access attempts and make site changes to prevent their public data from being accessed, even though the information targeted is on public display. All this could be about to change. While it may seem counterintuitive – after all, why would you want to give away information to your competitors – some businesses are beginning to realise that it’s in their best interests to allow their public data to be collected by responsible, well-defined, and compliant data practitioners. Firstly, preventing data collection is like a game of whack-a-mole: When you block one tactic, smart practitioners will simply find another. Secondly, accepting some forms of data collection will enable businesses to accurately distinguish between organic user traffic and collector traffic, giving them a clearer insight into what data is being collected and by whom.


Omnichannel E-commerce Growth Increases API Security Risk

API-led connectivity overcomes obstacles that retailers face in gathering data from disparate systems and consolidating it into monolithic data warehouses. Since each individual system updates separately, information may be out-of-date by the time it hits the database. APIs enable retailers to build an application network that serves as a connectivity layer for data stores and assets in the cloud, on-premises or in hybrid environments. As a result, mobile applications, websites, IoT devices, CRM and ERP systems (order management, point of sale, inventory management and warehouse management) can all work as one coherent system that connects and shares data in real time. ... The downside to this rapid growth and development in e-commerce has been a concerning rise in API security attacks. Here, threat actors have executed numerous high-profile breaches against public-facing applications. For example, developers use APIs to connect resources like web registration forms to various backend systems. This flexibility, however, also creates an entry point for automated attacks.


Collaborative Governance Will Be The Driver of The API Economy

Most companies with API programs don’t have advanced API management tools, and they can only do a couple of releases a year from inception to production. Collaborative governance, with an automated platform, is the way to plug that gap from a business standpoint and help them get to market faster. A whole team would understand how APIs mature and prepare responses for the varying requirements. ... Collaborative governance democratizes the API building process as anybody in a team should be able to build, manage, and maintain APIs. Add a low-code, results-driven platform or AI-assisted development tools to the mix, and developers won’t always need to learn about new tools and technologies from scratch or interact with multiple parties. By centralizing ownership through version-controlled configuration, enterprises can avoid the disruption caused by manual errors or configuration changes and enable reusability. Time to production is also reduced due to continuous integration and delivery (CI/CD).


How AI helps essential businesses respond to climate change

Underpinning the AI-based forecast platform is a convolutional neural network (CNN) model. This extracts features from radar reflectivity and meteorological satellite images. This is supported by a trained machine-learning model, which is capable of performing highly accurate and close-to-real-time local weather forecasting in minutes. Meanwhile, a generative adversarial network (GAN) works to generate forecast images with exceptional clarity and detail. One of the benefits of this AI-based prediction model is that it outperforms the traditional physics-based model; for example, the Global/Regional Assimilation and PrEdiction System (GRAPES) requires hours to generate forecasting data, which is far behind the pace needed by organisations that must make near real-time decisions based on anticipated weather events. Some of the data is conveyed via high-resolution imagery with one-kilometre grid spacing, with updates every 10 minutes providing fresh insights, enabling real-time decisions to be made to plans or arrangements based on unfolding or predicted weather events.
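
As a rough illustration of the CNN piece of such a pipeline, the sketch below (assuming PyTorch; the class name, layer sizes, and the 4-frame, 128 x 128 input are invented for illustration and are not the platform's actual architecture) shows a small convolutional encoder turning a stack of recent radar-reflectivity frames into a feature map that a downstream forecasting head, or a GAN generator, could consume.

```python
# Minimal sketch, assuming PyTorch; layer sizes and input shape are illustrative only.
import torch
import torch.nn as nn

class RadarEncoder(nn.Module):
    def __init__(self, in_frames: int = 4, out_channels: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_frames, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),  # halve spatial resolution
            nn.Conv2d(32, out_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_frames, height, width) stack of recent radar scans
        return self.net(x)

# Example: 8 samples, 4 past frames on a 128 x 128 grid (roughly 1 km spacing)
frames = torch.randn(8, 4, 128, 128)
features = RadarEncoder()(frames)
print(features.shape)  # torch.Size([8, 64, 64, 64])
```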


Stargate gRPC: The Better Way to CQL

In 2008, Google developed, open-sourced, and released Protocol Buffers — a language-neutral mechanism for serializing structured data. In 2015, Google released gRPC (also open source) to incorporate Protocol Buffers into work to modernize Remote Procedure Call (RPC). gRPC has a couple of important performance characteristics. One is the improved data serialization, making data transit over the network much more efficient. The other is the use of HTTP/2, which enables bidirectional communication. As a result, there are four call types supported in gRPC: Unary calls; Client-side streaming calls; Server-side streaming calls; and Bidirectional calls, which are a composite of client-side and server-side streaming. Put all this together and you have a mechanism that is fast — very fast when compared to other HTTP-based APIs. gRPC message transmission can be 7x to 10x faster than traditional REST APIs. In other words, a solution based on gRPC could offer performance comparable to native drivers.
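
To make the four call types concrete, here is a minimal sketch of a gRPC server in Python (assuming the grpcio package; the service and method names are hypothetical). It registers one handler per call type and passes raw bytes instead of generated Protocol Buffer stubs, purely to keep the example self-contained; a real deployment, Stargate included, would define the messages in a .proto file and use the generated code.

```python
# Minimal sketch, assuming grpcio; service and method names are hypothetical.
from concurrent import futures
import grpc

def unary(request, context):                    # one request in, one response out
    return request.upper()

def client_stream(request_iterator, context):   # many requests in, one response out
    return b"".join(request_iterator)

def server_stream(request, context):            # one request in, many responses out
    for chunk in request.split(b" "):
        yield chunk

def bidi(request_iterator, context):            # streaming in both directions
    for req in request_iterator:
        yield req[::-1]

handlers = grpc.method_handlers_generic_handler(
    "demo.Echo",  # hypothetical service name
    {
        "Unary": grpc.unary_unary_rpc_method_handler(unary),
        "Upload": grpc.stream_unary_rpc_method_handler(client_stream),
        "Download": grpc.unary_stream_rpc_method_handler(server_stream),
        "Chat": grpc.stream_stream_rpc_method_handler(bidi),
    },
)

server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
server.add_generic_rpc_handlers((handlers,))
server.add_insecure_port("127.0.0.1:50051")
server.start()
server.wait_for_termination()
```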


2022 promises to be a challenging year for cybersecurity professionals

One thing the pandemic has demonstrated is an unprecedented shift in endpoints, workloads, and where data and applications reside. Today, the Federal workforce remains mostly remote and telework is being conducted over modern endpoints such as mobile devices and tablets, and the applications and productivity tools are now cloud-hosted solutions. To be effective, those additional endpoints and mobile devices need to be included in the Agency’s asset inventory, the devices need to be managed and validated for conformance with the Agency’s security policies, and the identities of the user and their device must be known and validated. Additionally, the applications that are cloud-hosted must be included in the zero-trust framework including being protected by strong, conditional access controls, effective vulnerability management and automated patch management processes. I am optimistic that we can make great strides towards improving cybersecurity in 2022, if we are smart and pragmatic about prioritization, risk management, and leveraging automation to help us work smarter not harder.



Quote for the day:

"Making those around you feel invisible is the opposite of leadership." --  Margaret Heffernan

Daily Tech Digest - January 11, 2022

4 healthcare cloud security recommendations for 2022

Under the Health Insurance Portability and Accountability Act (HIPAA), cloud service providers aren’t considered business associates, which are entities that use or disclose protected health information (PHI). Companies that perform services such as claims administration, quality assurance, benefits management, and billing qualify as business associates. That said, Chung encouraged healthcare organizations to push their CSPs to sign a business associate agreement, or BAA, to ensure that the provider assumes responsibility for safeguarding the organization’s PHI. “If a CSP is not willing to sign a BAA, then you have to ask yourself, ‘Do they treasure your data as much as you do?’” Chung said. “The BAA provides assurance to organizations that we protect their data, that we provide training to our employees, and that we store and process consumer data securely.” Healthcare’s traditional network perimeter no longer exists. Many physicians and nurses may work at multiple locations for the same institution, sometimes visiting several locations in one day, or clinical staff may conduct research at a nearby university.


How To Implement Efficient Test Automation In The Agile World

In traditional environments, we have predefined builds that can be weekly, fortnightly or sometimes even monthly. One of the reasons is that these deployments take time. The problem with this approach is that we have to wait for the predefined dates to get the bugs fixed or to get the new features implemented, so there is a delay. The second reason is that by the time testers finish testing and report bugs and defects, the programmers have moved on to different pieces of implementation and have less interest in resolving the bugs of the older application. This approach also delays the time for making the feature available in production. Builds and deployments are repetitive and sometimes boring tasks. ... Automating the testing behind the GUI is comparatively easier than automating the actual GUI. Another advantage is that irrespective of the UI changes, functionality remains intact. Even if some UI elements are changed, the functionality of the feature does not change. This technique mainly focuses on the business logic and rules.
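
As a small illustration of testing behind the GUI, here is a sketch assuming pytest; the discount rule and function names are hypothetical and invented for this example. The tests exercise the business rule directly, so they keep passing even when the surrounding UI elements change.

```python
# Minimal sketch, assuming pytest; the discount rule and names are hypothetical.
def apply_discount(total: float, is_member: bool) -> float:
    """Business rule under test: members get 10% off orders over 100."""
    if is_member and total > 100:
        return round(total * 0.9, 2)
    return total

def test_member_discount_applied():
    # Exercises the rule directly, with no UI involved
    assert apply_discount(150.0, is_member=True) == 135.0

def test_no_discount_below_threshold():
    assert apply_discount(80.0, is_member=True) == 80.0
```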


16 irresistible cloud innovations

The major public clouds and several database vendors have implemented planet-scale distributed databases with underpinnings such as data fabrics, redundant interconnects, and distributed consensus algorithms that enable them to work efficiently and with up to five 9’s reliability (99.999% uptime). Cloud-specific examples include Google Cloud Spanner (relational), Azure Cosmos DB (multi-model), Amazon DynamoDB (key-value and document), and Amazon Aurora (relational). Vendor examples include CockroachDB (relational), PlanetScale (relational), Fauna (relational/serverless), Neo4j (graph), MongoDB Atlas (document), DataStax Astra (wide-column), and Couchbase Cloud (document). ... Companies with large investments in data centers often want to extend their existing applications and services into the cloud rather than replace them with cloud services. All the major cloud vendors now offer ways to accomplish that, both by using specific hybrid services (for example, databases that can span data centers and clouds) and on-premises servers and edge cloud resources that connect to the public cloud, often called hybrid clouds.


Analytics transformation in wealth management

Early success stories are encouraging, but they are the exception rather than the rule. More often, firms have started the transformation journey but have faltered along the way. Common reasons include a lack of ownership at senior levels and budgetary or strategic restraints that prevent project teams from executing effectively. The challenges of transforming service models are significant but not insurmountable. Indeed, as analytics use cases become more pervasive, implementation at scale becomes more achievable. In the following paragraphs, we present five ingredients of an analytics-based transformation (Exhibit 3). These can be supported by strong leadership, a rigorous focus on outcomes, and a willingness to embrace new ways of working. Indeed, managers who execute effectively will get ahead of the competition and be much more adept in meeting client needs. ... Analytics-driven transformations are often restricted to narrow silos occupied by a few committed experts. As a result, applications fail to pick up enough momentum to make a real difference to performance.


Is Data Science a Dying Career?

Firstly, data science has never been about re-inventing the wheel or building highly complex algorithms. The role of a data scientist is to add value to an organization with data. And in most companies, only a very small portion of this involves building ML algorithms. Secondly, there will always be problems that cannot be solved by automated tools. These tools have a fixed set of algorithms you can pick from, and if you do find a problem that requires a combination of approaches to solve, you will need to do it manually. And although this doesn’t happen often, it still does — and as an organization, you need to hire people skilled enough to do this. Also, tools like DataRobot can’t do data pre-processing or any of the heavy lifting that comes before model building. As someone who has created data-driven solutions for startups and large companies alike, the situation is very different from what it’s like dealing with Kaggle datasets. There is no fixed problem. Usually, you have a dataset, and you are given a business problem. 


6 cloud security trends to watch for in 2022

More organizations are starting to fully adopt Infrastructure-as-Code (IaC) to create fully autonomous cloud-based environments. From a security perspective, ensuring that the supply chain from the code to production is protected and monitored is becoming an increasing concern for organizations. We are seeing tools in this space starting to mature, and new strategies are being implemented. For example, you can do things like pre-validation of configurations and architecture, ensuring your architecture and code are compliant and secured before it even moves to production. ... Multi-cloud strategies are here to stay – and many enterprises are picking technologies best suited for their platforms while also creating resilient architectures that utilize more than one cloud service provider. We will soon see this adoption model mature along with multi-cloud security practices and tools. Additionally, we see “multi-cloud” enveloping edge computing, which will continue to extend onto factory floors, as well as into branch offices and private data centers. 
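
As a rough sketch of what pre-validating configuration before production can look like, the example below (in Python; the resource schema, rule set, and names are hypothetical) blocks a proposed change when it violates simple policy checks. Real pipelines would typically rely on a policy engine or the cloud provider's own validators, but the shape of the gate is the same.

```python
# Minimal sketch; the resource schema and rules are hypothetical.
def validate(resources):
    """Return a list of findings; an empty list means the plan may proceed."""
    findings = []
    for r in resources:
        if r.get("type") == "object_storage" and not r.get("encryption_at_rest"):
            findings.append(f"{r['name']}: encryption at rest is disabled")
        if r.get("public_access"):
            findings.append(f"{r['name']}: public access should be off by default")
    return findings

proposed = [
    {"name": "logs-bucket", "type": "object_storage", "encryption_at_rest": False},
    {"name": "api-gateway", "type": "gateway", "public_access": True},
]
issues = validate(proposed)
if issues:
    raise SystemExit("Blocked before production:\n" + "\n".join(issues))
```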


How Low-Code Enables the Composable Enterprise

A composable enterprise aims to create an application architecture wherein enterprises can deliver various functions through composition (as against development), by leveraging packaged business capabilities (PBCs). Gartner estimates that by 2023, 30% of new applications will be delivered, priced, and consumed as libraries of packaged business capabilities, up from fewer than 5% in 2020. To be fair, this run-up to a composable enterprise is not a fresh-off-the-press revelation. Enterprises have been attempting to move from hardcore coding-based development to a more service-and-composition-oriented architecture over the last couple of decades, albeit only in pockets and not as fast as they would have wished. Composable enterprise has become the need of the hour. And this is being driven by the sense of urgency created by multi-faceted disruption across industries, coupled with technological advancements that make it possible for organizations to accomplish it at an enterprise scale.


10 Things Will Define the Digital Transformation in 2022

The reality of the digital shift is that consumers are no longer constrained by how far away something is: the item they want to buy, the service provider they want to engage, the employer they want to work for, the trainer they want to buff them up, or the concert or movie they want to watch. Or just about anything else they want to do. Technology is making once-physical interactions immersive digital experiences – sometimes complementing the physical world, and sometimes replacing the activities once done there. For businesses, this is both a threat and an opportunity – an undeniable dynamic driving the evolution of the connected economy. In retail. In grocery. In entertainment. In work. In banking. In just about everything — including many healthcare services. Proximity is no longer a barrier, and those who wish to make it a competitive advantage now have to one-up the digital alternatives that consumers find easier, more convenient and less wasteful of their precious time.


What is the role of the CTO?

The CTO role also entails effective management of risks, which are also changing all the time as the organisation innovates. Finding possible risks, and planning how to mitigate them as early as possible is particularly important for any digital transformation initiatives such as cloud migration. ...  “However, it is inevitable that as part of that migration there will be some component misconfigured, a vulnerability uncovered in a new technology, or a human error that introduces an unintended path to access a system. The CTO should understand the possible impacts a breach in a specific application could have to the business as a starting point, and then assess a difficult question – how likely is that risk to be realised? “As CTO, you must consider all the surrounding process and infrastructure needed to mitigate the security risks of an initiative. Are the assumptions you are making about the capabilities of third party vendors, and your own security organisation, accurate today and in the future? Perhaps the ROI won’t be quite as high if this is fleshed out in detail upfront, but that will be a far better result than being caught flat-footed after a production roll-out.”


How China's Algorithm Regulation Affects Businesses

Algorithm-powered recommendation services offer relevant suggestions for users based on their history of choices and are popularly used by video streaming services, e-commerce companies and dating apps. The CAC's regulation, however, is not confined to just search results or personalized recommendation algorithms that push e-commerce products. It also applies to dispatching and decision-making algorithms that are used by transport and delivery services and to generative or synthetic-type algorithms used in gaming and virtual environments, says Ed Sander, China tech and policy expert and co-founder of the technology and business website ChinaTalk, in a blog post. Companies that use algorithm-based recommendations are required to disclose service rules for algorithmic recommendations and periodically review, assess and verify algorithm mechanisms, according to the regulation. The new regulation also says that companies must ensure that their algorithmic models do not induce users to "become addicted, spend large amounts or indulge in activities that go against good public customs."



Quote for the day:

"A leader should demonstrate his thoughts and opinions through his actions, not through his words." -- Jack Weatherford

Daily Tech Digest - January 10, 2022

The positive side of ransomware for data transformation

The new architecture should be more responsive and interface with enterprise data via a smart API that can automate the processes by which data is fetched, merged, transformed, secured, and delivered, all without users having to put in a request to the data team. It must work with application data generated outside the data warehouse and programmatically combine it with data from either side of the wall, as well as with sources external to the organization. It must also be fully programmable, eliminating the need to predetermine exactly what data is going to be combined with what data. The “smarts” of this smart API means users can make data requests that may seem simple to them but that set off complex automated processes that deliver data within seconds or minutes, not days or weeks. One of the important functions of this API must be to perform automated backups on an optimized and context-aware schedule, including rapid backups to the cloud as well as slower backups to physical media in-house or remote. Such a system will enable enterprises to say, “No, thanks” to cybercriminals demanding payment to undo their ransomware’s damage on a company’s data.
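
To make the idea of an optimized, context-aware backup schedule slightly more concrete, here is a minimal sketch; the tiers, thresholds, and function name are hypothetical and only illustrate the shape of such logic.

```python
# Minimal sketch; tiers and thresholds are hypothetical.
from datetime import timedelta

def backup_interval(changes_per_hour: int, criticality: str) -> timedelta:
    """Pick a backup cadence from how fast the data changes and how critical it is."""
    if criticality == "high" or changes_per_hour > 1000:
        return timedelta(minutes=15)   # rapid snapshots to the cloud
    if changes_per_hour > 100:
        return timedelta(hours=4)
    return timedelta(days=1)           # slower backup to in-house or remote media

print(backup_interval(5000, "normal"))  # 0:15:00
print(backup_interval(20, "normal"))    # 1 day, 0:00:00
```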


How COVID-19 made zero trust the right approach to modernise networks

Zero trust networks operate under a principle of least privilege, meaning that all programmes, processes, devices, or users are limited to the minimum privilege required to carry out their functions. The best analogy is to think of it like the government or military’s “need-to-know” policy. Access rights don’t need to be too restrictive as they are balanced against privileges that can range from ‘full access’ to ‘no rights at all’, depending on the conditions. Naturally, as remote working became the norm, these protocols made the most sense for dispersed networking environments. Many organisations, though, may have quickly found that traditional detection and prevention technologies were insufficient to mitigate the threats posed by opportunistic actors on remote environments. While information about how previous attacks were carried out provided some context, it could only go so far in this new, highly distributed and decentralised way of working. The key for many was to plan for the worst and assume a breach had occurred.
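
As a small illustration of least privilege combined with conditional access, here is a sketch; the roles, resources, and the device-trust condition are hypothetical stand-ins for whatever attributes a real zero trust policy engine would evaluate per request.

```python
# Minimal sketch; the policy table and request attributes are hypothetical.
POLICY = {
    ("payroll-clerk", "payroll-db"): {"read"},
    ("hr-manager", "payroll-db"): {"read", "write"},
}

def is_allowed(role: str, resource: str, action: str, device_trusted: bool) -> bool:
    if not device_trusted:  # condition: unmanaged devices get no rights at all
        return False
    return action in POLICY.get((role, resource), set())  # default deny

print(is_allowed("payroll-clerk", "payroll-db", "write", device_trusted=True))  # False
print(is_allowed("hr-manager", "payroll-db", "write", device_trusted=True))     # True
```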


Analyst predictions 2022: The future of data management

There’s a huge gap there in practice and knowledge. Also, to a lesser extent, there’s a technology gap, basically in the self-service technologies that will help teams govern data through the full life cycle: from selecting the data, to building the pipelines, to determining access control, to looking at quality and whether the data is fresh or trending off course. So my prediction is that it will receive its first harsh scrutiny this year. You are going to see some organizations and enterprises declare premature victory when they build some federated query implementations. You’re going to see vendors start to “data mesh-wash” their products, be it a pipelining tool, ELT [extract, load, transform], a catalog or a federated query tool. Vendors will be promoting how they support data mesh. Hopefully nobody’s going to call themselves a data mesh tool because data mesh is not a technology. We’re going to see one other thing come out of this. And this harks back to the metadata that Sanjeev was talking about and the data catalog.


How Artificial Intelligence is Changing the Payment Gateway Industry

Artificial intelligence and machine learning in the payment industry can significantly improve fraud detection. An AI-powered payment gateway looks at a range of factors and provides a risk score. For example, suppose a merchant has a good record. It will be assigned a low risk score. But a time zone, location, or IP address that has not been seen before will get a higher risk score. Several other factors are taken into account. Then, it is determined if the transaction is fraudulent. This makes it easier to analyze data. AI can go through a larger volume of data and detect even unexpected anomalies. A financial institution or payment provider using AI can process real-time transactions. Technology is continually evolving; it is finding new advanced ways to differentiate between fraudulent and real transactions. Thus, AI provides the payment gateway industry with the ability to process a vast number of transactions with accuracy and lower error rates.
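
As a toy illustration of this kind of risk scoring, the sketch below combines a few signals into a score; the features, weights, and review threshold are invented for illustration, and a production gateway would use trained models over far richer data.

```python
# Minimal sketch; features, weights, and the review threshold are hypothetical.
def risk_score(txn: dict) -> float:
    score = 0.0
    if not txn["merchant_good_history"]:
        score += 0.4
    if txn["new_ip_address"]:
        score += 0.3
    if txn["unusual_time_zone_or_location"]:
        score += 0.2
    if txn["amount"] > 1000:
        score += 0.1
    return min(score, 1.0)

txn = {"merchant_good_history": True, "new_ip_address": True,
       "unusual_time_zone_or_location": False, "amount": 250}
print("flag for review" if risk_score(txn) >= 0.5 else "approve")  # approve
```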


Eight resolutions to help navigate the new hybrid office model

The cloud and “on demand” models have become hugely important for helping users access the applications they need to do work from anywhere. While this shift to the cloud has its productivity benefits, it has not come without its security challenges. It is important to remember that cloud environments are not automatically secure when they are first created. Securing them requires knowledge and time. To keep business safe, security controls must span all environments – providing 360-degree application protection for both the application surface and the cloud application infrastructure. 5. Know your suppliers. The SolarWinds vulnerability highlighted the need for companies to thoroughly evaluate the tools and services they integrate into their operations. This includes the careful installation and configuration of the product or service, tracking patches and new releases from the vendor, and monitoring for any suspicious behavior. In a highly sensitive environment, some companies may choose not to use third-party products or services.


Avoiding the two-tier workforce in the third workplace

Most crucial of all, though, Farque believes, is for leaders and managers to consciously set the tone and role-model the behaviour they expect to see among their staff. Because dealing with hybrid teams equitably is not part of the usual management training rulebook, however, she advises investing in additional learning and development here. Mildon, on the other hand, believes there are lots of “little, everyday bits of inclusion” that can make a big difference too. These include simple things, such as ensuring that when remote employees dial into a meeting where others are physically present, they are invited to speak first. Alternatively, everyone could attend the event virtually. Digital tools, such as whiteboards and instant messaging, are also useful in enabling team members to communicate, collaborate and keep each other in the loop. But while there are lots of tools on the market that can help here, as Mildon points out, “It’s not necessarily about the technology.”


Why digital maturity is key to harnessing the Digital Twin

It’s a mistake to think that implementing a Digital Twin is a huge undertaking that will require the overhaul of technologies and processes. Fundamentally, so long as the business has data, and that data is captured, stored in an information management system, and understood by all, then you have the right foundation to get started. However, ultimately, a Digital Twin can only be successful if it is fed by the right data. And that is not only down to information management systems – it requires the workforce being able to read, understand, create, and communicate data as information. In other words, they need to be data literate. With the UK on the verge of a digital skills disaster, however, some business leaders may find that they’ll need to upskill their employees in this area. This will require the full backing of the C-level and a top-down leadership approach to encourage a data-driven culture that’s engrained across the company. As the saying goes, you can have all the gear but no idea – and organisations that upskill their employees in data will be the ones capable of bringing advanced concepts such as Digital Twins to fruition and reaping the benefits.


How Database Virtualization Could Break Vendor Lock-in

While the concept of database virtualization has started attracting attention and several products have become available, there is still a lot of skepticism and confusion. To be fair, the notion of virtualization has been used loosely in the context of databases. From data virtualization to the abstraction of database files, the label virtualization is used liberally. The concept of data virtualization has been around for decades and may warrant a clean delineation. Data virtualization is the idea of using a universal query language across all applications. While this seems beneficial in the long run, it requires all applications to be rewritten to use this artificial language. This does not solve the original problem: In order to overcome the vendor lock-in, the enterprise has to, well, overcome the vendor lock-in in the first place? Put differently, data virtualization is certainly aspirational, but it does not solve the actual problem of moving existing applications between databases.


2022 is shaping up to be an epic fight to protect data

COVID forever changed the way we work. Even in the highly unlikely event that the pandemic winds down this year, many people will keep working from home at least part of the time. Cybercriminals will be working, too. They'll be hunting for new ways to take advantage of the connections and devices that workers use to dial in remotely. NYU's Cappos says the cybersecurity industry will likely get a better handle on how to manage hybrid work situations, introducing new recommendations and products that boost security and make it easier for workers to connect. Consumers will also need to up their security game, Clay says. Good methods of two-factor authentication, such as biometrics and push notifications, are going to be a must. Simpler verification methods, like codes sent as SMS messages, just can't be trusted anymore. That goes for smartphones, too. Phishing, the practice of sending deceptive emails in order to get personal information, is going mobile. Similar attempts using SMS, known unimaginatively as smishing, and voice calls, which are called – you guessed it – vishing, will become more common this year as people move more of their online activity to mobile devices, Clay says.


Top tips to protect online networks in schools from cyber attacks

There’s a certain appeal that comes with free tools, especially if you have been looking at an alternative that is otherwise expensive to purchase. It’s easy to look at the free alternative and acknowledge that it isn’t exactly what you wanted, but it will do the trick. Avoid the temptation at all costs on a school set-up. An appealing free online tool could be malware or a virus in disguise, built to wreak havoc as soon as it has been downloaded. A lot of these come with adware, which isn’t particularly dangerous but clogs up and slows your system and can push dangerous websites. Finally, there is the danger that a downloaded free tool will unleash spyware on your system, which integrates into your system and steals valuable data. ... Updating a school’s Active Directory is quite possibly the most basic, yet laborious, task on this list. Regardless, its importance is undeniable. This article has largely been about preventing online hackers, scammers, and cyber attackers from getting into your system. However, in some cases, those attacks can come from within, namely from current and former students.



Quote for the day:

"Great leaders are almost always great simplifiers who can cut through argument, debate and doubt to offer a solution everybody can understand." -- General Colin Powell

Daily Tech Digest - January 09, 2022

Observability: How AI will enhance the world of monitoring and management

Observability is based on Control Theory, according to Richard Whitehead, the chief evangelist at observability platform developer, Moogsoft. The idea is that with enough quality data at their disposal, AI-empowered technicians can observe how one system reacts to another, or at the very least, infer the state of a system based on its inputs and outputs. The problem is that observability is viewed in different contexts between, say, DevOps and IT. While IT has worked fairly well by linking application performance monitoring (APM) with infrastructure performance monitoring (IPM), emerging DevOps models, with their rapid change rates, are chafing under the slow pace of data ingestion. By unleashing AI on granular data feeds, however, both IT and DevOps will be able to quickly discern the hidden patterns that characterize quickly evolving data environments. This means observability is one of the central functions in emerging AIOps and MLOps platforms that promise to push data systems and applications management into hyperdrive. 


The Fourth Industrial Revolution will be people powered

While there is a common perception that digitization and automation are a threat to the world’s workers, companies at the forefront of the technology frontier have actually created jobs—different, new roles that are much more high tech than the roles of the past. And with the current labor mismatch being felt in many countries, the time is now to further engage workers for a digitally enabled future. ... This focus is backed by growing research proving that workforce engagement is key. Over the last several years, research with the World Economic Forum, in collaboration with McKinsey, surveyed thousands of manufacturing sites on their way to digitizing operations and identified about 90 leaders. These are the lighthouses—sites and supply chains chosen by an independent panel of experts for leadership in creating dramatic improvements with technology. Together they create the Global Lighthouse Network, committed to sharing what they’ve learned along the way.


FarmSense uses sensors and machine learning to bug-proof crops

The impact of this technology is clear. For farmers tending to fields large and small, real-time information on insects would not only be important for their financial security, but would also allow them to potentially conserve and protect critical resources, such as soil health. But FarmSense claims it wants to empower rural farmers, who it says are disproportionately impacted by the damage caused by insects. Yet $300 per sensor per season is stiff, posing a potential risk to adoption and, thus, to the tech’s ability to even solve the issue of insect damage in the first place. One of the most difficult things for small-scale farmers is managing risk, said Michael Carter, the director of the USDA-funded Feed the Future Innovation Lab for Markets, Risk, and Resilience and distinguished professor of agricultural and resource economics at UC Davis. “Risk can keep people poor. It disincentivizes investment in technologies that would raise income on average, because the future is unknown,” Carter said. “People with low wealth obviously don’t have a lot of savings, but they can’t risk the savings to invest in something that might improve their income that also might cause their family to starve.”


Why the road to stakeholder capitalism begins with diverse boards

While the pandemic cast the notion of stakeholder capitalism in sharp relief, part of this conversation points to historic shifts from tangible to intangible assets. In the past, markets and investors measured company value using conventional financial yardsticks developed for asset-intensive businesses. This approach, however, no longer captures the full value picture – both in terms of risk and opportunity – because today it is often a company’s intangibles that are the real drivers of value. Consider certain tech and software companies: what are their significant hard assets? The growing acceptance of ESG across the financial sector shows that investors are beginning to recognize the intangibles upon which we all depend. This newfound acknowledgment and pricing in of integral things that matter – including natural, human, and social capital – points to the interconnectedness of stakeholders and the need for boards to reflect that broader view. ... This change in mindset is not insignificant. 


Hackers Have Been Sending Malware-Filled USB Sticks to U.S. Companies Disguised as Presents

While it might seem ridiculous that anyone would plug a random USB stick into their computer, studies have shown that, actually, that’s exactly what a whole lot of people do when confronted with the opportunity. Thus the popularity of the “drop” trick, in which a malicious drive is left in a company’s parking lot in the hopes that the weakest link at the firm will pick it up and, out of curiosity, plug it into their laptop. Actually, if you believe one high-ranking defense official, a disastrous, worm-fueled attack on the Pentagon in 2008 was launched just this way. Hackers have also attempted to use USBs as a vector for ransomware attacks before. Last September, it was reported that gangs had been approaching employees of particular companies and attempting to bribe them into unleashing ransomware on their company’s servers via sticks secured by the hackers. All of this is a roundabout way of saying a few basic things: Don’t accept gifts from strangers, avoid bribes, and, if you don’t know where that USB stick came from, better leave it alone.


GM says Qualcomm’s computer chips will power its next-gen ‘hands-free’ driving mode

GM first announced Ultra Cruise during an investor event last year, describing it as a massive leap over the company’s Super Cruise system, which allows for hands-free driving on mapped, divided highways. In contrast, Ultra Cruise will cover “95 percent” of driving scenarios on 2 million miles of roads in the US, the company claimed. “We’re attempting to have this feature be sort of a door-to-door driverless operation,” said Jason Ditman, chief engineer at GM, in an interview with The Verge. “When the vehicle gets onto a capable road, Ultra Cruise will automatically engage and handle the majority of the work, hands-free. Stop signs, stoplights, turns, splits, merges, freeways, subdivision... all of those domains.” That’s thanks to Qualcomm’s new high-powered processors, Ditman said. Last year, Qualcomm entered into an agreement with GM to provide computer chips for the automaker’s next generation of electric vehicles. When it comes out in 2023, the Cadillac Celestiq will be one of the first vehicles to feature the chipmaker’s new ADAS platform, which includes Qualcomm’s Snapdragon SA8540P system-on-a-chip and SA9000P artificial intelligence accelerator.


France opens access to quantum computing to researchers, start-ups

The aim is to make this technology accessible to as many people as possible, including the scientific community and French and EU start-ups. The intent is to ensure France does not miss out on the major advances quantum computing could make in the decades to come. The platform will be installed at the High-Performance Computing Centre at the French Atomic Energy Commission (CEA). “By mid-2022, we will open a procedure […] for the purchase of two to three quantum hardware machines that are integrated into the platform,” said O, adding that two other calls for tender are planned over the next three years. The platform has a total budget of €170 million and is part of the €1.8 billion national quantum strategy, inaugurated on 21 January 2021 by President Emmanuel Macron, who is keen to make this technology a major issue for France’s sovereignty, strategic superiority and independence. According to Paris, France wants to become one of the world’s leading powers in the field, but the intention is not to sideline its European neighbours.


The real value of 5G and cloud computing

At the essence of cloud and 5G is that we can leverage cloud-based resources mixed with enterprise resources. Let’s face facts: We’re heading to an enterprise IT future of multicloud meets hybrid cloud meets edge computing meets complex and dynamic applications that run anywhere and everywhere. Thus, the future becomes less about the cloud and more about new and emerging ways to leverage all technology, cloud or not, that will be widely distributed and complex. The automation of 5G allows for the orchestration of systems in different network domains, be it your phone, desktop, TV, an enterprise server in a datacenter, or a public cloud provider. We’re quickly moving to not caring about where something runs but needing that something to migrate automatically to optimize how it runs and scales, cloud or no cloud. Also, this won’t happen unless we automate security and network provisioning as well. The bottom line is that 5G coupled with cloud computing can bring much more computing power to many more people and their companies.


Putting stakeholder capitalism into practice

The real issue is the trade-offs between short-termism and long-termism. Eighty percent of CFOs tell us in surveys that they would reduce discretionary spending on potentially high-NPV [net present value] activities like R&D and marketing to achieve short-term earnings targets. They are literally sacrificing the long term for the short term. Yet research shows companies that think long-term—meaning five to seven years ahead—substantially outperform, achieving 47 percent higher revenue growth over a 15-year period, for example. Stakeholder and shareholder interests do align in the long term. If you have happy employees, collaborative suppliers, satisfied regulators, and devoted consumers, then they will help you deliver higher benefits over a longer-term period. It is hard to satisfy everybody in the short term; you may have to make trade-offs, for example, between purpose and profit. But in the long term, we don’t believe this trade-off exists. The crystalizing concept here is purpose-driven ESG. Companies wondering how to deliver on the long-term stakeholder goals should start by asking the questions, “What is our purpose? What would the world lose if our company disappeared?”


AI is quietly eating up the world’s workforce with job automation

AI is also automating jobs in customer service, accounting, and a host of other professions. For instance, companies like Thankful, Yext, and Forethought use AI to automate customer support. This shift is often imperceptible to the customer, who doesn’t know if they’re speaking to a biological intelligence or a machine. The rise of AI-powered customer service has big implications for the workforce. It’s estimated that 85 percent of customer interactions are already handled without human interaction. According to the Bureau of Labor Statistics, there are nearly 3 million customer service representatives employed in the United States. Many of these jobs are at risk of being replaced by AI. When jobs like these are automated away, the question is: Where do the displaced workers go? The answer is not clear. It’s possible that many of these workers will be re-employed in other fields. But it’s also possible that they will become unemployed, and that the economy will struggle to absorb them.



Quote for the day:

"Little value comes out of the belief that people will respond progressively better by treating them progressively worse." -- Eric Harvey

Daily Tech Digest - January 08, 2022

E-Waste Is a Cybersecurity Problem, Too

Technology is so ubiquitous that this is a societal problem we all have to reckon with. It’s much more serious than just affecting your family or your company. This is a problem of international magnitude that has homeland security risks around it. That’s why we wrote the book: The vast majority of our clients still were not listening. They just wanted us for environmental work but they weren’t really sold on the hardware data destruction part of the work yet. We wanted to write this book to share some examples of serious consequences—that this isn’t some remote, theoretical concern. ... What happens is that guy will pick up the devices for free, put them in a container, and sell them wholesale to the highest bidder. Lots of those buyers are harvesting the precious metals and materials out of old electronics — but there are also people hostile to homeland security who want to pull out the hard drives and find a way to harm us here in the U.S. or hold corporate data for ransom. From those examples you can see how you need to protect your financial and personal data on an individual level too.


Rethinking Cybersecurity Jobs as a Vocation Instead of a Profession

Within the cybersecurity industry, the prevailing mindset is that security practitioners are professionals. Thus, a direct consequence of this mindset is that a college degree is required for many cybersecurity jobs. A recent (ISC2) report indicates that 86% of the current cybersecurity workforce has a bachelor's degree or higher. Furthermore, a quick search on Indeed.com shows about 46,000 cybersecurity jobs, of which 33,000 (>70%) require a degree. However, many cybersecurity practitioners I know would rightfully argue that a college degree isn't needed to do most jobs in cybersecurity, and strict adherence to this requirement disqualifies many deserving candidates. But removing the requirement for a college degree raises the question: Are these actually professional jobs, or should they be recast as vocational jobs? I would argue that these jobs may need to be seen as vocations instead of professions. Although many cybersecurity workers take pride in their professional status, many of their jobs are really vocational in nature and could be filled by those with the appropriate level of vocational training.


AI could repair the damage done by data overload

To enable true knowledge collaboration and connect employees with the information they require, we must start using the data we have in organizations to draw conclusions, at scale. In doing so, we can connect people with questions to the right colleague(s) with the answer(s). Artificial intelligence has two additional important qualities that help businesses achieve this and overcome the issues with legacy knowledge management to date. First, AI can be taught to forget. This means that not only can AI identify who knows what about a topic, but it can also contextualize that information and recognize when information becomes outdated and redundant, meaning it can ‘forget’ unuseful data as needed. Second, using non-sensitive information drawn from existing tools, AI is able to see through silos. It can use all kinds of information to draw conclusions at scale, creating in one integrated platform a live map or ‘knowledge network’ of who knows what within an organization. In short, using data, AI can build a network of knowledge and expertise in real time. 


The Best Free RAW Photo Editing Programs in 2022

If you know Lightroom, you will have no problem navigating darktable. Like GIMP, darktable is also open-source. New functionality is added regularly, which only increases the appeal. While by no means beginner software, darktable has an interface that is sneakily slick for a program with this much power under the hood. Adjusting contrast, brightness and saturation is a breeze, handled by simple sliders. The same can be said for achieving perfect shadows and highlights, modifying the graduated density of your image, or adding grain. Do not be fooled, though: just beyond those simple controls lies a wealth of robust tools for more advanced users ... RawTherapee is an open-source cross-platform photo editor that offers a non-destructive, 32-bit engine and utilizes powerful algorithms to help you develop the highest quality image possible. If GIMP is Photoshop, think of RawTherapee as Lightroom. While more useful as a processing tool in conjunction with another editing application, RawTherapee is still a perfectly functional editor in its own right, offering several features familiar to Photoshop users.


Ethereum Devs Dismiss JPMorgan Warning About End of DeFi Dominance

Ethereum developers, for instance, have consistently stressed security over speed while making sure the network doesn't have any downtime. By contrast, the Solana network shut down for almost 18 hours in September because it was unable to handle high transaction volumes. Kline told Decrypt, "At the end of the day, chain security is incredibly important for financial transactions and for the foreseeable future Ethereum has the most security.” According to Kline, DeFi projects on other blockchains are "heavily driven by token incentives," meaning that people receive tokens that they can then trade or sell as a reward for participating. "Once Ethereum layer 2 adopts those same incentives, we are likely to see a lot more DeFi activity on Ethereum," she said. But the head of public affairs for Parity, which built Polkadot, believes developers are getting tired of waiting for Ethereum 2.0 to be fully ready. "The Ethereum sharding roadmap has changed so many times it is difficult to understand what is actually going to happen and when," said Peter Mauric. 


AI in closed-loop manufacturing can benefit edge computing systems: 4 things to consider in IIoT

Binghamton University Professor Sang Won Yoon explained this in detail: "With the rapid technology development, such as the Industrial Internet of Things, big data analysis, cloud computing, artificial intelligence, many manufacturing processes can be more intelligent, and Industry 4.0 can then be realized in the near future … . Data-driven solutions, such as AI and machine-learning algorithms, can be applied to diagnose abnormal defects and adjust optimal machine parameters in response to unexpected changes/situations during production. Smart manufacturing adopts real-time decision-making based on operational and inspectional data and integrates the entire manufacturing process as a 'unified framework.'" ... Imagine a series of closed-loop systems distributed at the enterprise edge that can "run themselves" in a closed environment, much like a mini-network. This could reduce present resource stressors, like challenges in managing and paying for large data payloads that continuously stream over communications lines to data centers and clouds.
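
For a hedged sense of what data-driven defect diagnosis could look like at the edge, here is a minimal sketch using scikit-learn on synthetic sensor readings; real deployments would train on historical process data and feed the results back into machine parameters.

```python
# Minimal sketch; sensor features and values are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Pretend history of normal operation: temperature (C) and vibration (mm/s)
normal = rng.normal(loc=[70.0, 1.2], scale=[1.5, 0.05], size=(500, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

readings = np.array([[70.5, 1.21],   # typical reading
                     [83.0, 2.10]])  # abnormal spike worth flagging
print(model.predict(readings))       # [ 1 -1]  (-1 marks the anomaly)
```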


Openshift Vs. Kubernetes: What Are The Differences

Like so many other technologies, what is better for some companies is not better for others. Both platforms are excellent for building, deploying, and managing containerized applications. Kubernetes is great for intensive-use apps that require regular updates, like games. OpenShift may be the right option for security-strict, GDPR-compliant, heavy-duty apps like those of institutional or governmental character, or healthcare. Self-hosted Kubernetes is more complex to install, manage, and monitor without third-party integrations. OpenShift seems to be an easier option to manage with its many built-in features, but it is limited to Red Hat Linux distributions. At the core, OpenShift is built on a Kubernetes layer but brings additional features that make it a different flavor of container orchestration. Enterprises can benefit from the dedicated support provided by an OpenShift subscription. Still, Kubernetes may be the best option if companies have a skilled container orchestration team, avoiding subscription costs. Kubernetes and OpenShift are two excellent options. Do you know which suits your project best?


Kronos outage latest: Attackers crippled backup access

Multiple Kronos platforms have been unavailable since December 11. The outage has left millions of users at tens of thousands of customers unable to check pay, arrange rotas, or request paid leave. The issue has bedevilled IT teams globally, who’ve been forced to spend time in early 2022 supporting their companies with Excel-based workarounds provided by UKG and dealing with other related HR/payroll issues. In the US public sector alone, the New York Metropolitan Transportation Authority, the City of Cleveland, the state of West Virginia, the Oregon Department of Transportation, the University of California system, and Honolulu’s EMS and Board of Water Supply, along with scores of smaller local authorities, have been affected. ... Given these previous claims, many customers have been asking why restoration is taking so long. Asked why it was taking so long to restore customer data, the company said that it “employs a variety of redundant systems and disaster recovery protocols. In addition to several redundant data centers, UKG Kronos Private Cloud environments are backed up on a weekly basis, as well as on a daily basis with the delta from the previous day.”


AI, ML bubble to the top in new beverage-ordering solution

A lot of the technology we’re using at Bees is at the bleeding edge of machine-learning research, which requires us to build advanced and custom machine learning systems. Out-of-the-box models and autoML systems like DataRobot are fantastic at democratizing access to machine learning and making it easy and inexpensive to deploy but are not well suited for places where a higher-performing model matters. Bees operates in 13 distinct markets, selling a complex product and customer portfolio, against a changing backdrop of shifting consumer preferences, price elasticity, and supply-chain shocks exacerbated in a post-COVID-19 macro landscape. For the use cases we’re tackling with the Bees team, the incremental impact of algorithmic selling is so significant, that it more than justifies the development and fine-tuning of advanced active learning models. That being said, we are huge fans of open source ML tooling and are power users of many of the biggest frameworks – e.g., PyTorch, Scikit-Learn, Pandas, etc. – pushing these tools as far as they can take us and filling in the gaps ourselves whenever it is necessary.


Why CIOs Should Report to CISOs

Digital transformation has put IT front and center in nearly every organization, which has made the job of protecting the infrastructure much more complicated. The growing importance of data as the lifeblood of business, the fundamental shifts in infrastructure with the emphasis on cloud and mobile computing, and the resulting target adjustments by cybercriminals and nation-state attackers have moved the spotlight away from the network. So, who should be in charge? Can cybersecurity responsibility be split between the CIO and the CISO? Can they somehow share security duties? No, not effectively. To borrow the old phrase about starting quarterbacks in football, if you have two security chiefs, you really have no security chief. It’s time for businesses and other organizations to seriously consider having their CIO report to the CISO. ... The IT infrastructure, and more specifically the lack of visibility into it, is the biggest weak spot in enterprise security. We’ve gotten to a point where attackers know a company’s network better than the security professionals tasked with protecting it.



Quote for the day:

"Personal leadership is the process of keeping your vision and values before you and aligning your life to be congruent with them." -- Stephen R. Covey

Daily Tech Digest - January 07, 2022

Top 8 cybersecurity predictions for 2022

As organizations face new regulations, there will be a demand for resident compliance officers to help navigate the complex and evolving dictates. “Compliance officers will certainly rise on the recruiters' agenda as regulatory bodies mandate there be a 'single throat to choke,'" says Liz Miller, an analyst at Constellation Research. “Although, that’s exactly the opposite of what we need. What we do need are skilled tacticians and strategists who can be trusted voices and leaders within (and across) an organization, capable of translating the complexity of new (and constantly shifting) global regulations into real business value for everyone from the CEO to the mailroom.” ... With a major chunk of the global workforce forced to work from home by the pandemic, there has been an upsurge in the usage of software that allows supervisors to monitor employees at all times. This has upset the remote working ecosystem to some extent, escalating employee distress. “Tattleware (also bossware) will degrade employee experience by 5% and increase insider threats in 2022,” according to a security prediction report by Forrester.


Attackers Exploit Flaw in Google Docs’ Comments Feature

The campaign appears to signify a ramp-up in attacks exploiting the Comments feature of Google’s collaboration apps for malicious ends – attacks that will likely continue if left unchecked, researchers said. June was the first time Avanan researchers identified threat actors hosting phishing attacks from within Google Docs, delivering malicious links aimed at stealing victims’ credentials. At the time, they identified it as a novel exploit of the app. Then, in October, as previously mentioned, researchers identified threat actors exploiting the Comments feature for the first time, followed by December’s flurry of attacks, which were reported to Google on Jan. 3 “using the resulting phishing via email through Google’s built-in tools,” Fuchs wrote. Avanan recommends that users cross-reference the email address in a Google Docs comment to ensure it’s legitimate before clicking on the comment. They also recommend standard “cyber hygiene” when reviewing comments, including scrutinizing links and inspecting grammar, according to the report.
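
The "scrutinize links" advice can be partly automated. The sketch below extracts URLs from a comment and flags any whose domain is not on a trusted allowlist; the allowlist entries are invented examples, not Avanan's guidance verbatim.

# Flag links in a comment whose domain is not on a trusted allowlist (illustrative only).
import re
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"docs.google.com", "drive.google.com", "example.com"}

def suspicious_links(comment_text):
    urls = re.findall(r"https?://\S+", comment_text)
    flagged = []
    for url in urls:
        host = (urlparse(url).hostname or "").lower()
        trusted = host in TRUSTED_DOMAINS or host.endswith(tuple("." + d for d in TRUSTED_DOMAINS))
        if not trusted:
            flagged.append(url)
    return flagged

print(suspicious_links("Review https://docs.google.com/d/abc and http://payro11-update.example.net/login"))
# -> ['http://payro11-update.example.net/login']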


Cybersecurity training isn't working. And hacking attacks are only getting worse

Part of the problem is that, for many businesses, cybersecurity isn't ingrained into everyday operations and employees are only asked to think about it when doing annual cybersecurity training -- leaving companies at risk from cyberattacks the rest of the year. "I think one of the most important things to realise is most of the education and training done, it's not very effective," Stuart E. Madnick, professor of information technology and engineering systems at MIT Sloan Executive Education, told ZDNet Security Update. "The 30-minute video you're obligated to watch once a year doesn't do the job." According to Madnick -- who has been at MIT since 1972 and has served as the head of MIT's Information Technologies Group for more than 20 years -- organisations need to build a culture of cybersecurity that actively involves everyone. If people have a greater understanding of how their organisation falling victim to a cyberattack could affect them, it could lead to everyone being more careful when it comes to cybersecurity.


How to Build Scalable Data and AI Industrial IoT Solutions in Manufacturing

The importance and growth of the IIoT technology stack can’t be overstated. Validated by several leading research firms, IIoT is expected to grow at a CAGR of more than 16% through 2027 to reach $263 billion globally. Numerous industry processes are driving this growth, such as automation, process optimization and networking, with a strong focus on machine-to-machine communication, big data analytics and machine learning (ML) delivering quality, throughput and uptime benefits to the aerospace, automotive, energy, healthcare, manufacturing and retail markets. Real-time data from sensors helps industrial edge devices and enterprise infrastructure make real-time decisions, resulting in better products, more agile production infrastructure, reduced supply chain risk and quicker time to market. IIoT applications, as part of the broader Industry X.0 paradigm, connect industrial assets to enterprise information systems, business processes and the people at the heart of running the business.
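
As a concrete, if simplified, example of an edge device acting on real-time sensor data, the sketch below keeps a rolling window of readings and flags the asset when the recent average drifts past a threshold. The sensor, window size, and threshold are invented for illustration.

# Rolling-window check on simulated sensor readings (illustrative only).
from collections import deque

class VibrationMonitor:
    def __init__(self, window=50, threshold=4.0):
        self.readings = deque(maxlen=window)
        self.threshold = threshold

    def add(self, value):
        """Record one reading; return True once the full-window mean exceeds the threshold."""
        self.readings.append(value)
        mean = sum(self.readings) / len(self.readings)
        return len(self.readings) == self.readings.maxlen and mean > self.threshold

sensor_stream = [3.2] * 80 + [5.8] * 80   # simulated readings: normal, then elevated
monitor = VibrationMonitor()
for i, reading in enumerate(sensor_stream):
    if monitor.add(reading):
        print(f"reading {i}: rolling average above threshold - flag for maintenance")
        break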


FinTech in 2022: the Big Regression?

The next few years will see a Big Regression. It started in China when the $300 billion IPO of Ant Group was dismantled. Now it is clearly happening in Europe and America. What is the Big Regression? It is the reversal of technological progress. It’s not that people will reject technology; it's governments and regulations that will. In fact, it is likely that FinTech will be affected by the Big Tech fallout. As governments crack down on the FATBAGs, they will also have a magnifying glass on the FinTech unicorns and their brethren. They may like some of it but, specifically in FinTech, the Big Regression will be a thing as regulators dismantle some of the unregulated markets that they now want to regulate. We’ve seen it already with P2P lending stalling and, in some markets, failing. By way of example, Zopa has just given up P2P lending. Then we see it in crowdfunding, where Kickstarter is finding it hard to kick or start. And we can see governments getting more and more of a squeeze on everything from BNPL to cryptocurrency exchanges and bitcoin mining.


Steering innovation toward the public good

Innovation should be shaped in ways that help America win the 21st century, with transportation systems and infrastructure that make communities more competitive, adaptable, and resilient. Our innovation strategy must support workers, knowing that our choices will help to define whether any given technological development meets its potential to create economic benefits for all. A good innovation strategy allows for experimentation and learns from setbacks, because these are indispensable parts of the scientific method that underpin all invention and discovery. Our approach to innovation should center on opportunities to collaborate, recognizing the distinct but related roles of the public, private, and academic sectors. And finally, our policies should be flexible and adapt as technology changes, because we can’t prepare for an evolving future with policy that only makes sense under present or past conditions. These principles will help us to ensure that the enormous potential of U.S. transportation innovation benefits our nation and its people. 


How is the role of the CTO evolving?

The modern CTO must instead be bold enough to become an early adopter — at least much of the time. Goldman Sachs and Bank of America are two brands that took the early adoption route with blockchain technology. Those brands and others have plenty of valid reasons for doing so. For example, according to one study, blockchain could reduce operational costs at banks by more than a quarter. Tristan Jervis, co-leader of the global technology team at Russell Reynolds Associates, also notes that bringing new technologies into the picture means taking a customer-centric view. He explains: “Business models are shifting to adapt to the needs of a more tech-centric, collaborative and agile organisation — and the role of technology leadership is evolving in tandem. “To make this happen, technology officer roles are moving from a supportive function to becoming key enablers of a company’s transformation agenda and strategy. In [business to consumer] and [business to business] companies alike, this means bringing the customer into the centre of everything they do, and understanding the entire customer lifecycle from front-end to back-end, rather than sitting in technology or digital silos.”


What Is NEAR Protocol? The Sharded Layer 1 Blockchain Explained

NEAR Protocol’s approach to solving the Blockchain Trilemma is to implement a horizontal scaling feature called sharding. It works by splitting a blockchain node network into smaller partitions known as “shards.” Each shard holds its own data and can process transactions in parallel with the other shards, increasing the overall throughput of the network. Thanks to this sharding technology, NEAR claims it can process up to 100,000 transactions per second, outpacing rival smart contract platforms by some distance. While sharding is not unique to NEAR Protocol (Ethereum also plans to implement sharding on the completion of Ethereum 2.0), NEAR has built on the concept by developing dynamic re-sharding. On NEAR, instead of the number of shards being static, the network regularly adjusts the number of shards it supports based on user demand. By dynamically adjusting the number of shards, NEAR can reduce the average cost of using the network while maintaining the potential for high transaction throughput.
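
The sketch below is a toy illustration of the two ideas in this passage - hashing accounts to shards and growing the shard count when demand rises - not NEAR's actual Nightshade implementation; the capacity figure and doubling rule are invented.

# Toy shard assignment and demand-driven re-sharding (illustrative only).
import hashlib

def shard_for(account_id, num_shards):
    digest = hashlib.sha256(account_id.encode()).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

def maybe_reshard(num_shards, pending_tx, per_shard_capacity=1000):
    # Double the shard count while the average per-shard load exceeds capacity.
    while pending_tx / num_shards > per_shard_capacity:
        num_shards *= 2
    return num_shards

shards = maybe_reshard(num_shards=4, pending_tx=10_000)   # -> 16 shards
print(shards, shard_for("alice.near", shards))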


4 Robotic Process Automation (RPA) trends to watch in 2022

“Intelligent automation” has been an on-again, off-again buzz phrase in the RPA world. It’s on again, perhaps permanently. The term usually refers to a mix of technologies, including RPA, low-code and no-code development tools, and AI/ML. It also implicitly acknowledges that RPA on its own is not “smart” – it can’t learn on its own (as some ML models can) or adjust to things like UI changes without human intervention. Intelligent automation is often an aspirational vision of how more basic forms of process automation can complement more advanced cognitive technologies, and vice versa. To this point, the potential has outpaced the reality. The RPA market has been quite competitive rather than community-oriented, as has the broader automation and AI industry. But true intelligent automation will by definition require a collaborative approach, according to Jon Knisley, principal consultant for process and automation excellence at FortressIQ. Knisley sees an increased focus on the need for a cooperative intelligent ecosystem in 2022.


Getting Rid of Wastes and Impediments in Software Development Using Data Science

Impediments and wastes have the following relationship: a waste could be an impediment, or could result in an impediment. An outdated manual, for instance, is not an impediment per se, but it could lead to rework caused by misdirection. On the other hand, waiting for some management approval is an impediment and a waste at the same time. ... The result was a waste model that contained 22 wastes grouped into six categories, and a map that demonstrated the relationships among the wastes. The categories included product, process, information and communication, and organization. The subsequent deliverables were a method and software that helped with the detection of wastes. Based on the information provided by the software, it is possible to understand what is going on, see which wastes were found, and investigate their root causes. The next phase was the creation of a method that included an algorithm and software that implemented that algorithm.
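
The article does not detail the algorithm itself, so the sketch below only illustrates the kind of signal such software might surface: flagging "waiting" waste by measuring how long work items sit in an approval state, using pandas. The column names and threshold are invented.

# Flag long approval waits as "waiting" waste from workflow timestamps (illustrative only).
import pandas as pd

tickets = pd.DataFrame({
    "ticket": ["T-1", "T-2", "T-3"],
    "submitted_for_approval": pd.to_datetime(["2022-01-03", "2022-01-04", "2022-01-05"]),
    "approved": pd.to_datetime(["2022-01-04", "2022-01-12", "2022-01-06"]),
})

tickets["wait_days"] = (tickets["approved"] - tickets["submitted_for_approval"]).dt.days
waiting_waste = tickets[tickets["wait_days"] > 5]   # arbitrary threshold for illustration
print(waiting_waste[["ticket", "wait_days"]])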



Quote for the day:

"A leader does not deserve the name unless he is willing occasionally to stand alone." -- Henry A. Kissinger