Daily Tech Digest - May 25, 2021

What are the ingredients of digital transformation success?

Finding the right tech talent is a pressing issue for executives, and a new study finds that the right talent is hard to come by regardless of how successful firms are with their enterprise modernization efforts. Successful firms recruit, invest in and retain knowledgeable staff (71%) and work with trusted partners (76%) to compensate for whatever skill and culture gaps exist within their organization, according to the report, "Secrets of Successful Digital Transformation," by Forrester and global software consultancy ThoughtWorks. ... Decision-makers at successful organizations reported that a true cross-functional transformation process involves stakeholders from all parts of the organization, such as IT, business and finance, in the modernization initiatives, according to the report. "An effective modernization culture and strategy must include strong leadership, including support and guidance from executives and, perhaps most importantly, a dedicated budget to execute transformations," the report said. It also requires a monetary commitment: in fact, 71% of successful organizations fund their enterprise modernization programs through a dedicated digital transformation budget.


Staying Safe Online: 6 Threats, 9 Tips, & 1 Infographic

Sharing pictures of major life events or everyday moments on social media may seem fairly innocent. However, you should probably be more careful: everyone has access to that information. Skilled cybercriminals have no trouble tracking down your relationships and other details about your life. They may use what they find to trick your friends into giving up sensitive information. It’s not hard to find out dates of birth, email addresses, interests, and details about family members, which makes it even easier for hackers to break into your account (see the first tip to avoid this!). ... Sometimes, website content may seem too appealing not to visit. You might even go ahead and create a profile, sharing your personal information. You should be careful, though, because not all websites are safe places. Who knows what malicious programs and scams are hidden there? Before doing anything, make sure you check the website address. URLs beginning with “https” are safer than ones with “http” because the “s” stands for “secure”: traffic to and from the site is encrypted. Another thing to look out for is a small lock sign near the URL, which web browsers display when the connection to a website is encrypted.
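For readers curious what that lock actually checks, here is a minimal Python sketch (standard library only; "example.com" is a placeholder hostname) that verifies a site's TLS certificate in roughly the same way a browser does before showing the padlock:

```python
import socket
import ssl

# Verify that a host speaks TLS with a certificate that chains to a
# trusted root and matches the hostname - roughly what the browser's
# lock icon attests to. "example.com" is a placeholder.
hostname = "example.com"
context = ssl.create_default_context()  # enables cert + hostname checks

with socket.create_connection((hostname, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("TLS version:", tls.version())
        print("Certificate subject:", tls.getpeercert()["subject"])
```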


Return to Office Risks Worth Considering

"Organizations should have adjusted their business continuity and disaster recovery plans to account for the shift to remote work at the onset of the pandemic," said John Beattie, principal consultant at business continuity solution provider Sungard Availability Services. "These plans need to be readjusted again to account for employees being back in the office and any changes made to the IT environments as a result." Failing to tighten cybersecurity protocols upon the return to the workplace could leave networks vulnerable to cyberattacks and breaches. Additionally, failing to update the business contingency and recovery plans and failing to provide employees notice of plan changes could lead to outages or the inability to promptly act on contingency plans when the time comes, Beattie said. ... Ger Doyle, head of Manpower IT brand Experis and head of digital and innovation at ManpowerGroup, warns that companies moving toward a new, hybrid way of working must be careful to avoid a two-speed workplace in which those in the office get access to opportunities that work-from-home employees miss.


Five ways to use data to make better business decisions

Even though data might be digitized, it still may not be relevant for decision-making. If the data isn't valuable, it should be considered for elimination. Deciding which data to keep is a balancing act. There is data that isn't important today but could become valuable at a later date. However, there is other data (IoT 'jitter', for example, or memos about a company holiday party 20 years ago) that most likely won't ever be relevant to decision-making and should be eliminated. Master data management frequently focuses on normalizing or consolidating disparate data fields from different systems that refer to the same piece of information. However, there is also the need to aggregate unlike types of data, such as aggregating a weather report with photos or videos of a storm system. Data aggregation is most successful when business use cases are clearly identified, along with all of the data and data combinations that are needed for decision making. With the growth of citizen development and separate IT budgets in business user departments, it's more difficult for IT to know where pockets of underexploited data might reside, and how to bring these data troves into a central data repository so that everyone in the business can use them.
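As a toy illustration of the field-consolidation step described above (not from the article), here is a minimal pandas sketch that maps two systems' differently named fields onto one canonical schema; all table and column names are invented:

```python
import pandas as pd

# Two systems describe the same customer attributes under different names.
crm = pd.DataFrame({"cust_name": ["Acme"], "zip": ["10001"]})
billing = pd.DataFrame({"customer": ["Acme"], "postal_code": ["10001"]})

# Master data management step: map every source field to one canonical name.
canonical = {
    "cust_name": "customer_name",
    "customer": "customer_name",
    "zip": "postal_code",
}
merged = pd.concat([df.rename(columns=canonical) for df in (crm, billing)])
print(merged)  # one consistent schema: customer_name, postal_code
```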


Dubai’s DMCC opens Crypto Centre to tap into blockchain's potential

The Dubai Multi Commodities Centre has set up a new space that will house companies developing crypto and blockchain technology. The Crypto Centre is the result of a partnership with Switzerland’s CV Labs, the organisation behind the Swiss government-backed Crypto Valley. It is part of the free zone’s own Crypto Valley – an ecosystem for cryptographic, blockchain and distributed ledger technology entities in the UAE. “This is a fantastic new development. Crypto and blockchain technology has enormous potential to transform global trade and supply chains ... and this aligns perfectly with the DMCC’s vision to drive the future of trade,” said Ahmad Hamza, free zone executive director at the DMCC. “Over the next few weeks and months, we will see this centre filled with [companies] ... looking to scale up their crypto businesses,” he said. He did not disclose the number of entities that DMCC expects to attract to the centre. The DMCC, which presides over companies involved in the trade of commodities that range from pulses to diamonds, registered 2,050 new companies last year, a five-year high for the free zone. 


Nikola Tesla: 5G Network Could Realize His Dream of Wireless Electricity

The experiments used new types of antenna to facilitate wireless charging. In the laboratory, the researchers were able to beam 5G power over a relatively short distance of just over 2 meters, but they expect that a future version of their device will be able to transmit 6μW (6 millionths of a watt) at a distance of 180 meters. To put that into context, common Internet of Things (IoT) devices consume around 5μW—but only when in their deepest sleep mode. Of course, IoT devices will require less and less power to run as clever algorithms and more efficient electronics are developed, but 6μW is still a very small amount of power. That means, for the time being at least, that 5G wireless power is unlikely to be practical for charging your mobile phone as you go about your day. But it could charge or power IoT devices, like sensors and alarms, which are expected to become widespread in the future. In factories, for instance, hundreds of IoT sensors are likely to be used to monitor conditions in warehouses, to predict failures in machinery, or to track the movement of parts along a production line. 
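A back-of-the-envelope calculation shows why 6μW suits only heavily duty-cycled devices; the active-draw and duty-cycle figures below are illustrative assumptions, not numbers from the research:

```python
# Can 6 uW of harvested 5G power sustain a duty-cycled IoT sensor?
harvested_w = 6e-6          # power beamed to the device (per the article)
sleep_w = 5e-6              # draw in deepest sleep mode (per the article)
active_w = 10e-3            # assumed draw during a brief sensing burst
duty = 0.0001               # assumed: active 0.01% of the time

avg_draw_w = active_w * duty + sleep_w * (1 - duty)
print(f"average draw: {avg_draw_w * 1e6:.2f} uW")  # ~6.00 uW
print("sustainable:", avg_draw_w <= harvested_w)   # True, just barely
```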


Understanding AI cloud

The most compelling advantages of AI cloud are the challenges it addresses. It democratises AI, making it more accessible. By lowering adoption costs and facilitating co-creation and innovation, it drives AI-powered transformation for enterprises. The cloud is veritably becoming a force multiplier for AI, making AI-driven insights available to everyone. Besides, though cloud computing technology is now far more prevalent than the use of AI itself, we can safely assume that AI will make cloud computing significantly more effective. AI-driven initiatives, providing strategic inputs for decision-making, are backed by the cloud’s flexibility, agility, and scale to power such intelligence massively. The cloud dramatically increases the scope and sphere of influence of AI, beginning with the user enterprise itself and then in the larger marketplace. In fact, AI and the cloud will feed off each other, helping the true potential of AI to flower through the cloud. The pace of this will depend only on the AI expertise that enterprises can bring to bear in their workplace activities, for the cloud is already here and seeping into everything.


Navigating the benefits and challenges of network and security transformation

Obtaining any necessary board-level buy-in for transformation projects is only half the battle. Any significant change project will require consideration of the organisational arrangement and availability of specialist skills. The Netskope/Censuswide research found that 50% of global CIOs believe that a lack of collaboration between specialist teams is stopping them from realising the benefits of digital transformation projects. For context, assuming that 50% of CIOs are responsible for 50% of the $6.8 trillion digital transformation spend IDC predicts, then we are looking at a situation where a spend equivalent to the entire annual US tax income is in jeopardy because teams are failing to work together effectively. ... The researchers discovered that while just under half of security and networking teams report to the same boss, 37% of participants stated that ‘the security and networking teams don’t really work together much’. In fact, nearly half of the networking and security professionals described the relationship between the two teams as ‘combative’, ‘dysfunctional’, ‘frosty’ or ‘irrelevant’. They all agree that this imperfect relationship has the potential to derail huge plans.


Real estate tech takes on the housing boom in this seller's market

"Proptech is most important in cities where there is a large transient population, places that have a strong presence of universities, hospitals and a strong job market," said Blum. "This includes major cities like Philadelphia, Boston, San Francisco, NYC, Houston, Chicago, Miami. These cities have already begun to rebound quickly. I've spoken to agents from major cities across the country and they all say the same thing that anything that can help save them time with their business is greatly welcomed." The new platform Localize "harnesses the power of AI [artificial intelligence] to provide a cutting-edge experience for homebuyers and brokers," explained Omer Granot, Localize president and COO. "[W]e streamline the house-hunting journey through" property insights and "our concierge texting service, Hunter by Localize." Hunter curates properties specifically for each homebuyer through its "Smart Matching technology." It uses more than 100 data insights that are associated with a listing as well as a homebuyer's specific preferences to send daily recommendations to prospective buyers "to find them the perfect home. 


Ethical Decisions in a Wicked World: the Role of Technologists, Entrepreneurs, and Organizations

"Wicked problem" is a term introduced by the theorists Rittel and Webber (1973) to describe problems that cannot be definitively described, with no "solutions" in the sense of definitive and objective answers. It is also understood as a super-category of "complexity", problems that overwhelm us in some sense. There is also a class of "super-wicked" problems: climate change, poverty, food security, energy supply, education policy and public health. They all have many interdependent factors making them seem impossible to solve. The software industry faces wicked problems in different ways: by developing complex software systems and by managing them as part of a larger social, economic, and environmental fabric. Wicked problems always existed in our industry, but the internet and globalization undoubtedly created conditions for new forms of interaction, thus expanding the universe of related wicked problems. Examples of wicked problems closely associated with software are: social networks, sharing economy platforms, and air traffic control. In business, a new strategy (e.g. re-branding) or a modification in a product (e.g. introducing a new version of a video game) are classic examples.



Quote for the day:

"No great manager or leader ever fell from heaven, its learned not inherited." -- Tom Northup

Daily Tech Digest - May 24, 2021

How can banks mitigate the risks of consumers’ poor cyber hygiene practices?

To successfully implement adaptive authentication, banks and financial institutions need robust risk analytics – a sphere in which AI is playing an increasingly large role. This is no surprise, given that the threats to banks are becoming more sophisticated, with the emergence of attacks-as-a-service, automated attack tools, and close collaboration amongst bad actors enabling fraud at an unprecedented scale. An AI-powered decision engine and machine learning model can continuously analyse a broad range of data, events and context. Rather than simply inspecting login and transaction data, they look at a whole variety of indicators of compromise and learn from them. These include malicious headers, referrers from a phishing site, malicious cookies, a malicious device or IP, inhuman speed, keyboard overlay, a debugger running and many more. Based on the risk level of each user action, a smart risk analytics solution can generate a score and provide a recommended next step in real time – enabling banks to remain proactive, rather than reactive. So, with the complexity of attacks growing and fraudsters’ sophistication evolving on an almost daily basis, it’s clear that users cannot and should not be expected to keep up.
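A heavily simplified sketch of that scoring idea (not any vendor's actual engine): weighted indicators of compromise combine into a score that maps to a recommended next step. The indicator names, weights, and thresholds are invented for illustration:

```python
# Illustrative weights for a few of the indicators named above.
WEIGHTS = {
    "phishing_referrer": 0.4,   # request arrived via a phishing site
    "malicious_ip": 0.3,        # source IP on a threat-intel blocklist
    "inhuman_speed": 0.2,       # form filled faster than a human could
    "debugger_running": 0.1,    # tampering tool detected on the client
}

def risk_score(signals):
    """Sum the weights of every indicator observed for this user action."""
    return sum(w for name, w in WEIGHTS.items() if signals.get(name))

def next_step(score):
    if score >= 0.6:
        return "block"         # high risk: deny the action outright
    if score >= 0.3:
        return "step_up_auth"  # medium risk: challenge with a second factor
    return "allow"             # low risk: keep the experience frictionless

print(next_step(risk_score({"malicious_ip": True, "inhuman_speed": True})))
# -> "step_up_auth" (score 0.5)
```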


The Future State of the Cloud

If findings from a decade’s worth of research are predictive, these technologies and tools are the ones that may experience increased adoption: Configuration management tools: Growth of configuration management tools is on the rise, though this varies among tools; many organizations now use more than one. Back in the early days of this research, Chef and Puppet ruled the roost, each peaking as high as the 50% adoption mark among enterprises in the 2019 report. It was at this point that we started to see increased experimentation with Ansible and Terraform, each now adopted by more than one-third of all enterprise respondents. This coincides with Puppet and Chef experiencing significant decreases in adoption, with even fewer organizations planning on using and/or experimenting with these tools; Platform-as-a-Service (PaaS): Recently, there has been continued experimentation with and increased adoption of public cloud PaaS services. These include data analytics, artificial intelligence and machine learning (AI/ML), and the internet of things (IoT). ... Increasingly, today’s industry relies on services such as these, which are becoming standard parts of operations.


The RPA world desperately needs standards

The absence of RPA design standards capable of detailing process automations in a universally understood manner is also a major contributor to stalled automation pipelines. Look, for example, at process discovery tools, a key component of any automation toolchain. Without RPA standards that would assure compatibility and interoperability, process discovery tools detail discovered processes in different ways. This leaves RPA users with little choice but to transcribe processes manually before they can ever start to be developed and deployed in target automation platforms. As a result, automations stall, more money has to be spent, and more time is wasted. Growing awareness of these standardization issues, coupled with the inability of RPA to scale or deliver on anticipated ROI, is causing many companies to rethink additional automation investments. ... To better understand the kind of incredible impact industry standards can have, look at the example provided by the Portable Document Format, or PDF. After the PDF was released as an open standard by Adobe, the ability not only to save a PDF in any word processor, but also to open it in another tool suddenly unlocked a level of portability that previously had been impossible to attain.


5 Strategies to Infuse D&I into Your Organization

The CEO needs to take a public stance, embed D&I in the organization’s purpose, exemplify the culture, and take responsibility for progress toward goals. They need to be out front, even if a CDO is part of the team. PwC’s U.S. chairman, Tim Ryan, has been an exemplar for at least five years. He co-founded CEO Action for Diversity and Inclusion after police shootings in the summer of 2016 to spur business executives to collective action on D&I. The publication of PwC’s workforce diversity data in 2020 revealed that women and people of color are underrepresented, especially at senior levels, showing that even the most dedicated companies still have a lot of D&I work to do. Nielsen’s CEO, David Kenny, added the CDO title to his leadership portfolio in 2018 so he could “set hard targets for ourselves and make those transparent to our board and measure them like we measure other outcomes like financial results.” He relinquished that title to a new CDO in March 2020, noting the D&I progress his team had already made. If you’re a board member, you have an essential role to play in D&I governance.


Explainable AI (XAI) with SHAP - regression problem

Model explainability has become a basic part of the machine learning pipeline. Keeping a machine learning model as a “black box” is not an option anymore. Luckily, there are tools that are evolving rapidly and becoming more popular. This is a practical guide to XAI analysis using the SHAP open-source Python package for a regression problem. SHAP (Shapley Additive Explanations) by Lundberg and Lee is a method to explain individual predictions, based on the game-theoretically optimal Shapley values. Shapley values are a widely used approach from cooperative game theory that comes with desirable properties. The feature values of a data instance act as players in a coalition. The Shapley value is the average marginal contribution of a feature value across all possible coalitions. In this guide we will use the Boston house prices dataset from sklearn datasets. It is a simple regression problem. ... The SHAP framework has proved to be an important advancement in the field of machine learning model interpretation. SHAP combines several existing methods to create an intuitive, theoretically sound approach to explain predictions for any model.
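A minimal sketch of the workflow the guide describes, assuming the shap and scikit-learn packages are installed (load_boston shipped with scikit-learn in 2021 but has since been removed; substitute another regression dataset, e.g. fetch_california_housing, on newer versions):

```python
import shap
from sklearn.datasets import load_boston
from sklearn.ensemble import RandomForestRegressor

boston = load_boston()
X, y = boston.data, boston.target
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes exact Shapley values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # one row of contributions per house

# Global view: which features push predicted prices up or down overall.
shap.summary_plot(shap_values, X, feature_names=boston.feature_names)

# Local view: why the model priced the first house as it did
# (in a notebook, call shap.initjs() first).
shap.force_plot(explainer.expected_value, shap_values[0], X[0])
```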


Super-Secure Processor Thwarts Hackers by Turning a Computer Into a Puzzle

To stop attacks, Morpheus randomizes these implementation details to turn the system into a puzzle that hackers must solve before conducting security exploits. From one Morpheus machine to another, details like the commands the processor executes or the format of program data change in random ways. Because this happens at the microarchitecture level, software running on the processor is unaffected. A skilled hacker could reverse-engineer a Morpheus machine in as little as a few hours, if given the chance. To counter this, Morpheus also changes the microarchitecture every few hundred milliseconds. Thus, not only do attackers have to reverse-engineer the microarchitecture, but they have to do it very fast. With Morpheus, a hacker is confronted with a computer that has never been seen before and will never be seen again. To conduct a security exploit, hackers use vulnerabilities in software to get inside a device. Once inside, they graft their malware onto the device. Malware is designed to infect the host device to steal sensitive data or spy on users. The typical approach to computer security is to fix individual software vulnerabilities to keep hackers out.


Cybersecurity is Now Essential to Corporate Strategy. Here's How to Bring the Two Together.

Compliance is not security. This is an essential difference to understand. Compliance is about checking the same processes to meet some pre-established requirements and procedures. Security is about continually monitoring for new and unexpected vulnerabilities. The best way to think of this important difference is as though there is an (ideally) impenetrable net covering every component of your business. Compliance checks the state of that net at a moment in time and from an established list of criteria, but it isn’t checking for a continually growing set of new threats that are not yet on the list. Security requires ongoing vigilance for unexpected vulnerabilities. It’s very much a real-time and continuous effort. When it comes to cybersecurity planning, the lesson for businesses is that following established processes is not enough. It’s about anticipating what could happen or what could possibly go wrong. Security is like an ongoing and engaged state of being — it needs active and ongoing vigilance and maintenance to remain operational and be ready to pivot when the unexpected happens.


Can Your Enterprise Benefit from No-Code AI?

There are many ways no-code AI can be used in businesses, including small businesses looking to find ways to embrace the power of automation. Here are just a few examples of how no-code AI is impacting different industries. Several financial services firms have started incorporating no-code AI into their workflows to improve security and provide an enhanced customer experience. By using no-code AI, the entire customer experience can be streamlined. Let’s take an example of a loan application. Using no-code AI, financial services teams can build an ML model to quickly scan loan applications and determine which ones meet the required criteria. The underwriting team now has more time to focus on approved applicants instead of spending all their time sifting through applications. As different teams need new ML models to improve their processes, they can use a no-code AI platform to create them. This makes their operations more efficient because they no longer need to wait for their IT team or data scientists to develop a new model every time a need arises.


Blockchain, when and why to use it in business processes

Intrinsic disintermediation and the crystallization of the traceability of a transferred asset are among the most innovative features of blockchain technology, which has and will have increasing impacts on the evolution of social and organizational models, as well as positive impacts in terms of technological process innovation. Service providers can interface with the blockchain to offer advanced functionality to users, for example through API integration services. ... Blockchain makes it possible to track when and by whom a given change was made, which is why blockchain technology is spreading in all scenarios where it is required to ensure traceability and authenticity for a product or service, such as the agri-food supply chain. In addition, another widespread application is the notarization or crystallization of data on the blockchain, which ties the data to a certain date. Another application on which various projects and concrete initiatives have focused is that of smart contracts, i.e. the automatic activation, based on distributed ledger software technologies, of contracts between private individuals upon the occurrence of certain events or conditions predefined by two or more parties.


AI is no villain: six steps to build your AI strategy

During AI transformation projects, companies often make the mistake of separating the vision from the execution, resulting in disjointed and complicated AI programs that can take years to consolidate. This can be easily avoided by choosing AI solutions based on concrete business objectives that have been established at the project’s outset. It’s important to align your corporate strategy with measurable goals and objectives to guide your AI deployment. Once complete, the strategy can easily be cascaded down into divisional- or even product-level strategies. ... Identify the real problem; don’t assume it is an AI problem. This might seem like common sense, but the problems you’re looking to overcome have a large impact on your success. Some problems are not AI problems at all, and for the ones that are, the business should advocate delivery through small lighthouse projects that act as a beacon for its capabilities. In identifying ‘lighthouse’ projects, your business will need to assess the overall goal and importance of the project, its size, likely duration and data quality. Lighthouse projects can typically be delivered in under eight weeks, instead of eight months, and will provide an immediate and tangible benefit for the business and your customers.



Quote for the day:

"Time is neutral and does not change things. With courage and initiative, leaders change things." -- Jesse Jackson

Daily Tech Digest - May 23, 2021

Qualcomm reveals tiny Linux-driven 5G NR chipset for IoT

The 315 5G chipset offers up to a 1.54Gbps data rate for 5G (3GPP Rel 15), while the 4G mode goes to 400Mbps. Other features include antenna tuning support and a dual-frequency GNSS location capability. The combination of 7nm technology, Cortex-A7, and an efficient RF front-end design enables up to 50 percent smaller modules than existing models, claims Qualcomm. Vanghi also touted the chipset for its low power consumption and extended life maintenance through 2028 to 2030. The Qualcomm 315 5G IoT Modem-RF “can be easily fitted onto industrial machines,” said Vanghi. “You can bolt it directly onto the chassis using existing holes.” The small size will also make it easy for wireless module manufacturers to upgrade existing 4G modules, said Vanghi, mentioning support for 35 x 40mm module footprints. “The 315 is a pin-to-pin compatible solution for LTE legacy modules,” he added. Vanghi noted that the chipset has all the security features of Qualcomm’s premium 5G chipsets for smartphones, which would include the Snapdragon X55 5G Modem-RF System. Security features include hardware-based cryptography, TrustZone, Qualcomm TEE, secure boot, secure storage and key provisioning, and debug security.


Keeping Technology Change Human

When users can accurately predict their efficiency with a tool, even when that tool itself is inefficient, they can strongly resist a change. I worked on an inflight commerce system, and our solution required a series of reconciliation steps to be taken at the end of a flight. The crew instinctively know how long this process takes through repetition of the process, and they set aside that time - at what is generally a very stressful point in the flight. Coming in to land is when everyone suddenly wants to be out of their seat! Changes to the software (and hence the process) around reconciliation were always difficult to achieve buy-in for, because the nervousness around trying something new at such a critical point in the crew's operational life was always a tough sell. Our software was one of the multiple tasks taking place at that time, and a change to one can lead to underperformance in any of the others. Nobody wants distracted staff on a plane. Changes to software or processes mean a risk to their ability to deliver predictably to the business, and that could have catastrophic consequences for a user's role.


The future of the IoT (batteries not required)

When the two technical co-founders looked to expand their startup, they tapped a collection of their newly minted PhD students who had the expertise of developing wireless system-on-chip technologies in the lab. Today, Everactive has expanded into a team of nearly 90 industry veterans and technical experts, including talented minds like Alice Wang, who joined up with Calhoun and Wentzloff in 2018 after successful stints with industry giants Texas Instruments and MediaTek. Another MIT alum, she now serves as VP of hardware for Everactive, directing both silicon and hardware systems design. “We’re exceptionally proud of the team that we’ve developed,” says Wentzloff. “I think a large part of why we continue to succeed is that we’ve done a great job of surrounding our core technology students with a broad set of talented industry leaders.” Thanks to their advances in ultra-low-power circuits and wireless communication, Everactive sells full-stack industrial IoT solutions powered by their always-on Eversensors, harvesting energy exclusively from the surrounding environment. The sensors can be deployed at a larger scale than battery-powered devices, and they cost less to operate.


Can Nanotech Secure IoT Devices From the Inside-Out?

Sowder said that many times, “the challenge with these IoT devices is the limited compute capability that they have on them. An IP camera can’t run a full IPS protection suite against traffic to it. It has a job to record video and send it upstream.” He pointed to the potential solution of nanotechnology: Specifically, the concept of a nanoagent on each IoT node that inspects firmware code to determine if it’s engaged in malicious behavior, such as memory corruption. If so, the nanoagent can block it in real-time. The challenge is how to do it with a small footprint, Sowder said: “A lot of devices don’t have a lot of compute. Sticking a firewall in front of every IP camera simply isn’t feasible. The solution is a very, very slight agent. It phones home to get a device signature, including what kind of device it is and what can run on it.” Nanoagents don’t put a lot of overhead on these devices, so the devices’ performance isn’t slowed down, Sowder noted: “There’s no overhead to prevent them from performing their functions.” Check Point has been working on a lightweight agent that relies on a cloud instance to pull down specific protection details related to that device.


How Mirroring the Architecture of the Human Brain Is Speeding Up AI Learning

Several decades of neuroscience research suggest that the brain’s ability to learn so quickly depends on its ability to use prior knowledge to understand new concepts based on little data. When it comes to visual understanding, this can rely on similarities of shape, structure, or color, but the brain can also leverage abstract visual concepts thought to be encoded in a brain region called the anterior temporal lobe (ATL). “It is like saying that a platypus looks a bit like a duck, a beaver, and a sea otter,” said paper co-author Joshua Rule, from the University of California Berkeley. The researchers decided to try and recreate this capability by using similar high-level concepts learned by an AI to help it quickly learn previously unseen categories of images. Deep learning algorithms work by getting layers of artificial neurons to learn increasingly complex features of an image or other data type, which are then used to categorize new data. For instance, early layers will look for simple features like edges, while later ones might look for more complex ones like noses, faces, or even more high-level characteristics.


Developer burnout and a global chip shortage: The IoT is facing a perfect storm

Part of the problem stems from unprecedented demand for IoT devices. There are already more connected things than people in the world, and the trend isn't showing any sign of slowing down. In fact, it's quite the contrary: tech analyst company IDC recently estimated that there will be a total 41.6 billion connected devices by 2025. Consumers are particularly interested in using smart products in their homes – think connected plugs, lightbulbs, thermostats and even fridges. Forrester forecast that by 2025, the average US household will have 20 internet-connected devices. In this context, it won't be enough for manufacturers to produce more of the same old things. Buyers' expectations are growing: they want easy-to-use devices with new, exclusive features, which will be continuously improved; and crucially, consumers expect that their connected products work together across different platforms and operating systems. More than eight in ten respondents to Forrester's survey said that they need to rapidly manufacture new smart products and services to maintain or grow their market position – meaning, in most cases, that a new cycle of research and development is necessary.


Developing an API architecture

There is almost always a 1:1 relationship between the API layer and the application layer. An API endpoint will only call one use case, and a use case will most likely only be used by one API endpoint. Why not just combine them into a single function? Loose coupling. For example, although I am using Express for my server, I may want certain use cases to be accessed via a CLI instead, or as well. The application layer does not care if a request comes via the web API, or the CLI, or some other method. It just cares about the arguments it receives. The application, core, and infrastructure layers are hard to talk about in isolation (which is ironic) so the next few sections will be a bit intertwined... How does the application layer actually "do stuff" though? If we want to get the basket, for example, how does it do this? We wouldn't want the application layer to import the database and query it directly; this would couple our low-level implementation too tightly to the high-level use case. The core layer holds interfaces for all of the things the application can do. When I say interfaces, I mean TypeScript interfaces; there is no actual JavaScript here, purely types and interfaces.
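The author works in TypeScript; for consistency with the other sketches in this digest, here is the same dependency-inversion idea in Python, with invented names (BasketRepository, GetBasket) standing in for the article's basket example:

```python
from typing import Protocol

class BasketRepository(Protocol):
    """Core layer: a pure interface - no implementation lives here."""
    def get_basket(self, user_id: str) -> dict: ...

class GetBasket:
    """Application layer: the use case depends on the interface, not the DB."""
    def __init__(self, repo: BasketRepository):
        self.repo = repo

    def execute(self, user_id: str) -> dict:
        return self.repo.get_basket(user_id)

class PostgresBasketRepository:
    """Infrastructure layer: one concrete implementation of the interface."""
    def get_basket(self, user_id: str) -> dict:
        return {"user": user_id, "items": []}  # a real DB query would go here

# The web API, a CLI, or anything else can invoke the same use case.
print(GetBasket(PostgresBasketRepository()).execute("user-42"))
```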


4 Of The Fastest Growing Cyber Security Skills In-Demand By Business In 2021

Application development security means analyzing vulnerabilities in an app and developing and adding security features to protect it from hackers. As modern software development picks up speed, more threat actors exploit the rapid production of applications as a chance to attack vulnerabilities in your code. Fortunately, there are application development security experts to protect your data and digital assets from hackers. Application security is no longer an afterthought. To build a secure application, one must integrate security measures into every part of the software development life cycle. The Burning Glass report makes this evident, with demand for application development security skills projected to increase 164%, topping the list among other cybersecurity skills. ... Cloud security refers to all the measures, policies, and rules implemented to protect the data in the cloud from hackers. As businesses make the shift to the cloud, robust cloud security is necessary. Security threats are continually evolving and becoming more complex, which means cloud computing is in no less danger than the on-premises environment.


What’s next: Machine learning at scale through unified modeling

Model unification can be useful for many types of machine learning problems. Our experience with predictive models, which are widely used by organizations across industries, has shown three important conditions that should be met for taking a unified modeling approach: A prediction is needed for the same target variable across a large number of related entities, or partitions; Each partition uses the same set of features; The models need to be refreshed on a frequent basis. ... With unified models, teams lose some flexibility for addressing problems since it is not possible to pick and choose individual partitions to roll back (or roll forward). A team can address this issue by retraining the unified model outside of the regular refresh cycle. Alternatively, if necessary, the model can be reverted for all partitions at once, across the board. For example, if you’ve created a unified model to predict demand for the full range or a set of your company’s products, you may find, after deploying the model, issues with the results for one product. You will then need to either roll back or retrain the full model.
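A minimal sketch of what unification means in practice, with invented column names and data: instead of training one demand model per product, the partition identifier becomes a feature of a single model, so one refresh retrains every partition at once (and a rollback reverts them all):

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Toy data: two partitions (products) sharing the same feature set.
df = pd.DataFrame({
    "product_id": ["A", "A", "B", "B"],
    "price": [10.0, 12.0, 7.0, 8.0],
    "demand": [100, 90, 250, 230],
})

# One-hot encode the partition so a single model serves all products.
X = pd.get_dummies(df[["product_id", "price"]])
model = GradientBoostingRegressor(random_state=0).fit(X, df["demand"])

# Refreshing this one model updates predictions for every partition at once.
print(model.predict(X[:1]))
```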


Digital Transformation: The value of intelligent operations versus business process outsourcing

Companies that have adopted intelligent operations within their processes are viewed as moving up their operational maturity level from “stable” to “efficient”. Once operational efficiency is achieved, the next step is to include data-driven insights into the decision-making process, putting the companies at the “predictive” maturity level. Companies that go beyond this stage are called “future-ready”. In these companies, artificial intelligence, blockchain, cloud and various forms of intelligent operations are used to drive and grow the company. According to the report, only 7% of organisations globally fall into the “future-ready” category, and these are mostly in the insurance and high-tech sectors. On average, “future-ready” organisations showed a 2.8 times boost in corporate profitability and 1.7 times increase in operational efficiency compared with companies in the lower maturity levels. Accenture Operations associate director Pankaj Jain says when new technologies are introduced, the way a company runs its operations changes dramatically.



Quote for the day:

"What lies behind us & what lies before us are tiny matters compared to what lies within us." -- Ralph Waldo Emerson

Daily Tech Digest - May 22, 2021

Universities Are Failing Software Developers

First, universities need to re-examine their curricula -- and do so often, because technology, trends, and best practices move lightning-fast in our industry. You would think that the ever-evolving nature of software development is common knowledge, yet year after year, I meet with candidates who only know Python, Java, or C++. These coding languages are often taught because of existing school material, exercises, tests, and labs, but they aren’t as widespread in professional settings because, frankly, there are better languages with larger communities targeting a larger set of applications or devices. At my company, for instance, we prefer to primarily work with Typescript/Javascript, C#, and PHP, all of which come with great frameworks and libraries. In theory, software development or computer science is a very practical university major, with many obvious applications available immediately after graduation. But if universities want this to be true in practice, they need to do a much better job of teaching real, marketable skills that employers actually value. In addition to updating the hard skills being taught to students, university leaders need to emphasize the importance of softer skills like critical-thinking, problem-solving, communication, and project management. 


Cybersecurity And The Vaccine Passport: A Dream Ticket Or A Flight Of Fancy?

There is currently no clearly defined standard for the vaccine passport. The Biden administration has announced that there would be no central, national policy, leaving the private sector to create its own solutions. Many projects — including those of IBM, the International Air Transport Association and several individual airlines — are already underway. Depending on where you intend to travel, this could mean handing over personal information and login credentials to multiple airlines and industry bodies. The more places that store this information, the more vulnerable it is to breach and loss. This lack of any agreed standard also opens the system up to fraud and manipulation. Cybercriminals are already working the impact of the pandemic to their advantage, and a patchwork of vaccine passport systems presents another golden opportunity. Because of the emphasis on equity for rollout, vaccine passports will have to be both paper and digital. For those who do not have digital devices, paper passports will show up on the dark web — the same way that we see fake vaccine passports already showing up for a few hundred bucks.


Cybersecurity, emerging technology and systemic risk: What it means for the medical device industry?

A personal goal of mine is that within 5 years, I can talk to any medical device developer about cybersecurity and find that they have comprehensive knowledge of all aspects of creating a secure device. To achieve that, I partnered with Axel Wirth to write and publish the world’s first comprehensive, how-to book on medical device cybersecurity. Also, Velentium has launched a training certification process to train engineers, developers, and managers at medical device manufacturers (MDMs) and other embedded and IoT device designers, so they’ll have qualified, knowledgeable cybersecurity expertise on-staff. According to a recent (ISC)² report, the global cybersecurity talent gap remains at more than 3 million. Cybersecurity employment must grow by 41 percent in the U.S. and by 89 percent worldwide to fill the existing gap. Clearly there is a huge shortfall of talent in the IT arena, but the situation is far worse for the embedded device arena.


Hacker's guide to deep-learning side-channel attacks: the theory

A side-channel attack is an implementation-specific attack: it exploits the fact that different inputs make an implementation behave differently, and it uses an indirect measurement, such as the time the algorithm took to execute the computation, to gain knowledge about a secret value such as a cryptographic key used during the computation. One of the most infamous timing attacks exploits the fact that the time taken by the naive square-and-multiply algorithm used in textbook implementations of RSA modular exponentiation depends linearly on the number of "1" bits in the key. An attacker can exploit this linear relation by timing the computation for diverse RSA keys, and can then infer the number of "1" bits in an unknown RSA key stored in a hardware crypto device by simply measuring how long the code takes to run. While nowadays most hardware crypto implementations are constant-time, timing attacks are still actively used, mostly in blind SQL injection.
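A toy demonstration of the leak described above, not an attack on a real implementation: the naive loop does an extra multiply for every "1" bit, so exponents of equal bit-length but different Hamming weight take measurably different time:

```python
import time

def naive_modexp(base, exponent, modulus):
    """Textbook square-and-multiply: runtime leaks the exponent's 1-bits."""
    result = 1
    for bit in bin(exponent)[2:]:               # scan exponent bits MSB-first
        result = (result * result) % modulus    # square every iteration
        if bit == "1":
            result = (result * base) % modulus  # extra multiply on 1-bits only
    return result

def time_modexp(exponent, trials=1000):
    start = time.perf_counter()
    for _ in range(trials):
        naive_modexp(0xC0FFEE, exponent, 2**127 - 1)
    return time.perf_counter() - start

# Two exponents with the same bit-length but different Hamming weight:
sparse = 2**64 + 1   # 65 bits, two 1-bits
dense = 2**65 - 1    # 65 bits, sixty-five 1-bits
print(f"sparse key: {time_modexp(sparse):.3f}s")
print(f"dense key:  {time_modexp(dense):.3f}s")  # measurably slower
```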


JavaScript API to Recognize Humans vs Bots in Chrome

Privacy Pass, an open-source web extension, was a step in the right direction, keeping privacy at its core. It helps users bypass repeated CAPTCHA challenges by using a set of tokens/passes. Let’s look at how it works. Users download the Privacy Pass extension for the Chrome/Firefox web browser, where its icon becomes visible; they then visit the CAPTCHA website and answer the CAPTCHA challenge, which grants 30 tokens/passes; these tokens are stored in the extension for future use. The concept is simple: when the user visits another page, the Privacy Pass extension spends these tokens/passes instead of presenting another challenge. And the great thing here is that each of these tokens/passes goes through a cryptographic process known as “blinding” that shields users’ privacy. ... Google has recently started developing a Trust Token API. It was developed as a substitute for third-party cookies, fighting fraud in online advertising by differentiating bots from humans. More importantly, Google Trust Tokens will distinguish bots from real humans and make third-party cookies obsolete in Google Chrome.


Three Ways Machine Learning Can Change Incident Management

Without fully addressing the underlying issue, companies virtually guarantee that the same problem – or a similar one – will reoccur. Not identifying root cause often prevents a durable fix. In addition, companies lose the opportunity to proactively improve application code or infrastructure based on real-world experiences and issues. Postmortems may only result in reviews of monitoring and observability solutions and the inevitable updates to alert rules. Most DevOps professionals not only understand but have lived through these frustrations on an ongoing basis. Management, then, often wonders why their systems are so unstable. Changing the model for incident management has been limited by a combination of overriding urgency and short-staffed, overworked teams. Although AI and machine learning have been positioned as the panacea for nearly every kind of technical ill, this is a clear case where “machines” could fundamentally enhance human efforts to improve a situation. The best troubleshooters exhibit a combination of instinct, experience and patience to carefully sift through reams of data, spotting unusual events and their correlation with bad outcomes.


How to get your multi-cloud data architecture strategy on track for long-term success

Technology innovation is constantly moving forward, so don’t think single cloud, even though sticking with one might feel like it saves time. Internal applications teams, and the databases and tools they leverage for data-rich applications, need to support multiple clouds. Take a long-term view towards resiliency, since you might need to leverage multiple clouds for scale or in times of duress for critical applications. Your strategy needs to work across multiple clouds; while you should pick the cloud that suits your immediate needs, keep flexibility in mind so that you’re able to pick another one further down the line. The cloud now has very clear standards as defined by the Cloud Native Computing Foundation (CNCF). You should demand the same of your database. Most proprietary innovations are now becoming open source and standards across multiple cloud vendors. A perfect example of this is Kubernetes, which Google open-sourced in 2014. Stick to the standards, reduce custom development, and set yourself up for multi-cloud success. We have a lot to thank the cloud platform vendors, including Amazon, Google, Microsoft, and others, for.


The Dark Side Of Tech: Four Potential Problems To Keep An Eye On In 2021

One of the biggest mistakes companies make when trying to automate their processes is thinking that technology, in and of itself, is the answer. As a result, they go out and buy subscriptions to software tools they never end up using — because they fail to properly integrate the tools into their existing organization and processes. Indeed, between 32% and 41% of what a company spends on software internally gets wasted, because the tools were not properly integrated into the human processes already in place. As AI continues to promise endless automation potential, this problem of “buying but failing to integrate” will likely accelerate. Companies will buy tools they think will solve all their problems without taking the time to think deeply about how to properly integrate those tools into their existing systems and infrastructure. A decade ago, it was unfathomable for most companies to host their data in the public cloud. Today, not only is that conventional wisdom, but it’s becoming increasingly popular and expected. For example, Netflix has been a prime example of how companies can get through the long and arduous process of migrating to the cloud — a process that started in 2008 when Netflix “experienced a major database corruption.”


Future of AI

Today, AI is hugely integrated into all aspects of our day-to-day lives. From chatbots, online shopping, smartphones and social networking to ride sharing, AI is being applied in the everyday apps that we use. The huge amount of data that all these apps are gathering about our likes and dislikes, our searches, our purchases, our movements and almost every aspect of our lives is contributing further to advancement in AI. All this data is being used to train and fine-tune AI and ML algorithms to learn and predict what we want with even more accuracy. ... AI has already been applied in healthcare with the use of chatbots to provide real-time assistance to patients and to predict ICU transfers or patient risks. It has huge potential to transform how we administer healthcare in the future. AI algorithms will enable healthcare providers to analyze data and tailor healthcare to each patient. ML algorithms will keep learning as they interact with training data, to provide precise and accurate clinical decisions with respect to patient diagnostics, treatment and care, and to predict patient outcomes. In the field of transportation, one area where AI will continue to make improvements is self-driving vehicles. Google and Tesla have already launched autonomous cars.


Cloud Security Blind Spots: Where They Are and How to Protect Them

While most security practitioners know accidental data exposure is a common cloud security issue, many don't know when it's happening to them. This was the crux of a talk by Jose Hernandez, principal security researcher, and Rod Soto, principal security research engineer, both with Splunk, who explored the ways corporate secrets are exposed on public repositories. In today's environments, credentials are everywhere: SSH key pairs, Slack tokens, IAM secrets, SAML tokens, API keys for AWS, GCP, and Azure, and many others. A common risk scenario is when credentials aren't properly protected and left exposed, most often in a public repository – Bitbucket, GitLab, GitHub, Amazon S3, and open databases are the main public repos for software. "If you are an attacker and you're trying to find somebody that, either by omission or neglect, embedded credentials that could be reused, these would be your sources of leaked credentials," Soto said, noting these can help attackers pivot between endpoints and the cloud. Splunk researchers found there are 276,165 companies with leaked secrets on GitHub. ... More organizations have a "converged perimeter," a term he used to define environments with assets both behind an Internet gateway, such as DevOps and ITOps, and in the cloud.
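A toy version of the kind of scan that surfaces these leaks, checking only the well-known AKIA prefix pattern of AWS access key IDs; real secret scanners apply many more rules plus entropy checks:

```python
import pathlib
import re

# AWS access key IDs start with "AKIA" followed by 16 uppercase/digit chars.
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def scan(root="."):
    """Walk a repo checkout and flag anything that looks like a leaked key."""
    for path in pathlib.Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for match in AWS_KEY_ID.finditer(text):
            print(f"{path}: possible AWS access key ID {match.group()}")

scan()  # run before pushing to a public remote
```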



Quote for the day:

"I think leadership's always been about two main things: imagination and courage." -- Paul Keating

Daily Tech Digest - May 21, 2021

What does CUI mean for government agencies?

Prior to NARA’s implementation of the CUI cyber security protection framework, government agencies employed ad hoc agency-specific policies, procedures, and markings to safeguard and control all unclassified information that did not meet the criteria required for classification. The rule was designed to primarily safeguard sensitive government data that had not been assigned as confidential or secret, whilst it was shared between different government and commercial entities. But this confusing patchwork resulted in inconsistent marking and safeguarding of documents, which led to unclear or unnecessarily restrictive dissemination policies and created barriers to authorised information sharing. Today, the CUI Program is a unified effort between Executive Branch agencies to standardise protections and practices across departments and agencies. It defines a central data classification policy for the handling, safeguarding and dissemination of ‘sensitive but unclassified’ (SBU) government information. NARA maintains a public CUI registry reflecting authorised CUI categories and subcategories, associated markings, and applicable data safeguarding, dissemination, and decontrol procedures as data moves through non-federal systems.


3 Reasons Many Of The World’s Most Booming Businesses Come From Humble Beginnings

Humble beginnings often require founders to reset their expectations, or even adapt the way they work so they can deal with the unusual circumstances that accompany starting their business in a garage or a small nook of their apartment. Such circumstances teach flexibility and other valuable leadership lessons in a way that starting in a cushy office never could. After all, you’ll learn a lot more about flexibility if you’re starting out of a home where your kids can come in and interrupt you at any given hour. This flexible mindset can also improve your creative thinking and leave you more open to new ideas. Studies have found that experts who practice humility actually become more flexible as they acquire more knowledge. ... Writing for Idealist Careers, Liz Peintner explains that such leaders “are especially effective in cultivating strong social relationships, helpfulness, forgiveness, and social justice amongst their team members; creating teams with more satisfied employees who stay longer at the organization; leading well in unpredictable situations by using a trial-and-error approach; and minimizing negative feelings and intentions toward ‘out-group’ members, resulting in a more inclusive work environment.”


Industry 4.0 and its impact on network architecture

Organisations will further benefit from strengthened flexibility and agility, in addition to offering better customer service. Industry 4.0 enables businesses to improve the service offered to customers with streamlined experiences and more choice for consumers. Following this, companies can achieve higher revenues and improved innovation opportunities, which will help to ensure that they receive a significant return on investment. However, with its benefits, businesses need to consider the challenges that derive from Industry 4.0 adoption when looking to transition their business and its operations. Organisations will need to evaluate the opportunity cost associated with the fourth industrial revolution. There are two major costs to review: technology and expertise. Having the understanding and knowledge of newer technologies can often lead to budget constraints and businesses will need highly skilled employees to manage the integration successfully. We have seen a number of organisations launch Industry 4.0 initiatives, but more often than not, a lack of direction and measurable objectives can lead to failure. 


The way we teach coding is all wrong. Here's how it needs to change

Hands-on experience will always be a deciding factor – though Lavenne acknowledges that the majority of students will be lacking this by default. Instead, he suggests university courses and coding programs encourage as much project work as possible – which will at least help equip students with a working knowledge of the various components of the software development cycle. There are also a handful of specific tools and technologies that Lavenne feels every aspiring developer should have under their belt. "Putting an emphasis on JavaScript and TypeScript is important; Node.js is a moving force of the world right now in web technologies and others. People have to start learning TypeScript in school," he says. "On the skillsets for languages that are super marketable; the technologies that are very marketable today are web and APIs. Every single software engineer that will come out on the market will work with APIs - they have to speak APIs, they have to speak JSON. XML is fading out into the distance; the world is speaking JSON from computer to computer, and REST APIs are everything."


Putting digital at the heart of strategy

Consider the early days of the commercial internet. In the late 1990s, companies scrambled to launch websites, believing that having an online presence would differentiate them and hoping to achieve a first-mover advantage. But eventually, every company had a website. And companies competed, as they always do, on the strength of their broader strategies. We will see the same as companies embrace the digital pivots that support digital enterprises. Cloud computing, automation, and artificial intelligence will not provide meaningful differentiation in themselves. Instead, they will be the new platform on which companies will compete. We see two major ways that digitally driven strategies offer organizations the opportunity to succeed in the long term. The first is by enabling resilience: the ability to thrive amid uncertainty and change. The second is by driving differentiation: the ability to deliver value that cannot be found anywhere else. We explore each of these aspects below. To survive and thrive in an uncertain and rapidly changing world, organizations will need to innovate at speed, keep pace with technological and industry change, and cultivate greater resilience.


The Second Pillar of Trusted AI: Operations

One key aspect of designing a system around AI is recognizing that any model’s predictions are probabilistic. For example, in binary classification, our model makes predictions in the form of raw scores between 0 and 1. Based on an optimized threshold, the model predicts either class 0 or class 1. However, there are situations in which the model is not confident in a prediction – for example, when the raw score is very near that optimized threshold, in a “low confidence” region. There are other scenarios, too, in which analyzing the scoring data or prediction gives us reason to doubt the veracity of the model’s prediction. So how do we translate this into real-time protection to ensure our model makes safe and accurate decisions at the level of an individual prediction? Using a set of triggers, such as identifying outliers or an unseen categorical value, the system can take certain predefined actions to guard against uncertain predictions. Consider a model that predicts whether or not an image is of a dog or a wolf. Perhaps the training data was authored by a photographer using professional equipment. A new scoring image is taken by a different photographer with much lower-quality equipment, resulting in a blurry, small image.
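A minimal sketch of such a guard for a binary classifier, assuming a scikit-learn-style model that exposes predict_proba; the low-confidence band and escalation action are invented for illustration:

```python
def guarded_predict(model, x, low=0.4, high=0.6):
    """Auto-decide only when the raw score is far from the threshold."""
    score = model.predict_proba([x])[0][1]  # raw score in [0, 1]
    if low < score < high:
        # Low-confidence region near the optimized threshold:
        # take a predefined action instead of trusting the prediction.
        return {"decision": "escalate_to_human", "score": score}
    return {"decision": int(score >= 0.5), "score": score}
```

Other triggers named in the excerpt, such as outlier or unseen-category detectors, could route predictions to the same escalation path.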


Use Of Artificial Intelligence In Cyber Security

Since known vulnerabilities in a system, network or database are difficult to manage, machine learning and AI processes such as user and event behavioral analytics (hereinafter referred to as “UEBA”) can observe all kinds of behavior of user accounts and servers. Further, UEBA can identify and analyze any abnormal behavior that might hint at a zero-day attack, which can help protect companies and organizations before any vulnerability is formally reported and patched. UEBA solutions have three major functions. The first is data analytics: the system models normal behavior from user data and applies statistical techniques to detect abnormal or unusual behavior, then alerts system administrators. The second is data integration: behavioral data from numerous sources is compared with, and folded into, data from existing security systems. The third is data presentation: the UEBA system communicates its findings and generates reports, issuing a request to a security analyst within the organization to investigate the unusual behavior.
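As a toy illustration of the statistical function described above, the sketch below flags a user whose daily activity deviates sharply from their own baseline; the login-count feature and z-score cutoff are assumptions made for the example.

```python
# Minimal sketch of the statistical side of UEBA: flag a user whose
# daily activity count deviates sharply from their own baseline.
# The feature choice and cutoff are illustrative assumptions.
import statistics

def is_anomalous(history: list[int], today: int, z_cutoff: float = 3.0) -> bool:
    """Flag today's event count if it is more than z_cutoff sigmas from the baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1e-9  # avoid dividing by zero
    return abs((today - mean) / stdev) > z_cutoff

logins = [12, 9, 11, 10, 13, 8, 12]  # a week of normal daily logins
print(is_anomalous(logins, 11))  # False: within the baseline
print(is_anomalous(logins, 90))  # True: alert a security analyst
```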


The US pipeline attack shows the energy sector must act now on cybersecurity

This threat environment is the new normal for oil and gas infrastructure. Whether attackers are criminals motivated by financial gain or nation-state actors playing geopolitics, digitized oil and gas infrastructure makes a tempting target. Board members – and the information security officers they hold accountable – should be preparing for frequent, sophisticated attacks to be an ongoing operational risk. Even for industry leaders keenly aware of the risks and trends facing the oil and gas industry, building robust cybersecurity can be a daunting challenge. The World Economic Forum White Paper Cyber Resilience in the Oil and Gas Industry: Playbook for Boards and Corporate Officers provides a new blueprint for securing critical infrastructure: it helps oil and gas industry leaders address cyber-risk and implement key recommendations within their organizations, as well as champion standards across the energy ecosystem. The playbook is the product of discussions and collaboration within the World Economic Forum community of oil and gas industry partners, including Siemens Energy and Saudi Aramco.


Using Low-Code Tools in Enterprise Application Development

To ensure the security of the applications that a low-code platform builds, they must go through the same security checks as any other application. Even though some level of security, such as input validation, is baked into most low-code development platforms, developers still need to pay a great deal of attention to security issues and test for vulnerabilities. However, because there is no visibility into what’s going on underneath, scanning the application for security issues becomes tedious. The same features that make low-code development so attractive to some organizations can bring challenges when it comes to security. Creating enterprise applications also entails a large amount of integration work. A low-code solution might be capable of handling things if a developer follows a carefully constructed “happy” path. We are not talking about relying on low-code solutions simply to integrate applications with software-as-a-service (SaaS) applications and simple web APIs, however. Enterprise apps often also need to connect with distributed systems, archaic legacy applications, overly complex third-party APIs, commercial off-the-shelf systems and much more.


AI and data science jobs are hot. Here's what employers want

Much of the problem boils down to a lack of appropriate skills among applicants. More than two-thirds of businesses said they struggled to find candidates with the right technical skills and knowledge, while a significant minority (40%) also reported a lack of work experience among candidates, as well as gaps in industry knowledge. So, what exactly should candidates for AI and data-science roles have on their CVs to convince future employers? Technical skills, of course, are key: businesses said that they were in search of applicants who understand AI concepts and algorithms, have the relevant programming skills and languages, and are familiar with software and systems engineering. A number of employers, said Ipsos, stressed the importance of deep learning in specialist roles, and of the need for candidates to know how to go beyond "low-level" AI. "We need people coming through the university system to learn from first principles how to create deep learning, neural network systems, rather than relying on off-the-shelf systems that are available through the big US companies," said one micro-business owner.



Quote for the day:

"A leader has the vision and conviction that a dream can be achieved._ He inspires the power and energy to get it done." -- Ralph Nader

Daily Tech Digest - May 20, 2021

A new era of DevOps, powered by machine learning

While programming languages have evolved tremendously, at their core they all still have one major thing in common: getting a computer to accomplish a goal in the most efficient and error-free way possible. Modern languages have made development easier in many ways, but not a lot has changed in how we actually inspect the individual lines of code to make them error-free. And even less has been done when it comes to improving code quality in ways that boost performance and reduce operational cost. Where build and release schedules once slowed down the time it took developers to ship new features, the cloud has turbocharged this process by providing a step-function increase in the speed to build, test, and deploy code. New features are now delivered in hours (instead of months or years) and are in the hands of end users as soon as they are ready. Much of this is made possible through a new paradigm in how IT and software development teams collaboratively interact and build best practices: DevOps. Although DevOps technology has evolved dramatically over the last five years, it is still challenging.


Productizing Machine Learning Models

Typically, there are three distinct but interconnected steps towards productizing an existing model: serving the model; writing the application’s business logic and serving it behind an API; and building the user interface that interacts with the above APIs. Today, the first two steps require a combination of DevOps and back-end engineering skills (e.g. “Dockerizing” code, running a Kubernetes cluster if needed, standing up web services…). The last step—building out an interface with which end users can actually interact—requires front-end engineering skills. The range of skills necessary means that feedback loops are almost impossible to establish and that it takes too much time to get machine learning into usable products. Our team experienced this pain first-hand as data scientists and engineers; so, we built BaseTen. ... Oftentimes, serving a model requires more than just calling it via an API. For instance, there may be pre- and/or post-processing steps, or business logic may need to be executed after the model is called. To do this, users can write Python code in BaseTen and it will be wrapped in an API and served—no need to worry about Kubernetes, Docker, and Flask.
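As a generic sketch of the middle step – business logic served behind an API – the snippet below wraps a stand-in model with pre- and post-processing using Flask, which the passage mentions; the route, fields, and logic are illustrative assumptions, not BaseTen's actual product behavior.

```python
# Generic sketch of serving a model plus business logic behind an API.
# The model, features, and route are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

def preprocess(payload: dict) -> list[float]:
    # normalize incoming fields into the model's expected feature order
    return [float(payload["age"]) / 100.0, float(payload["income"]) / 1e5]

def model_predict(features: list[float]) -> float:
    # stand-in for a real trained model
    return min(1.0, sum(features))

def postprocess(score: float) -> dict:
    # business logic executed after the model is called
    return {"score": score, "approved": score > 0.5}

@app.route("/predict", methods=["POST"])
def predict():
    features = preprocess(request.get_json())
    return jsonify(postprocess(model_predict(features)))

if __name__ == "__main__":
    app.run(port=8080)
```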


The timeline for quantum computing is getting shorter

Financial traders rely heavily on computer financial simulations for making buying and selling decisions. Specifically, “Monte Carlo” simulations are used to assess risk and simulate prices for a wide range of financial instruments. These simulations also can be used in corporate finance and for portfolio management. But in a digital world where other industries routinely leverage real-time data, financial traders are working with the digital equivalent of the Pony Express. That’s because Monte Carlo simulations involve such an insanely large number of complex calculations that they consume more time and computational resources than a 14-team, two-quarterback online fantasy football league with a Superflex position. Consequently, financial calculations using Monte Carlo methods typically are made once a day. While that might be fine in the relatively tranquil bond market, traders trying to navigate more volatile markets are at a disadvantage because they must rely on old data. If only there were a way to accelerate Monte Carlo simulations for the benefit of our lamentably laden financial traders!
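To see why such runs are expensive, consider a toy Monte Carlo pricing sketch: it averages the discounted payoff of a European call option over many simulated price paths, with every parameter chosen purely for illustration.

```python
# Toy Monte Carlo pricing run: simulate many random price paths and
# average the discounted payoff of a European call option.
# All parameters are illustrative assumptions.
import math
import random

def monte_carlo_call_price(s0, strike, rate, vol, t, n_paths=100_000):
    """Estimate a European call price by averaging simulated payoffs."""
    total_payoff = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)  # one random draw per simulated path
        s_t = s0 * math.exp((rate - 0.5 * vol**2) * t + vol * math.sqrt(t) * z)
        total_payoff += max(s_t - strike, 0.0)
    return math.exp(-rate * t) * total_payoff / n_paths

# Accuracy improves only with the square root of n_paths, which is why
# full risk runs across thousands of instruments are so costly.
print(monte_carlo_call_price(s0=100, strike=105, rate=0.01, vol=0.2, t=1.0))
```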


Pandemic tech use heightens consumer privacy fears

With user data the lifeblood of online platforms and digital brands, Marx said there were clear lessons for tech companies to learn in the post-pandemic world. Looking ahead, many study respondents agreed they would prefer to engage with brands that made it easier for them to control their data, up from previous years. Others called out “creepy” behaviour such as personalised offers or adverts that stalk people around the internet based on their browsing habits, and many also said they wanted to see more evidence of appropriate data governance. Those organisations that can successfully adapt to meet these expectations might find they have a competitive advantage in years to come, suggested Marx. And consumers already appear to be sending them a message that the issue needs to be taken seriously, with over a third of respondents now rejecting website cookies or unsubscribing from mailing lists, and just under a third switching on incognito web browsing. Notably, in South Korea, many respondents said that having multiple online personas for different services was a good way to manage their privacy, raising concerns for brands about data accuracy and the quality of insights that can be derived from it.


Why great leaders always make time for their people

When people can’t find you, they aren’t getting the information they need to do their job well. They waste time just trying to get your time. They may worry that, when they do find you, because you’re so busy, you’ll be brittle or angry. The whole organization may even be working around the assumption that you have no bandwidth. The sad truth, however, is that when you are unavailable, it’s also you who is not getting the message. You’re not picking up vital information, feedback, and early warning signs. You’re not hearing the diverse perspectives and eccentric ideas that only manifest in unpredictable, uncontrolled, or unscheduled situations—so, exactly those times you don’t have time for. And you’re not participating in the relaxed, social interactions that build connection and cohesion in your organization. So, though you may be busy doing lots of important stuff, your finger is off the pulse. But imagine being a leader who does have time, and how this freeing up of resources changes a leader’s influence on everyone below them. Great leaders know that being available actually saves time. A leader who has time would not use “busy” as an excuse. Indeed, you would take responsibility for time.


The road to successful change is lined with trade-offs

Leaders should borrow an important concept from the project management world: Go slow to go fast. There is often a rush to dive in at the beginning of a project, to start getting things done quickly and to feel a sense of accomplishment. This desire backfires when stakeholders are overlooked, plans are not validated, and critical conversations are ignored. Instead, project managers are advised to go slow — to do the work needed up front to develop momentum and gain speed later in the project. The same idea helps reframe notions about how to lead organizational change successfully. Instead of doing the conceptual work quickly and alone, leaders must slow down the initial planning stages, resist the temptation and endorphin rush of being a “heroic” leader solving the problem, and engage people in frank conversations about the trade-offs involved in change. This does not have to take long — even just a few days or weeks. The key is to build the capacity to think together and to get underlying assumptions out in the open. Leaders must do more than just get the conversation started. They also need to keep it going, often in the face of significant challenges.


With smart canvas, Google looks to better connect Workspace apps

The smart chips also connect to Google Drive and Calendar for files and meetings, respectively. And while the focus of the smart canvas capabilities is currently around Workspace apps, Google said that it plans to open the APIs for third-party platforms to integrate, too. “Google didn’t reinvent Docs, Sheets and Slides: They made it easier to meet while using them — and to integrate other elements into the Smart Canvas,” said Wayne Kurtzman, a research director at IDC. “Google seemingly focused on creating a single pane of glass to make engaging over work easier - without reinventing the proverbial wheel.” The moves announced this week are part of Google’s drive to integrate its various apps more tightly; the company rebranded G Suite to Workspace last year. “The idea of documents, spreadsheets and presentations as separate applications increasingly feels like an archaic concept that makes much less sense in today’s cloud-based environment, and this complexity gets in the way of getting things done,” said Angela Ashenden, a principal analyst at CCS Insight.


Graph databases to map AI in massive exercise in meta-understanding

"It is one of the biggest trends that we're seeing today in AI," Den Hamer said. "Because of this growing pervasiveness of this fundamental role of graph, we see that this will lead to composite AI, which is about the notion that graphs provide a common ground for the culmination, or if you like the composition of notable existing and new AI techniques together, they'll go well beyond the current generation of fully data-driven machine learning." Roughly speaking, graph databases work by storing a thing in a node – say, a person or a company – and then describing its relationship to other nodes using an edge, to which a variety of parameters can be attached. ... Meanwhile, graph databases often come in handy for data scientists, data engineers and subject matter experts trying to quickly understand how the data is structured, using graph visualisation techniques to start "identifying the likely most relevant features and input variables that are needed for the prediction or the categorisation that they're working on," he added.


Data Sharing Is a Business Necessity to Accelerate Digital Business

Gartner predicts that by 2023, organizations that promote data sharing will outperform their peers on most business value metrics. Yet, at the same time, Gartner predicts that through 2022, less than 5% of data-sharing programs will correctly identify trusted data and locate trusted data sources. “There should be more collaborative data sharing unless there is a vetted reason not to, as not sharing data frequently can hamper business outcomes and be detrimental,” says Clougherty Jones. Many organizations inhibit access to data, preserve data silos and discourage data sharing. This undermines the efforts to maximize business and social value from data and analytics — at a time when COVID-19 is driving demand for data and analytics to unprecedented levels. The traditional “don’t share data unless” mindset should be replaced with “must share data unless.” By recasting data sharing as a business necessity, data and analytics leaders will have access to the right data at the right time, enabling more robust data and analytics strategies that deliver business benefit and achieve digital transformation.


How AI could steal your data by ‘lip-reading’ your keystrokes

Today’s CV systems can make incredibly robust inferences with very small amounts of data. For example, researchers have demonstrated the ability for computers to authenticate users with nothing but AI-based typing biometrics and psychologists have developed automated stress detection systems using keystroke analysis. Researchers are even training AI to mimic human typing so we can develop better tools to help us with spelling, grammar, and other communication techniques. The long and short of it is, we’re teaching AI systems to make inferences from our finger movements that most humans couldn’t. It’s not much of a stretch to imagine the existence of a system capable of analyzing finger movement and interpreting it as text in much the same way lip-readers convert mouth movement into words. We haven’t seen an AI product like this yet, but that doesn’t mean it’s not already out there. So what’s the worst that could happen? Not too long ago, before the internet was ubiquitous, “shoulder surfing” was among the biggest threats faced by people for whom computer security is a big deal. Basically, the easiest way to steal someone’s password is to watch them type it.
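To make the typing-biometrics point concrete, the sketch below computes the dwell-time and flight-time features that keystroke-analysis research commonly relies on; the key events are invented for illustration.

```python
# Sketch of the timing features typing-biometrics systems commonly use:
# dwell time (how long a key is held) and flight time (the gap between
# consecutive keys). The event tuples are illustrative assumptions.

# (key, press_time_ms, release_time_ms) for the word "cat"
events = [("c", 0, 95), ("a", 150, 240), ("t", 310, 400)]

dwell_times = [release - press for _, press, release in events]
flight_times = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]

print("dwell:", dwell_times)    # how long each key was held
print("flight:", flight_times)  # gaps between consecutive keystrokes
# Patterns in these features are distinctive enough to authenticate
# users -- or, in the wrong hands, to help infer what was typed.
```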



Quote for the day:

"Distinguished leaders impress, inspire and invest in other leaders." -- Anyaele Sam Chiyson