Daily Tech Digest - May 27, 2021

Event-driven architecture: Understanding the essential benefits

Event-driven architectures address these problems head-on. At its core, event-driven architecture relies upon facilitating inter-service communication using asynchronous messaging. In the asynchronous messaging pattern, a service sends information in the form of a discrete message to a message broker and then moves on to other activities. On the broker side, that message is consumed by one or many interested parties at their convenience. All communication happens independently and discretely. It’s a “fire and forget” interaction. However, while things do get easier by focusing on component behavior instead of managing the burdens of endpoint discovery that go with inter-service communication, there is still a good deal of complexity involved in taking a messaging approach. In an event-driven architecture, a component needs to understand the structure of an incoming message. Also, a component needs to know the format and validation rules for the message that will be emitted back to the broker when processing is complete. Addressing this scope of complexity is where a schema registry comes into play.
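The publish/consume flow and the schema check described above can be sketched as a toy in-memory model. This is purely illustrative, assuming a hypothetical `Broker` class and an invented `orders` schema, not any particular messaging product or registry:

```python
import json

# A minimal "fire and forget" sketch: producers publish and move on;
# consumers pick messages up later at their own pace.
class Broker:
    def __init__(self):
        self.queues = {}    # topic -> list of pending serialized messages
        self.schemas = {}   # topic -> expected field names and types

    def register_schema(self, topic, schema):
        # A schema-registry entry: required fields and their types.
        self.schemas[topic] = schema

    def publish(self, topic, message):
        # Validate against the registered schema before accepting the message.
        for field, ftype in self.schemas.get(topic, {}).items():
            if not isinstance(message.get(field), ftype):
                raise ValueError(f"message missing or invalid field: {field}")
        self.queues.setdefault(topic, []).append(json.dumps(message))

    def consume(self, topic):
        # Consumers read at their convenience; empty list if nothing pending.
        return [json.loads(m) for m in self.queues.pop(topic, [])]

broker = Broker()
broker.register_schema("orders", {"order_id": str, "amount": float})
broker.publish("orders", {"order_id": "A-1", "amount": 9.99})  # producer moves on
events = broker.consume("orders")                              # consumer, later
```

In a real system the registry would live outside both components, so producer and consumer each validate against the same shared contract rather than hard-coding each other's message formats.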


Prevention Is the Only Cure: The Dangers of Legacy Systems

As companies organize their first post-pandemic steps back into an office or hybrid workflow, the threat from legacy systems is greater than ever. In part, this is due to the legacy of shadow IT, in which systems or devices are introduced without explicit IT department approval. It is more than likely that the rapid shift to work from home caused an uptick in shadow IT, attack surfaces, and exposure to related vulnerabilities. Large corporations like Shell have proven time and again that they are vulnerable to these attack vectors, but they may not need to be as concerned about shadow IT as their midsize counterparts. While large staff size may increase the potential for mismanagement, major corporations are also more likely to have systems and audits in place to manage their environment and control changes. Many midsize businesses and enterprises may be less aware of weaknesses in their system that leave them exposed to shadow IT's risks. How can a firm prevent the proliferation of legacy or shadow IT? The only solution is the proper management of all aspects of IT.


Data is the Cure for What’s Ailing the Supply Chain

The data generated through an end-to-end RFID solution provides an additional key benefit: sustainability. Fact: As the decade progresses, there will be increasing pressure on companies to help achieve global climate goals while meeting increased consumer demand. According to a recent report from McKinsey, “consumer companies will have to greatly reduce the natural and social costs of their products and services to capitalize on rising demand for them without taxing the environment or human welfare.” Because data captured through an RFID solution is used to increase the velocity of moving goods onto trucks, it can also be used to configure and optimize space on the truck according to the most efficient route to the packages’ destinations. Aggregating the data enables movement of packages along the most efficient route. Accurately boxed and shipped packages result in fewer trucks on the road and fewer airplanes in the air. Simply stated, the carbon footprint is minimized through data: by knowing what you have, and in what order you should be delivering it.


The evolution of the modern CISO

With remote work poised to remain a mainstay of societal patterns, and interest in a “work from anywhere” mentality continuing to grow, the onus on CISOs to be adaptable has never been greater. There’s a fine balance between continuing to make progress on strategic initiatives that will reduce risk and improve security maturity, and being adaptable enough to stop and pivot as needed. Further, as businesses adapt to meet the growing needs of the customer, they need to do so with CISOs in mind, stopping to ask the right questions that enable security from the start—such as, “Will this new technology we’re onboarding potentially open up new security gaps?” or “Does branching into new sectors open our business up to new areas of attack?” and “Could we expose our customer base to threats by switching CRM platforms?” To answer these questions, CISOs need to be able to adapt across three major areas that are constantly shifting and inherently intertwined: the needs of the business and customer, the current threat landscape, and risk calculation and prioritization.


A better future of work is coming, but only if we make the right choices now

It sounds like an almost idealistic vision of work, particularly for those who have spent the majority of their careers tied to a desk with little say over how they approach or structure their role. But evidence increasingly suggests that businesses that offer greater flexibility, including the ability to work remotely, are likely to attract and retain the most-skilled workers. Tech workers in particular are getting choosier about where they work. After a year of rapid digital transformation across industries, demand for professionals with cloud, cybersecurity and software development skills is peaking. Couple that with the fact that tech workers increasingly want to work remotely, and will potentially change roles if it enables them to do so, and it stands to reason that flexible-working policies are crucial to enticing top digital professionals. However businesses pursue their agendas, Wilmott and Walsh both warn that organizations risk introducing further inequalities if they don't manage the move to hybrid delicately. For workers whose roles are primarily on-site, Walsh says there is room for organizations to explore how they can offer at least some of the benefits enjoyed by those who can work remotely.


Embed Data Science Across the Enterprise

The organizational model should reflect the maturity of the AI capabilities. For organizations just getting started with AI, a centralized model typically builds critical mass. Then, it should be distributed. I like a hybrid model. Being fully decentralized is like herding cats—the technology does not get leveraged effectively. At J&J, IT owns the technology that underpins AI: cloud computing, data repositories, APIs. But we’ve embedded the data science in our functions, where it is closest to the business problems we’re looking to solve. We’ve also created a data science council, with representation from each part of the business, that oversees our portfolio, talent, and technology. ... Think of data as an asset. Very few companies are embracing data as the core of insight and decision-making. That requires spending time to understand where you’re at with data and where you want to go. Also, never underestimate the need for change management in increasing the use of AI. I could develop all the models in the world, but if no one is using them, there’s no value. You’ve got to work with the people who want to evolve.


How E-Commerce Is Being Forced to Evolve In a Post-Covid World

The entirety of the Covid-19 pandemic has served as a case study in the cascading effects of social change. Things barely imaginable just over a year ago are now a part of daily life. In the context of enterprise, these changes manifest as both shifts in consumer behavior and new challenges for businesses — no sector has been left unaffected. Against this background, the role of e-commerce has grown immensely. It’s not just niche items being ordered online anymore; in 2020, over 50% of all consumers used direct-to-consumer sales channels to buy everyday items like groceries, cleaning products and other consumer-packaged goods. In fact, online grocery sales surged, growing 54% last year and nearly reaching the $100 billion mark. Brick-and-mortar was suffering even before the pandemic hit, but now appears to be in outright decline, with fewer new storefronts opening in 2020 than in any of the three years prior. Even more surprisingly, a full 60% of interactions between consumers and businesses now take place online.


Keeping up with data: SaaS, unstructured data and securing it with identity

More than four out of 10 companies admitted they don’t know where all of their unstructured data is located. Nearly every company surveyed reported managing access to unstructured data as difficult, citing numerous challenges such as too much data, the lack of a single access solution for multiple repositories, and a lack of visibility into access, including where data lives and who owns it. It is unsurprising, given this data, that a Canalys report found companies spending record sums on cyber security in order to protect the rapid digital transformation we have experienced over the last year. 50% of European businesses stated that investing in new security technology was their highest prevention spending priority. Yet, despite these efforts and intentions, the number of successful attacks continues to be higher than ever, with Canalys reporting that “more records were compromised in just 12 months than in the previous 15 years combined.” ... Looking even more closely at the research, we can connect the dots between these findings and the rise in cloud adoption, the unstructured data that resides in the apps and systems in the cloud, and IT’s attempts at securing this monster network of information.


Everything You Need To Know About Stress Testing Your Software

Stress testing is a type of testing that verifies the reliability and stability of software applications. The goal of this kind of testing is to measure the error-handling capabilities of the software to ensure that it does not crash under extremely heavy load conditions. Let us consider two scenarios where website traffic may increase: during an online sale or holiday season, a website can experience a tremendous spike in traffic; and when a blog site is mentioned by an influencer or news outlet, the website can experience increased traffic. It is imperative to stress test a website to ensure that it can accommodate spikes in traffic, as failure to do so can result in lost revenue and an inconsistent brand image. To summarize, the following are some of the end goals of running a software stress test: a stress test helps to analyze which type of user data (if any) is corrupted; it helps to determine any triggers that signal risk; it helps to identify any hardware features that can affect software endurance; and it helps predict failures connected to high loads.
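The basic shape of such a test can be sketched with a minimal harness. This sketch uses a hypothetical stand-in function (`flaky_service`) rather than a real website, but the pattern is the same: fire many concurrent requests at the target and tally successes, failures, and latencies:

```python
import concurrent.futures
import time

# A minimal stress-test harness (illustrative): hammer a target callable with
# many concurrent calls and record successes, errors, and per-call latency.
def stress_test(target, n_requests=200, workers=50):
    results = {"ok": 0, "errors": 0, "latencies": []}

    def one_call(i):
        start = time.perf_counter()
        try:
            target(i)
            return ("ok", time.perf_counter() - start)
        except Exception:
            return ("error", time.perf_counter() - start)

    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        for status, latency in pool.map(one_call, range(n_requests)):
            results["latencies"].append(latency)
            if status == "ok":
                results["ok"] += 1
            else:
                results["errors"] += 1
    return results

# Stand-in for a real endpoint: fails intermittently, as if overloaded.
def flaky_service(i):
    if i % 40 == 39:
        raise RuntimeError("service overloaded")

report = stress_test(flaky_service)
```

In practice the target would be an HTTP call against a staging environment, and the error counts and latency percentiles would be compared against the service's stated capacity.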


Preparing Future Data Scientists for Jobs of Tomorrow

Cooperation between The Data Mine and the private sector has included discussions with the Central Indiana Corporate Partnership, which Ward says focuses on economic development in the state and interconnection between companies. Through his conversations with colleagues from the C-suite, Ward says many executives do seem to understand the need for data science talent and actively foster connections with universities. That could see more people with data science skills enter the workforce, as well as give internal staff chances to grow in their careers. The Data Mine is a way for Purdue to keep up with the changing technology, infrastructure, and tools of data science, Ward says. “Think of data-driven projects as being much larger than just science or just engineering, maybe with some traditional disciplines.” ... The program’s focus is student-centric, with Ward’s office in the students’ residence hall. As companies retool and think ahead about how data-driven insights can lead to business impact, he says The Data Mine works with students on nine-month projects over the academic year. “We’ve completely gotten away from the idea of having a 10-week internship,” Ward says. “It’s just way too short to have a substantial experience.”



Quote for the day:

"The test we must set for ourselves is not to march alone but to march in such a way that others will wish to join us." -- Hubert Humphrey

Daily Tech Digest - May 26, 2021

Reduce process bottlenecks with process advisor for Power Automate, now generally available

“Process advisor simplifies the hardest thing about process analysis – sharing details of a process across an organization. With the recording and sharing function in process advisor, it makes it very easy for process and business analysts to collaborate and to identify opportunities to optimize their automation workflows.” —Brian Hodel, Principal Power Platform Developer, T-Mobile. Learn more about T-Mobile adding RPA to their Six Sigma toolbox. “By leveraging RPA in Microsoft Power Automate we anticipate a time savings of 90 percent in processing time for our operations, and the potential to see reduced maintenance costs of up to 20 percent, which aligns with our value of simplicity in our manufacturing work. Looking forward, we are excited to gain deeper insights into how we work and where we might benefit from automation with process advisor to streamline and digitize our operator rounds.” —Linda W. Morris, Enterprise Automation Lead, Chemours. Learn how Chemours automated SAP, reducing processing time. “Process advisor allows us to gain real insights into how work is getting done.”


Why Conscious Economics Is the Leadership Style of the Future

Conscious consumerism and conscious investment are not new philosophies. Conscious investment, often referred to as “impact investing,” is a market worth more than $700 billion annually with 20% projected yearly growth. As its name implies, impact or conscious investing is practiced with the intent of making better investment decisions for the good of companies, customers and society as a whole. Conscious consumerism has likewise been on the rise in recent years, with more consumers choosing to shop at smaller local businesses rather than retail giants. When we combine both of these together to encompass the “conscious” aspect for investors, businesses and consumers alike, it creates the dynamic referred to as “conscious economics." I was first introduced to this term two years ago when I attended an event by the Economic Club of Canada in Toronto. The event was headlined by a conversation between former President Barack Obama and Rhiannon Rosalind, the club’s president, CEO and owner. ... In other words, as aspects of our societies change and evolve around us, so too does our own inherent psychology — particularly our motivations for making changes to our way of life.


The rise of the cloud data platform

Many companies have different terms for what I’ve termed the cloud data platform. Oracle, for example, labels it the “enterprise data management cloud.” Nutanix uses the term “enterprise cloud.” And Cloudera, which offers a platform called the Cloudera Data Platform, actually calls the category the “enterprise data cloud.” “The enterprise data cloud is incredibly important to regulated verticals like banking, telcos, life sciences and government,” Cloudera’s Hollison said. “And they don’t want, for example, to have a bespoke security and governance model for each individual analytic function.” The structure imposed on regulated organizations by, well, regulations benefited them last year, when they needed to grow their universe of data sources. But those without a common structure to help engineers prepare and manage data from two related but separate silos found themselves wholly unprepared for the task. For them, part of the obstacle was that, almost by default, an enclosed model with its own dedicated dataset comes with all the data preparation and engineering, security, governance and MLOps it needs.


Rise in Opportunistic Hacks and Info-Sharing Imperil Industrial Networks

Brubaker, who worked on the Mandiant incident response team for the Triton attack, says that worries him. "These actors are building expertise and willingness [to make] contact with other actors. What if they meet up with a ransomware group" and combine forces, he asks. "That would make ransomware more impactful on OT." Dragos' Sergio Caltagirone, vice president of threat intelligence at the ICS security firm, called the City of Oldsmar attack "the perfect example" of the type of ICS attack his firm frequently sees. It's not so much the feared, sophisticated custom-malware type of ICS attack by well-resourced nation-state hackers, but threat actors breaking in via unknown ports left wide open on the public Internet, or via weak or compromised credentials. "A network that is unprepared and indefensible, but by an organization doing their best but that's chronically under-resourced and under-funded to protect itself ... it's a confluence of [more adversaries]" going after ICS networks and a failure of those networks to follow the most basic security practices, Caltagirone says.


IoT helps make return-to-work safer

Innovatus Capital Partners, an independent adviser and portfolio-management firm, wants to ensure that as business leaders and employees return to the office their expectations for clean, safe environments are met. To that end, the firm has deployed a smart air-quality monitoring system at its offices in Illinois and Tennessee. The system combines technologies from Veea, an edge-computing company, and Wynd Technologies, a provider of portable air purifiers. “Workers re-entering the commercial office space after Covid-19 need assurance that they are in the cleanest environment possible,” says Bradley Seiden, managing director at Innovatus. “That means you have to be able to measure the environment—specifically the air quality in the environment.” The company deployed air-quality monitoring sensors throughout common areas where they collect air metrics such as mold and CO2 levels, temperature, humidity, etc. The sensors can also identify the presence of airborne particles with signatures that might indicate the presence of coronavirus and various flu strains. 


Four proactive steps to make identity governance a business priority

In many cases, employees have access privileges to company information that they don’t need. This has only proliferated during the COVID-19 pandemic. Consider all the hiring changes—millions being laid off, furloughed, adjusting to remote or hybrid work models, taking up side hustles or gig jobs, or getting new jobs as the economic dust settles. Ensuring access is revoked when employees go and that new hires only have access to what they need is an arduous task, and one that many businesses let fall to the wayside. Deprovisioning is the best way to address this problem, but revoking privileges can create IT downtime, disrupt workflow, and is another undertaking many aren’t signing up for voluntarily. But when you consider that all it takes is one disgruntled former employee or savvy hackers ready to take advantage of your loose access privileges, it’s time to get serious. Fortunately, automation can help streamline the deprovisioning process by matching privileges and access of users to the level of security those systems require. From there, the system can automatically restrict a user’s access to certain enterprise systems based on their role. 
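The matching-and-revoking step described above can be sketched roughly as follows; the role and permission names here are made up purely for illustration, not drawn from any real identity governance product:

```python
# Each role maps to the access it actually needs. Automated deprovisioning
# trims a user's access down to that set and reports what was revoked.
ROLE_ENTITLEMENTS = {
    "analyst":  {"crm:read", "reports:read"},
    "engineer": {"repo:write", "ci:run"},
}

def deprovision(user):
    """Return the user trimmed to role-appropriate access, plus what was revoked."""
    allowed = ROLE_ENTITLEMENTS.get(user["role"], set())
    if not user.get("active", True):
        allowed = set()  # departed employees lose everything
    revoked = user["access"] - allowed
    return {**user, "access": user["access"] & allowed}, revoked

# An over-provisioned employee: an analyst who still holds a write permission.
alice = {"name": "alice", "role": "analyst", "active": True,
         "access": {"crm:read", "reports:read", "repo:write"}}
alice_after, revoked = deprovision(alice)
```

A real deployment would run this kind of reconciliation continuously against the HR system of record, so that role changes and departures trigger revocation automatically rather than waiting on a manual audit.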


How Can Data Science and Business Intelligence Improve Strategic Planning in Organizations?

Data science consists of various methods and processes that support and guide the extraction of information and knowledge from raw data. Used properly, data science has vast applications in business. A business analyst will work with business administration and take part in exploratory data analysis (EDA), an approach to analyzing datasets, summarizing their main characteristics, and refining the data so that it can be put to productive use. With large amounts of data at their disposal, businesses can make better business, financial and marketing decisions. If a business has previous data on which products sold well at which times or locations, it can work to increase sales. Big Data helps retail outlets and fast-moving consumer goods sellers a lot. With proper data, various important decisions can be made that improve profits. Data-driven decision-making has many applications. In finance, for example, it might be figuring out the most cost-effective way to use cloud services or to hire new staff. Or it might be the cheapest way to promote a new product.
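A toy EDA pass along these lines, using only the standard library and invented monthly sales figures, might look like the following; the numbers and the spike threshold are illustrative assumptions, not from the article:

```python
import statistics

# Summarize a numeric column the way an analyst might before deciding
# which products or periods to drill into.
monthly_sales = [120, 135, 128, 310, 140, 133, 125, 450, 138, 131, 129, 470]

summary = {
    "count": len(monthly_sales),
    "mean": round(statistics.mean(monthly_sales), 1),
    "median": statistics.median(monthly_sales),
    "stdev": round(statistics.stdev(monthly_sales), 1),
}

# Flag months far above the median as candidates for a closer look,
# e.g. seasonal spikes worth planning inventory and marketing around.
spikes = [i + 1 for i, v in enumerate(monthly_sales)
          if v > 2 * summary["median"]]
```

With the invented data above, the summary exposes a large gap between mean and median, and the spike list points at the months (here April, August, and December) where something unusual, such as a holiday sale, is worth investigating.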


Let’s Talk Quantum – In Defense & Warfare

This article is NOT intended to showcase the dark side of quantum computing; rather, the intention is to highlight the possible applications of this groundbreaking technology. Defense scientists in many countries are taking a closer look at the impact that Quantum Computing, Quantum Communications and IoT will have on their national security and defense. It is believed that two areas, Quantum Encryption and Quantum Sensors, will have an enormous impact on this field in the coming years. The use of quantum computers in communications, which could revolutionize Underwater Warfare, is of paramount importance in the defense world. Quantum computation and quantum communication will also revolutionize Defense Logistics. Shorter cycle times, increased situational awareness and more efficient communication are just some of the advantages that quantum computation and communication will offer in the field of Defense Logistics. Technologies like Artificial Intelligence, Virtual Reality, Augmented Reality and Blockchain are already in use to enhance defense capabilities.


Combatting Insider Threats with Keyboard Security

The keyboard, a human interface device often overlooked when companies implement internal security measures, is also the place where almost all insider threats begin. Organizations need to prioritize the use of security-enhanced keyboards that can stop threats before they can even be entered into the network. Many well-known thin client manufacturers already support the use of secure mode and have integrated the necessary software for this. Recent keyboard improvements can also now provide higher security through two-factor authentication using a smart card. Keyboards can also now come equipped with a contactless card reader that can read RFID and NFC cards or tags. These new security-equipped keyboards make an array of safety applications possible; for example, ID systems can be used for closed user groups via the keyboard, and company IDs can be easily read in. These keyboards can then be partnered with innovative mouse technology that has integrated fingertip sensors for user authentication, to greatly improve security.


Cloudless IoT: IoT Without the Cloud

Privacy is not the only reason why you’d want to avoid the cloud. Others include stability, persistence, data privacy, security, and necessity. When it comes to stability, if the Internet connection is unstable, the cloud may be difficult to reach, making the whole system unstable. As for persistence, cloud services may go away, so avoiding the cloud allows the IoT system to run indefinitely without relying on a hosting company to persist. With data privacy, sometimes data should not leave the location where it is generated, and a closed network provides fewer network connections, meaning fewer attack vectors. Any cloud-based software may become unreachable if the cloud goes down, and it can happen to the best of us; any IoT solution that wholly depends on the cloud will go out with it. Even worse, the cloud may go away altogether. Maybe the company that runs the servers goes out of business, or maybe it just isn’t economically viable to keep it running. This has happened many times. Lastly, sometimes the reason for not wanting to use the cloud is very simple: Internet access just isn’t available.



Quote for the day:

"Leadership is liberating people to do what is required of them in the most effective and humane way possible." -- Max DePree

Daily Tech Digest - May 25, 2021

What are the ingredients of digital transformation success?

Finding the right tech talent is a pressing issue for executives, and a new study finds that the right talent is hard to come by regardless of how successful firms are with their enterprise modernization efforts. Successful firms recruit, invest in and retain knowledgeable staff (71%) and work with trusted partners (76%) to compensate for whatever skill and culture gaps exist within their organization, according to the report, "Secrets of Successful Digital Transformation," by Forrester and global software consultancy ThoughtWorks. ... Decision-makers at successful organizations reported that a true cross-functional transformation process includes stakeholders from all parts of the organization, such as IT, business, finance and more, having involvement in the modernization initiatives, according to the report. "An effective modernization culture and strategy must include strong leadership, including support and guidance from executives and, perhaps most importantly, a dedicated budget to execute transformations," the report said. It also requires a monetary commitment. In fact, 71% of successful organizations fund their enterprise modernization programs through a dedicated digital transformation budget.


Staying Safe Online: 6 Threats, 9 Tips, & 1 Infographic

Sharing pictures of major life events or everyday moments on social media may seem fairly innocent. However, you should probably be more careful: everyone has access to that information. Skilled cybercriminals have no trouble tracking down your relationships and other details about your life. They may use what they find to trick your friends into giving up sensitive information. It’s not hard to find out dates of birth, email addresses, interests, and details about family members, which makes it even easier for hackers to break into your account (see the first tip to avoid this!). ... Sometimes, website content may seem too appealing not to visit. You might even go ahead and create a profile, sharing your personal information. You should be careful, though, because not all websites are safe places. Who knows what malicious programs and scams are hidden there? Before doing anything, make sure you check the website address. URLs beginning with “https” are safer than ones with “http” because the “s” stands for “secure”. Another thing to look out for is a small lock sign near the URL. Nowadays, web browsers are able to recognize safe websites and mark them as secure with this sign.
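The "check the URL" advice can even be automated as a crude first-pass filter. The sketch below only inspects the URL scheme, so it complements, rather than replaces, the browser's own certificate checks and lock icon:

```python
from urllib.parse import urlparse

# A minimal pre-check: verify a link uses HTTPS and has a hostname
# before trusting it with any personal information.
def looks_secure(url):
    parsed = urlparse(url)
    return parsed.scheme == "https" and bool(parsed.hostname)

checks = [looks_secure(u) for u in (
    "https://example.com/login",   # https -> passes this basic check
    "http://example.com/login",    # plain http -> flagged
    "ftp://example.com/files",     # not a secure web scheme -> flagged
)]
```

Note that a phishing site can also serve HTTPS, which is exactly why the article's other tips, such as scrutinizing the address itself, still apply.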


Return to Office Risks Worth Considering

"Organizations should have adjusted their business continuity and disaster recovery plans to account for the shift to remote work at the onset of the pandemic," said John Beattie, principal consultant at business continuity solution provider Sungard Availability Services. "These plans need to be readjusted again to account for employees being back in the office and any changes made to the IT environments as a result." Failing to tighten cybersecurity protocols upon the return to the workplace could leave networks vulnerable to cyberattacks and breaches. Additionally, failing to update the business contingency and recovery plans and failing to provide employees notice of plan changes could lead to outages or the inability to promptly act on contingency plans when the time comes, Beattie said. ... Ger Doyle, head of Manpower IT brand Experis and head of digital and innovation at ManpowerGroup, warns that companies moving toward a new, hybrid way of working must be careful to avoid a two-speed workplace in which those in the office get access to opportunities that work-from-home employees miss.


Five ways to use data to make better business decisions

Even though data might be digitized, it still may not be relevant for decision-making. If the data isn't valuable, it should be considered for elimination. Deciding which data to keep is a balancing act. There is data that isn't important today but could become valuable at a later date. However, there is other data (IoT 'jitter', for example, or memos about a company holiday party 20 years ago) that most likely won't ever be relevant to decision-making and should be eliminated. Master data management frequently focuses on normalizing or consolidating disparate data fields from different systems that refer to the same piece of information. However, there is also the need to aggregate unlike types of data, such as aggregating a weather report with photos or videos of a storm system. Data aggregation is most successful when business use cases are clearly identified, along with all of the data and data combinations that are needed for decision making. With the growth of citizen development and separate IT budgets in business user departments, it's more difficult for IT to know where packets of underexploited data might reside, and how to bring these data troves into a central data repository so that everyone in the business can use them. 
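The normalization step described above can be sketched as a simple alias map; the field names and records here are hypothetical, invented to show the shape of the technique rather than any specific MDM tool:

```python
# Different systems call the same attribute different things; map each
# variant to one canonical name so records can be merged for decision-making.
FIELD_ALIASES = {
    "cust_id": "customer_id",
    "customerId": "customer_id",
    "zip": "postal_code",
    "zipcode": "postal_code",
}

def normalize(record):
    """Rename known aliases to canonical field names, leaving others as-is."""
    return {FIELD_ALIASES.get(k, k): v for k, v in record.items()}

crm_row = {"cust_id": "C-17", "zip": "60601"}            # from the CRM
erp_row = {"customerId": "C-17", "postal_code": "60601"} # from the ERP
merged = {**normalize(crm_row), **normalize(erp_row)}
```

Real master data management adds matching rules, conflict resolution, and stewardship on top, but the core idea is the same: agree on canonical names first, then aggregation across systems becomes mechanical.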


Dubai’s DMCC opens Crypto Centre to tap into blockchain's potential

The Dubai Multi Commodities Centre has set up a new space that will house companies developing crypto and blockchain technology. The Crypto Centre is the result of a partnership with Switzerland’s CV Labs, the organisation behind the Swiss government-backed Crypto Valley. It is part of the free zone’s own Crypto Valley – an ecosystem for cryptographic, blockchain and distributed ledger technology entities in the UAE. “This is a fantastic new development. Crypto and blockchain technology has enormous potential to transform global trade and supply chains ... and this aligns perfectly with the DMCC’s vision to drive the future of trade,” said Ahmad Hamza, free zone executive director at the DMCC. “Over the next few weeks and months, we will see this centre filled with [companies] ... looking to scale up their crypto businesses,” he said. He did not disclose the number of entities that DMCC expects to attract to the centre. The DMCC, which presides over companies involved in the trade of commodities that range from pulses to diamonds, registered 2,050 new companies last year, a five-year high for the free zone. 


Nikola Tesla: 5G Network Could Realize His Dream of Wireless Electricity

The experiments used new types of antenna to facilitate wireless charging. In the laboratory, the researchers were able to beam 5G power over a relatively short distance of just over 2 meters, but they expect that a future version of their device will be able to transmit 6μW (6 millionths of a watt) at a distance of 180 meters. To put that into context, common Internet of Things (IoT) devices consume around 5μW—but only when in their deepest sleep mode. Of course, IoT devices will require less and less power to run as clever algorithms and more efficient electronics are developed, but 6μW is still a very small amount of power. That means, for the time being at least, that 5G wireless power is unlikely to be practical for charging your mobile phone as you go about your day. But it could charge or power IoT devices, like sensors and alarms, which are expected to become widespread in the future. In factories, for instance, hundreds of IoT sensors are likely to be used to monitor conditions in warehouses, to predict failures in machinery, or to track the movement of parts along a production line. 


Understanding AI cloud

The most compelling advantages of AI cloud are the challenges it addresses. It democratises AI, making it more accessible. By lowering adoption costs and facilitating co-creation and innovation, it drives AI-powered transformation for enterprises. The cloud is veritably becoming a force multiplier for AI, making AI-driven insights available to everyone. Besides, though cloud computing technology is now far more prevalent than the use of AI itself, we can safely assume that AI will make cloud computing significantly more effective. AI-driven initiatives, providing strategic inputs for decision-making, are backed by the cloud’s flexibility, agility, and scale, which power such intelligence massively. The cloud dramatically increases the scope and sphere of influence of AI, beginning with the user enterprise itself and then in the larger marketplace. In fact, AI and the cloud will feed off each other, helping the true potential of AI to flower through the cloud. The pace of this will depend only on the AI expertise that enterprises can bring to bear in their workplace activities, for the cloud is already here and seeping everywhere.


Navigating the benefits and challenges of network and security transformation

Obtaining any necessary board-level buy-in for transformation projects is only half the battle. Any significant change project will require consideration of the organisational arrangement and availability of specialist skills. The Netskope/Censuswide research found that 50% of global CIOs believe that a lack of collaboration between specialist teams is stopping them from realising the benefits of digital transformation projects. For context, assuming that 50% of CIOs are responsible for 50% of the $6.8 trillion digital transformation spend IDC predicts, we are looking at a situation where a spend equivalent to the entire annual US tax income is in jeopardy because teams are failing to work together effectively. ... The researchers discovered that while just under half of security and networking teams report to the same boss, 37% of participants stated that ‘the security and networking teams don’t really work together much’. In fact, nearly half of the networking and security professionals described the relationship between the two teams as ‘combative’, ‘dysfunctional’, ‘frosty’ or ‘irrelevant’. They all agree that this imperfect relationship has the potential to derail huge plans.


Real estate tech takes on the housing boom in this seller's market

"Proptech is most important in cities where there is a large transient population, places that have a strong presence of universities, hospitals and a strong job market," said Blum. "This includes major cities like Philadelphia, Boston, San Francisco, NYC, Houston, Chicago, Miami. These cities have already begun to rebound quickly. I've spoken to agents from major cities across the country and they all say the same thing: anything that can help save them time with their business is greatly welcomed." The new platform Localize "harnesses the power of AI [artificial intelligence] to provide a cutting-edge experience for homebuyers and brokers," explained Omer Granot, Localize president and COO. "[W]e streamline the house-hunting journey through" property insights and "our concierge texting service, Hunter by Localize." Hunter curates properties specifically for each homebuyer through its "Smart Matching technology." It uses more than 100 data insights that are associated with a listing as well as a homebuyer's specific preferences to send daily recommendations to prospective buyers "to find them the perfect home."


Ethical Decisions in a Wicked World: the Role of Technologists, Entrepreneurs, and Organizations

"Wicked problem" is a term introduced by the theorists Rittel and Webber (1973) to describe problems that cannot be definitively described, with no "solutions" in the sense of definitive and objective answers. It is also understood as a super-category of "complexity", problems that overwhelm us in some sense. There is also a class of "super-wicked" problems: climate change, poverty, food security, energy supply, education policy and public health. They all have many interdependent factors making them seem impossible to solve. The software industry faces wicked problems in different ways: by developing complex software systems and by managing them as part of a larger social, economic, and environmental fabric. Wicked problems have always existed in our industry, but the internet and globalization undoubtedly created conditions for new forms of interaction, thus expanding the universe of related wicked problems. Examples of wicked problems closely associated with software are: social networks, sharing economy platforms, and air traffic control. In business, a new strategy (e.g. re-branding) or a modification in a product (e.g. introducing a new version of a video game) are classic examples.



Quote for the day:

"No great manager or leader ever fell from heaven; it's learned, not inherited." -- Tom Northup

Daily Tech Digest - May 24, 2021

How can banks mitigate the risks of consumers’ poor cyber hygiene practices?

To successfully implement adaptive authentication, banks and financial institutions must implement robust risk analytics – a sphere in which AI is playing an increasingly large role. This is no surprise, given that the threats to banks are becoming more sophisticated, with the emergence of attacks-as-a-service, automated attack tools, and close collaboration amongst bad actors enabling fraud at an unprecedented scale. An AI-powered decision engine and machine learning model can continuously analyse a broad range of data, events and context. Rather than looking only at login and transaction data, they examine a whole variety of indicators of compromise and learn from them. These include malicious headers, referrers from a phishing site, malicious cookies, a malicious device or IP, inhuman speed, keyboard overlay, a debugger running and many more. Based on the risk level of each user action, a smart risk analytics solution can generate a score and provide a recommended next step in real time – enabling banks to remain proactive, rather than reactive. So, with the complexity of attacks growing and fraudsters’ sophistication evolving on an almost daily basis, it’s clear that users cannot and should not be expected to keep up.
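As a rough illustration of how such a decision engine might map indicators of compromise to a score and a recommended next step, here is a minimal sketch. The indicator names, weights, and thresholds are all invented for illustration; a production system would learn its scoring from data rather than hard-code it.

```python
# Hypothetical sketch of a risk engine that combines indicators of
# compromise (IoCs) into a score and a recommended action.
# All names, weights, and thresholds below are illustrative only.

INDICATOR_WEIGHTS = {
    "malicious_header": 0.30,
    "phishing_referrer": 0.35,
    "malicious_cookie": 0.25,
    "known_bad_ip": 0.40,
    "inhuman_typing_speed": 0.20,
    "keyboard_overlay": 0.25,
    "debugger_running": 0.15,
}

def risk_score(indicators):
    """Combine observed indicators into a score in [0, 1]."""
    score = 0.0
    for name in indicators:
        # Each signal consumes a share of the remaining "trust", so
        # independent indicators compound instead of simply adding.
        score = score + (1 - score) * INDICATOR_WEIGHTS.get(name, 0.0)
    return score

def next_step(score):
    """Map a risk score to a recommended next step in real time."""
    if score < 0.25:
        return "allow"
    if score < 0.60:
        return "step-up-authentication"
    return "block"
```

For example, a request with both a phishing referrer and a known bad IP would score about 0.61 and be blocked, while a clean request would be allowed through.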


The Future State of the Cloud

If findings from a decade’s worth of research are predictive, these technologies and tools are the ones that may experience increased adoption: Configuration management tools: Growth of configuration management tools is on the rise, though this varies among tools; many organizations now use more than one. Back in the early days of this research, Chef and Puppet ruled the roost, each peaking as high as the 50% adoption mark among enterprises in the 2019 report. It was at this point that we started to see increased experimentation with Ansible and Terraform, each now adopted by more than one-third of all enterprise respondents. This coincides with Puppet and Chef experiencing significant decreases in adoption, with even fewer organizations planning on using and/or experimenting with these tools; Platform-as-a-Service (PaaS): Recently, there has been continued experimentation with and increased adoption of public cloud PaaS services. These include data analytics, artificial intelligence and machine learning (AI/ML), and the internet of things (IoT). ... Increasingly, today’s industry relies on services such as these, which are becoming standard parts of operations.


The RPA world desperately needs standards

The absence of RPA design standards capable of detailing process automations in a universally understood manner is also a major contributor to stalled automation pipelines. Look, for example, at process discovery tools, a key component of any automation toolchain. Without RPA standards that would assure compatibility and interoperability, process discovery tools detail discovered processes in different ways. This leaves RPA users with little choice but to transcribe processes manually before they can ever start to be developed and deployed in target automation platforms. As a result, automations stall, more money has to be spent, and more time is wasted. Growing awareness of these standardization issues, coupled with the inability of RPA to scale or deliver on anticipated ROI, is causing many companies to rethink additional automation investments. ... To better understand the kind of incredible impact industry standards can have, look at the example provided by the Portable Document Format, or PDF. After the PDF was released as an open standard by Adobe, the ability not only to save a PDF in any word processor, but also to open it in another tool suddenly unlocked a level of portability that previously had been impossible to attain.


5 Strategies to Infuse D&I into Your Organization

The CEO needs to take a public stance, embed D&I in the organization’s purpose, exemplify the culture, and take responsibility for progress toward goals. They need to be out front, even if a CDO is part of the team. PwC’s U.S. chairman, Tim Ryan, has been an exemplar for at least five years. He co-founded CEO Action for Diversity and Inclusion after police shootings in the summer of 2016 to spur business executives to collective action on D&I. The publication of PwC’s workforce diversity data in 2020 revealed that women and people of color are underrepresented, especially at senior levels, showing that even the most dedicated companies still have a lot of D&I work to do. Nielsen’s CEO, David Kenny, added the CDO title to his leadership portfolio in 2018 so he could “set hard targets for ourselves and make those transparent to our board and measure them like we measure other outcomes like financial results.” He relinquished that title to a new CDO in March 2020, noting the D&I progress his team had already made. If you’re a board member, you have an essential role to play in D&I governance.


Explainable AI (XAI) with SHAP - regression problem

Model explainability has become a basic part of the machine learning pipeline. Keeping a machine learning model as a “black box” is not an option anymore. Luckily there are tools that are evolving rapidly and becoming more popular. This is a practical guide to XAI analysis with the SHAP open-source Python package for a regression problem. SHAP (Shapley Additive Explanations) by Lundberg and Lee is a method to explain individual predictions, based on the game-theoretically optimal Shapley values. Shapley values are a widely used approach from cooperative game theory that comes with desirable properties. The feature values of a data instance act as players in a coalition. The Shapley value is the average marginal contribution of a feature value across all possible coalitions. In this guide we will use the Boston house prices dataset example from sklearn datasets. It is a simple regression problem. ... The SHAP framework has proved to be an important advancement in the field of machine learning model interpretation. SHAP combines several existing methods to create an intuitive, theoretically sound approach to explain predictions for any model.
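The full analysis relies on the shap package, but the core idea can be shown from scratch on synthetic data. For a linear model with independently treated features, the exact Shapley value of feature i for an instance x is w_i * (x_i - E[x_i]), and the per-feature values satisfy SHAP's "efficiency" property: they sum to the difference between the model's prediction and its average output. A minimal plain-NumPy sketch:

```python
import numpy as np

# Hand-rolled illustration of the quantity SHAP computes. For a linear
# model f(x) = w @ x + b with features treated independently, the exact
# Shapley value of feature i for one instance is w_i * (x_i - E[x_i]).

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # toy regression data
w = np.array([2.0, -1.0, 0.5])           # model coefficients
b = 4.0

def predict(X):
    return X @ w + b

x = X[0]                                 # instance to explain
base_value = predict(X).mean()           # average model output E[f(X)]
shap_values = w * (x - X.mean(axis=0))   # per-feature contributions

# Efficiency property: contributions sum to f(x) - E[f(X)].
assert np.isclose(shap_values.sum(), predict(x) - base_value)
```

For non-linear models the shap package estimates these same values by averaging marginal contributions over feature coalitions, which is exactly the cooperative-game view described above.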


Super-Secure Processor Thwarts Hackers by Turning a Computer Into a Puzzle

To stop attacks, Morpheus randomizes these implementation details to turn the system into a puzzle that hackers must solve before conducting security exploits. From one Morpheus machine to another, details like the commands the processor executes or the format of program data change in random ways. Because this happens at the microarchitecture level, software running on the processor is unaffected. A skilled hacker could reverse-engineer a Morpheus machine in as little as a few hours, if given the chance. To counter this, Morpheus also changes the microarchitecture every few hundred milliseconds. Thus, not only do attackers have to reverse-engineer the microarchitecture, but they have to do it very fast. With Morpheus, a hacker is confronted with a computer that has never been seen before and will never be seen again. To conduct a security exploit, hackers use vulnerabilities in software to get inside a device. Once inside, they graft their malware onto the device. Malware is designed to infect the host device to steal sensitive data or spy on users. The typical approach to computer security is to fix individual software vulnerabilities to keep hackers out.


Cybersecurity is Now Essential to Corporate Strategy. Here's How to Bring the Two Together.

Compliance is not security. This is an essential difference to understand. Compliance is about checking the same processes to meet some pre-established requirements and procedures. Security is about continually monitoring for new and unexpected vulnerabilities. The best way to think of this important difference is as though there is an (ideally) impenetrable net covering every component of your business. Compliance checks the state of that net at a moment in time and from an established list of criteria, but it isn’t checking for a continually growing set of new threats that are not yet on the list. Security requires ongoing vigilance for unexpected vulnerabilities. It’s very much a real time and continuous effort. When it comes to cybersecurity planning, the lesson for businesses is that following established processes is not enough. It’s about anticipating what could happen or what could possibly go wrong. Security is like an ongoing and engaged state of being — it needs active and ongoing vigilance and maintenance to remain operational and be ready to pivot when the unexpected happens.


Can Your Enterprise Benefit from No-Code AI?

There are many ways no-code AI can be used in businesses, including small businesses looking to find ways to embrace the power of automation. Here are just a few examples of how no-code AI is impacting different industries. Several financial services firms have started incorporating no-code AI into their workflows to improve security and provide an enhanced customer experience. By using no-code AI, the entire customer experience can be streamlined. Let’s take an example of a loan application. Using no-code AI, financial services teams can build an ML model to quickly scan loan applications and determine which ones meet the required criteria. The underwriting team now has more time to focus on approved applicants instead of spending all their time sifting through applications. As different teams need new ML models to improve their processes, they can use a no-code AI platform to create them. This makes their operations more efficient because they no longer need to wait for their IT team or data scientists to develop a new model every time a need arises.


Blockchain, when and why to use it in business processes

The features of intrinsic disintermediation and crystallized traceability of the transferred asset are among the most innovative aspects of blockchain technology, which has and will have increasing impacts on the evolution of social and organizational models, as well as positive impacts in terms of technological process innovation. Service providers can interface with the blockchain to offer advanced functionality to users, for example API integration services. ... Blockchain makes it possible to track when and by whom a given change was made, which is why blockchain technology is spreading in all scenarios where it is required to ensure traceability and authenticity for a product or service, such as the agri-food supply chain. In addition, another widespread application is that of notarization, or crystallization of data on the blockchain, which ensures the association of a certain date with the data. Another application on which various projects and concrete initiatives have focused is that of smart contracts, i.e. the automatic activation, based on distributed ledger software technologies, of contracts between private individuals upon the occurrence of certain events or conditions predefined by two or more parties.
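A minimal sketch of the notarization idea: rather than storing a document on-chain, only its hash and a timestamp are recorded, so anyone holding the document can later prove it existed, unchanged, at that date. Here a plain Python list stands in for the distributed ledger; all names are illustrative.

```python
import hashlib
import time

ledger = []  # stand-in for a real blockchain transaction log

def notarize(document: bytes) -> dict:
    """Record the document's hash and a timestamp on the 'ledger'."""
    record = {
        "sha256": hashlib.sha256(document).hexdigest(),
        "timestamp": int(time.time()),
    }
    ledger.append(record)  # on a real chain: submit a transaction
    return record

def verify(document: bytes, record: dict) -> bool:
    """Re-hash the document and compare against the notarized record."""
    return hashlib.sha256(document).hexdigest() == record["sha256"]

receipt = notarize(b"supply-chain certificate v1")
assert verify(b"supply-chain certificate v1", receipt)
assert not verify(b"tampered certificate", receipt)
```

The same pattern underlies agri-food traceability: each handoff in the supply chain notarizes a record, and any later tampering breaks hash verification.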


AI is no villain: six steps to build your AI strategy

During AI transformation projects, companies often make the mistake of separating the vision from the execution, resulting in disjointed and complicated AI programs that can take years to consolidate. This can be easily avoided by choosing AI solutions based on concrete business objectives that have been established at the project’s outset. It’s important to align your corporate strategy with measurable goals and objectives to guide your AI deployment. Once complete, the strategy can easily be cascaded down into divisional- or even product-level strategies. ... Identify the real problem; don’t assume it is AI. This might seem like common sense, but the problems you’re looking to overcome have a large impact on your success. Some problems are not AI problems at all, and for the ones that are, the business should advocate the delivery through small lighthouse projects that act as a beacon for their capabilities. In identifying ‘lighthouse’ projects, your business will need to assess the overall goal and importance of the project, its size, likely duration and data quality. Lighthouse projects can typically be delivered in under eight weeks, instead of eight months, and will provide an immediate and tangible benefit for the business and your customers.



Quote for the day:

"Time is neutral and does not change things. With courage and initiative, leaders change things." -- Jesse Jackson

Daily Tech Digest - May 23, 2021

Qualcomm reveals tiny Linux-driven 5G NR chipset for IoT

The 315 5G chipset offers up to a 1.54Gbps data rate for 5G (3GPP Rel 15), while the 4G mode goes to 400Mbps. Other features include antenna tuning support and a dual-frequency GNSS location capability. The combination of 7nm technology, Cortex-A7, and an efficient RF front-end design enables up to 50 percent smaller modules than existing models, claims Qualcomm. Vanghi also touted the chipset for its low power consumption and extended life maintenance through 2028 to 2030. The Qualcomm 315 5G IoT Modem-RF “can be easily fitted onto industrial machines,” said Vanghi. “You can bolt it directly onto the chassis using existing holes.” The small size will also make it easy for wireless module manufacturers to upgrade existing 4G modules, said Vanghi, mentioning support for 35 x 40mm module footprints. “The 315 is a pin-to-pin compatible solution for LTE legacy modules,” he added. Vanghi noted that the chipset has all the security features of Qualcomm’s premium 5G chipsets for smartphones, which would include the Snapdragon X55 5G Modem-RF System. Security features include hardware-based cryptography, TrustZone, Qualcomm TEE, secure boot, secure storage and key provisioning, and debug security.


Keeping Technology Change Human

When users can accurately predict their efficiency with a tool, even when that tool itself is inefficient, they can strongly resist a change. I worked on an inflight commerce system, and our solution required a series of reconciliation steps to be taken at the end of a flight. The crew instinctively know how long this process takes through repetition, and they set aside that time - at what is generally a very stressful point in the flight. Coming in to land is when everyone suddenly wants to be out of their seat! Changes to the software (and hence the process) around reconciliation were always difficult to achieve buy-in for, because the nervousness around trying something new at such a critical point in the crew's operational life was always a tough sell. Our software was one of the multiple tasks taking place at that time, and a change to one can lead to underperformance in any of the others. Nobody wants distracted staff on a plane. Changes to software or processes mean a risk to their ability to deliver predictably to the business, and that could have catastrophic consequences for a user's role.


The future of the IoT (batteries not required)

When the two technical co-founders looked to expand their startup, they tapped a collection of their newly minted PhD students who had the expertise of developing wireless system-on-chip technologies in the lab. Today, Everactive has expanded into a team of nearly 90 industry veterans and technical experts, including talented minds like Alice Wang, who joined up with Calhoun and Wentzloff in 2018 after successful stints with industry giants Texas Instruments and MediaTek. Another MIT alum, she now serves as VP of hardware for Everactive, directing both silicon and hardware systems design. “We’re exceptionally proud of the team that we’ve developed,” says Wentzloff. “I think a large part of why we continue to succeed is that we’ve done a great job of surrounding our core technology students with a broad set of talented industry leaders.” Thanks to their advances in ultra-low-power circuits and wireless communication, Everactive sells full-stack industrial IoT solutions powered by their always-on Eversensors, harvesting energy exclusively from the surrounding environment. The sensors can be deployed at a larger scale than battery-powered devices, and they cost less to operate.


Can Nanotech Secure IoT Devices From the Inside-Out?

Sowder said that many times, “the challenge with these IoT devices is the limited compute capability that they have on them. An IP camera can’t run a full IPS protection suite against traffic to it. It has a job to record video and send it upstream.” He pointed to the potential solution of nanotechnology: Specifically, the concept of a nanoagent on each IoT node that inspects firmware code to determine if it’s engaged in malicious behavior, such as memory corruption. If so, the nanoagent can block it in real-time. The challenge is how to do it with a small footprint, Sowder said: “A lot of devices don’t have a lot of compute. Sticking a firewall in front of every IP camera simply isn’t feasible. The solution is a very, very slight agent. It phones home to get a device signature, including what kind of device it is and what can run on it.” Nanoagents don’t put a lot of overhead on these devices, so the devices’ performance isn’t slowed down, Sowder noted: “There’s no overhead to prevent them from performing their functions.” Check Point has been working on a lightweight agent that relies on a cloud instance to pull down specific protection details related to that device.


How Mirroring the Architecture of the Human Brain Is Speeding Up AI Learning

Several decades of neuroscience research suggest that the brain’s ability to learn so quickly depends on its ability to use prior knowledge to understand new concepts based on little data. When it comes to visual understanding, this can rely on similarities of shape, structure, or color, but the brain can also leverage abstract visual concepts thought to be encoded in a brain region called the anterior temporal lobe (ATL). “It is like saying that a platypus looks a bit like a duck, a beaver, and a sea otter,” said paper co-author Joshua Rule, from the University of California Berkeley. The researchers decided to try and recreate this capability by using similar high-level concepts learned by an AI to help it quickly learn previously unseen categories of images. Deep learning algorithms work by getting layers of artificial neurons to learn increasingly complex features of an image or other data type, which are then used to categorize new data. For instance, early layers will look for simple features like edges, while later ones might look for more complex ones like noses, faces, or even more high-level characteristics.


Developer burnout and a global chip shortage: The IoT is facing a perfect storm

Part of the problem stems from unprecedented demand for IoT devices. There are already more connected things than people in the world, and the trend isn't showing any sign of slowing down. In fact, it's quite the contrary: tech analyst company IDC recently estimated that there will be a total 41.6 billion connected devices by 2025. Consumers are particularly interested in using smart products in their homes – think connected plugs, lightbulbs, thermostats and even fridges. Forrester forecast that by 2025, the average US household will have 20 internet-connected devices. In this context, it won't be enough for manufacturers to produce more of the same old things. Buyers' expectations are growing: they want easy-to-use devices with new, exclusive features, which will be continuously improved; and crucially, consumers expect that their connected products work together across different platforms and operating systems. More than eight in ten respondents to Forrester's survey said that they need to rapidly manufacture new smart products and services to maintain or grow their market position – meaning, in most cases, that a new cycle of research and development is necessary.


Developing an API architecture

There is almost always a 1:1 relationship between the API layer and the application layer. An API endpoint will only call one usecase, and a usecase will most likely only be used by one API endpoint. Why not just combine them into a single function? Loose coupling. For example, although I am using express for my server, I may want certain use cases to be accessed via a CLI instead of, or as well as, the web API. The application layer does not care if a request comes via the web API, or the CLI, or some other method. It just cares about the arguments it receives. The application, core, and infrastructure layers are hard to talk about in isolation (which is ironic) so the next few sections will be a bit intertwined... How does the application layer actually "do stuff" though? If we want to get the basket, for example, how does it do this? We wouldn't want the application layer to import the database and query it directly; this would couple our low-level implementation too tightly to the high-level use case. The core layer holds interfaces for all of the things the application can do. When I say interfaces, I mean TypeScript interfaces; there is no actual JavaScript here, purely types and interfaces.
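The article sketches these layers in TypeScript; the same shape can be illustrated in Python, with typing.Protocol standing in for the core layer's TypeScript interfaces. All names below are hypothetical, chosen only to mirror the basket example.

```python
from typing import Protocol

# --- core layer: pure interfaces, no implementation ---
class BasketRepository(Protocol):
    def get_basket(self, user_id: str) -> list[str]: ...

# --- application layer: use cases depend only on core interfaces ---
def get_basket_usecase(repo: BasketRepository, user_id: str) -> list[str]:
    # Doesn't care whether the request arrived via the web API or a CLI,
    # nor which database the repository talks to.
    return repo.get_basket(user_id)

# --- infrastructure layer: a concrete implementation of the interface ---
class InMemoryBasketRepository:
    def __init__(self) -> None:
        self._data = {"alice": ["milk", "bread"]}

    def get_basket(self, user_id: str) -> list[str]:
        return self._data.get(user_id, [])

# --- api layer (or CLI): a thin adapter that calls exactly one usecase ---
repo = InMemoryBasketRepository()
assert get_basket_usecase(repo, "alice") == ["milk", "bread"]
```

Swapping the in-memory repository for a real database client changes nothing in the application or core layers, which is the loose coupling the 1:1 endpoint-to-usecase split buys.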


4 Of The Fastest Growing Cyber Security Skills In-Demand By Business In 2021

Application development security is analyzing vulnerabilities in the app, then developing and adding security features to protect it from hackers. As the field of modern software development picks up speed, more threat actors exploit the rapid production of applications as a chance to attack vulnerabilities in your code. Fortunately, there are application development security experts to protect your data and digital assets from hackers. Application security is no longer an afterthought. To build a secure application, one must integrate security measures into all parts of the software development life cycle. The Burning Glass report makes this evident, with demand for application development security skills forecast to increase 164%, topping the list of cybersecurity skills. ... Cloud security refers to all the measures, policies, and rules implemented to protect data in the cloud from hackers. With businesses making the shift to the cloud, robust cloud security is necessary. Security threats are continually evolving and becoming more complex, which means cloud computing is at no less risk than the on-premises environment.


What’s next: Machine learning at scale through unified modeling

Model unification can be useful for many types of machine learning problems. Our experience with predictive models, which are widely used by organizations across industries, has shown three important conditions that should be met for taking a unified modeling approach: A prediction is needed for the same target variable across a large number of related entities, or partitions; Each partition uses the same set of features; The models need to be refreshed on a frequent basis. ... With unified models, teams lose some flexibility for addressing problems since it is not possible to pick and choose individual partitions to roll back (or roll forward). A team can address this issue by retraining the unified model outside of the regular refresh cycle. Alternatively, if necessary, the model can be reverted for all partitions at once, across the board. For example, if you’ve created a unified model to predict demand for the full range or a set of your company’s products, you may find, after deploying the model, issues with the results for one product. You will then need to either roll back or retrain the full model.
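As a hedged sketch of the unified approach (synthetic data, plain NumPy; not the authors' actual pipeline): instead of fitting one model per partition, a single model is fitted across all partitions by adding a one-hot encoding of the partition to the shared feature set, so every partition is refreshed, deployed, or rolled back together.

```python
import numpy as np

# Synthetic setup: the same target variable across 3 related partitions
# (e.g. products), each using the same feature set -- the conditions the
# article lists for unified modeling.
rng = np.random.default_rng(1)
n, n_partitions = 300, 3
part = rng.integers(0, n_partitions, size=n)  # partition id per row
x = rng.normal(size=n)                        # shared feature
y = 2.0 * x + 1.5 * part + rng.normal(scale=0.1, size=n)

# One-hot encode the partition so a single (unified) linear model learns
# a per-partition offset alongside the shared coefficient for x.
onehot = np.eye(n_partitions)[part]
design = np.column_stack([x, onehot])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

# coef[0] recovers the shared slope; coef[1:] the per-partition offsets.
assert abs(coef[0] - 2.0) < 0.1
```

The trade-off the article describes falls out of this structure: because all partitions share one set of coefficients, there is no way to roll back the model for a single partition without retraining or reverting the whole thing.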


Digital Transformation: The value of intelligent operations versus business process outsourcing

Companies that have adopted intelligent operations within their processes are viewed as moving up their operational maturity level from “stable” to “efficient”. Once operational efficiency is achieved, the next step is to include data-driven insights into the decision-making process, putting the companies at the “predictive” maturity level. Companies that go beyond this stage are called “future-ready”. In these companies, artificial intelligence, blockchain, cloud and various forms of intelligent operations are used to drive and grow the company. According to the report, only 7% of organisations globally fall into the “future-ready” category, and these are mostly in the insurance and high-tech sectors. On average, “future-ready” organisations showed a 2.8 times boost in corporate profitability and 1.7 times increase in operational efficiency compared with companies in the lower maturity levels. Accenture Operations associate director Pankaj Jain says when new technologies are introduced, the way a company runs its operations changes dramatically.



Quote for the day:

"What lies behind us & what lies before us are tiny matters compared to what lies within us." -- Ralph Waldo Emerson

Daily Tech Digest - May 22, 2021

Universities Are Failing Software Developers

First, universities need to re-examine their curricula -- and do so often, because technology, trends, and best practices move lightning-fast in our industry. You would think that the ever-evolving nature of software development is common knowledge, yet year after year, I meet with candidates who only know Python, Java, or C++. These coding languages are often taught because of existing school material, exercises, tests, and labs, but they aren’t as widespread in professional settings because, frankly, there are better languages with larger communities targeting a larger set of applications or devices. At my company, for instance, we prefer to primarily work with Typescript/Javascript, C#, and PHP, all of which come with great frameworks and libraries. In theory, software development or computer science is a very practical university major, with many obvious applications available immediately after graduation. But if universities want this to be true in practice, they need to do a much better job of teaching real, marketable skills that employers actually value. In addition to updating the hard skills being taught to students, university leaders need to emphasize the importance of softer skills like critical-thinking, problem-solving, communication, and project management. 


Cybersecurity And The Vaccine Passport: A Dream Ticket Or A Flight Of Fancy?

There is currently no clearly defined standard for the vaccine passport. The Biden administration has announced that there would be no central, national policy, leaving the private sector to create its own solutions. Many projects — including those of IBM, the International Air Transport Association and several individual airlines — are already underway. Depending on where you intend to travel, this could mean handing over personal information and login credentials to multiple airlines and industry bodies. The more places that store this information, the more vulnerable it is to breach and loss. This lack of any agreed standard also opens the system up to fraud and manipulation. Cybercriminals are already working the impact of the pandemic to their advantage, and a patchwork of vaccine passport systems presents another golden opportunity. Because of the emphasis on equity for rollout, vaccine passports will have to be both paper and digital. For those who do not have digital devices, paper passports will show up on the dark web — the same way that we see fake vaccine passports already showing up for a few hundred bucks.


Cybersecurity, emerging technology and systemic risk: What it means for the medical device industry?

A personal goal of mine is that within 5 years, I can talk to any medical device developer about cybersecurity and find that they have comprehensive knowledge of all aspects of creating a secure device. To achieve that, I partnered with Axel Wirth to write and publish the world’s first comprehensive, how-to book on medical device cybersecurity. Also, Velentium has launched a training certification process to train engineers, developers, and managers at medical device manufacturers (MDMs) and other embedded and IoT device designers, so they’ll have qualified, knowledgeable cybersecurity expertise on-staff. According to a recent (ISC)² report, the global cybersecurity talent gap remains at more than 3 million. Cybersecurity employment must grow by 41 percent in the U.S. and by 89 percent worldwide to fill the existing gap. Clearly there is a huge shortfall of talent in the IT arena, but the situation is far worse for the embedded device arena. Skilled people simply are not available.


Hacker's guide to deep-learning side-channel attacks: the theory

A side-channel attack is an implementation-specific attack: it exploits the fact that different inputs cause the implementation to behave differently, and uses an indirect measurement, such as how long the algorithm takes to execute, to gain knowledge about a secret value such as a cryptographic key used during the computation. One of the most infamous timing attacks exploits the fact that the time taken by the naive square-and-multiply algorithm, used in textbook implementations of RSA modular exponentiation, depends linearly on the number of "1" bits in the key. An attacker can exploit this linear relation by timing the computation for diverse RSA keys, and can then estimate the number of "1" bits in an unknown RSA key stored in a hardware crypto device simply by measuring how long the code takes to run. While nowadays most hardware crypto implementations are constant-time, timing attacks are still actively used, mostly in blind SQL injection.
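To make the leakage concrete, here is a minimal sketch of the naive left-to-right square-and-multiply algorithm. Instead of wall-clock timing, it counts the data-dependent multiplications, which is exactly what the running time reveals: one extra multiply per "1" bit of the exponent. (The function name and the counting instrumentation are illustrative, not from the article.)

```python
def square_and_multiply(base, exp, mod):
    """Naive left-to-right square-and-multiply modular exponentiation.

    Returns (result, mults), where mults counts the data-dependent
    multiplications: one per '1' bit of the exponent. A timing attack
    recovers roughly this count from execution time alone.
    """
    result, mults = 1, 0
    for bit in bin(exp)[2:]:                  # walk the exponent MSB-first
        result = (result * result) % mod      # square on every bit
        if bit == "1":
            result = (result * base) % mod    # extra multiply only on '1' bits
            mults += 1
    return result, mults

# A denser exponent (more '1' bits) costs more multiplies, hence more time.
_, dense = square_and_multiply(7, 0b11111111, 1009)   # eight '1' bits
_, sparse = square_and_multiply(7, 0b10000000, 1009)  # one '1' bit
```

Constant-time implementations defeat this by making the work independent of the key bits, for example by always performing the multiply and discarding the result on "0" bits, or by using a Montgomery ladder.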


JavaScript API to Recognize Humans vs Bots in Chrome

Privacy Pass, an open-source web extension, was a step in the right direction, keeping privacy at its core. It helps users bypass repeated CAPTCHA challenges by using a set of tokens/passes. Let’s look at how it works. Users download the Privacy Pass extension for the Chrome/Firefox web browser, where it appears as the Privacy Pass icon. Visiting a CAPTCHA-protected website and answering the CAPTCHA challenge grants 30 tokens/passes, which are stored in the extension for future use. The concept is simple: when the user visits another page, the Privacy Pass extension spends one of these tokens/passes instead of prompting for another CAPTCHA. And the great thing here is that each of these tokens/passes goes through a cryptographic process known as “blinding” that shields users’ privacy. ... Google has recently started developing a Trust Token API. It was developed as a substitute for third-party cookies to fight fraud in online advertising by differentiating bots from humans. More importantly, Google Trust Tokens will distinguish bots from real humans and make third-party cookies in Google Chrome obsolete.
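The "blinding" idea can be illustrated with a toy RSA blind signature. Note the hedge: Privacy Pass itself is built on a verifiable oblivious PRF, not RSA, and all parameters below are made-up toy values; the sketch only shows the core property that the signer issues a valid token without ever seeing it, so it cannot later link the token back to the issuance.

```python
# Toy RSA parameters -- far too small for real use, chosen for readability.
p, q = 61, 53
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def blind(token, r):
    """Client: hide the token with a blinding factor r coprime to n."""
    return (token * pow(r, e, n)) % n

def sign(blinded):
    """Server: sign the blinded value without learning the token."""
    return pow(blinded, d, n)

def unblind(blind_sig, r):
    """Client: strip the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

token, r = 42, 7                   # r is a client-chosen secret, coprime to n
sig = unblind(sign(blind(token, r)), r)
```

Because the server only ever saw `blind(token, r)`, which looks random, it cannot connect `sig` to that signing session, yet anyone can verify `sig` with the public key.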


Three Ways Machine Learning Can Change Incident Management

Without fully addressing the underlying issue, companies virtually guarantee that the same problem, or a similar one, will recur. Not identifying the root cause often prevents a durable fix. In addition, companies lose the opportunity to proactively improve application code or infrastructure based on real-world experiences and issues. Postmortems may only result in reviews of monitoring and observability solutions and the inevitable updates to alert rules. Most DevOps professionals not only understand but have lived through these frustrations on an ongoing basis. Management, then, often wonders why their systems are so unstable. Changing the model for incident management has been limited by overriding urgency combined with short-staffed, overworked teams. Although AI and machine learning have been positioned as the panacea for nearly every kind of technical ill, this is a clear case where “machines” could fundamentally enhance human efforts to improve a situation. The best troubleshooters exhibit a combination of instinct, experience and patience to carefully sift through reams of data, spotting unusual events and their correlation with bad outcomes.
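As a toy illustration of what "machines" can contribute here, the sketch below flags points in a metric series that deviate sharply from the baseline using a simple z-score. This is an assumption-laden simplification: production AIOps tools use far richer models, but the principle of surfacing statistical outliers for humans to correlate with incidents is the same.

```python
from statistics import mean, stdev

def flag_anomalies(series, threshold=3.0):
    """Return indices of points more than `threshold` standard
    deviations from the mean of the series.

    Needs at least two points; a flat series yields no anomalies.
    """
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

# Twenty quiet minutes of error counts, then a spike.
error_counts = [10] * 20 + [100]
spikes = flag_anomalies(error_counts)
```

Feeding flagged indices into an incident timeline is where the human troubleshooter's instinct and experience take over.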


How to get your multi-cloud data architecture strategy on track for long-term success

Technology innovation is constantly moving forward, so don’t think single cloud, even though you might feel that you’re saving time. Internal application teams, and the databases and tools they leverage for data-rich applications, need to support multiple clouds. Take a long-term view toward resiliency, since you might need to leverage multiple clouds for scale, or in times of duress, for critical applications. Your strategy needs to work across multiple clouds: while you should pick the cloud that suits your immediate needs, keep flexibility in mind so that you’re able to pick another cloud further down the line. The cloud now has very clear standards as defined by the Cloud Native Computing Foundation (CNCF). You should demand the same of your database. Most proprietary innovations are now becoming open source and standards across multiple cloud vendors. A perfect example of this is Kubernetes, which Google open-sourced in 2014. Stick to the standards, reduce custom development, and set yourself up for multi-cloud success. We have a lot to thank the cloud platform vendors, including Amazon, Google, Microsoft, and others, for.


The Dark Side Of Tech: Four Potential Problems To Keep An Eye On In 2021

One of the biggest mistakes companies make when trying to automate their processes is thinking that technology, in and of itself, is the answer. As a result, they go out and buy subscriptions to software tools they never end up using, because they fail to properly integrate the tools into their existing organization and processes. The upshot is that between 32% and 41% of what a company spends on software internally gets wasted, because the tools were never properly integrated into the human processes already in place. As AI continues to promise endless automation potential, this problem of “buying but failing to integrate” will likely accelerate. Companies will buy tools they think will solve all their problems without taking the time to think deeply about how to properly integrate those tools into their existing systems and infrastructure. A decade ago, it was unfathomable for most companies to host their data in the public cloud. Today, not only is that conventional wisdom, but it’s becoming increasingly popular and expected. For example, Netflix has been a prime example of how companies can get through the long and arduous process of migrating to the cloud, a journey that started in 2008 when Netflix “experienced a major database corruption.”


Future of AI

Today, AI is hugely integrated in all aspects of our day-to-day lives. From chatbots, online shopping, smartphones, social networking to ride sharing - AI is being applied in everyday apps that we use. The huge amounts of data that all these apps are gathering about our likes and dislikes, our searches, our purchases, our movements and almost every aspect of our lives, is contributing further to advancement in AI. All this data is being used to train and fine-tune these AI and ML algorithms to learn and predict what we want with even more accuracy. ... AI has already been applied in healthcare with the use of chatbots to provide real-time assistance to patients and to predict ICU transfers or patient risks. It has huge potential to transform how we administer healthcare in the future. AI algorithms will enable healthcare providers to analyze data and tailor healthcare to each patient. ML algorithms will keep learning as they interact with training data, to provide precise and accurate clinical decisions with respect to patient diagnostics, treatment and care, and predict patient outcomes. In the field of transportation, one area where AI will continue to make improvements is self-driving vehicles. Google and Tesla have already launched autonomous cars.


Cloud Security Blind Spots: Where They Are and How to Protect Them

While most security practitioners know accidental data exposure is a common cloud security issue, many don't know when it's happening to them. This was the crux of a talk by Jose Hernandez, principal security researcher, and Rod Soto, principal security research engineer, both with Splunk, who explored the ways corporate secrets are exposed on public repositories. In today's environments, credentials are everywhere: SSH key pairs, Slack tokens, IAM secrets, SAML tokens, API keys for AWS, GCP, and Azure, and many others. A common risk scenario is when credentials aren't properly protected and are left exposed, most often in a public repository; Bitbucket, GitLab, GitHub, Amazon S3, and open databases are the main public sources. "If you are an attacker and you're trying to find somebody that, either by omission or neglect, embedded credentials that could be reused, these would be your sources of leaked credentials," Soto said, noting these can help attackers pivot between endpoints and the cloud. Splunk researchers found there are 276,165 companies with leaked secrets on GitHub. ... More organizations have a "converged perimeter," a term he used to define environments with assets both behind an Internet gateway, such as DevOps and ITOps, and in the cloud.
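Secret scanners automate exactly this hunt through public repositories. The sketch below is a stripped-down, hypothetical version using a few well-known credential formats; real scanners ship hundreds of rules plus entropy heuristics, and the sample key below is a made-up value, not a real credential.

```python
import re

# A tiny rule set modeled on widely known credential formats.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "slack_token": re.compile(r"\bxox[baprs]-[0-9A-Za-z-]{10,}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_text(text):
    """Return (rule_name, matched_string) pairs for candidate secrets."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group(0)))
    return hits

# Example: a config file accidentally committed with a (fake) AWS key.
leaked = scan_text("aws_key = 'AKIAABCDEFGHIJKLMNOP'")
```

Running a scan like this in a pre-commit hook or CI pipeline catches credentials before they ever reach a public repository, which is far cheaper than rotating them after exposure.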



Quote for the day:

"I think leadership's always been about two main things: imagination and courage." -- Paul Keating