Daily Tech Digest - January 03, 2021

Recommendations By Artificial Intelligence Vs. Humans: Who Will Win?

When pitted against recommendations from humans, AI does not necessarily always win. It is true that data-driven recommendations are often preferred; however, the preference for accepting human versus artificial intelligence based recommendations differs with the situation and use case. It all stems from the ‘word-of-machine effect.’ Recently, an article, “When Do We Trust AI’s Recommendations More Than People’s?”, by University of Virginia Darden School of Business Professor Luca Cian and Boston University Questrom School of Business Professor Chiara Longoni, was published in the Harvard Business Review. In the article, they explain this phenomenon as a widespread belief that AI systems are more competent than humans at dispensing advice when utilitarian qualities are desired and less competent when hedonic qualities are desired. The authors clarify that this doesn’t imply that artificial intelligence is more competent than humans at assessing and evaluating hedonic attributes, nor that humans are in the case of utilitarian attributes. Per their experimental results, when someone is focused on utilitarian and functional qualities, then, from a marketer’s perspective, the word of a machine is more effective than the word of human recommenders.


5 Unusual SEO Tactics That Will Boost Your Performance

Conversions occur when a visitor to your site completes a desired action/goal. That could be anything from making a purchase to signing up for your newsletter – you get to set the parameters for your conversion goals. When building out your pages, it’s important to keep these goals in mind in conjunction with your SEO strategy. Conversion goals and strategy should vary from organic to ad landing pages. However, you can learn from both marketing strategies. When building out a landing page, be sure to tailor it to a specific purpose. If you intend to use it for ads, it’s important to clearly display the information you advertised would be there. Likewise, if you’re optimizing a landing page for organic traffic, be sure that your content matches what you signal to search engines. Then, compare results! A landing page can serve ads and SEO in tandem, but only if you do it right. If you start noticing that your SEO traffic is converting at a much higher rate than your ads traffic, then maybe it’s not the ideal landing page for your ads budget. But if the landing page is meant to serve both ads and SEO and SEO isn’t converting well at all, rethink your strategy. Why? Aside from the fact that you need to know where high-converting traffic comes from, Google is already aware of your stats.
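To make that comparison concrete, here is a minimal sketch (the channel names and figures are entirely hypothetical) that computes a conversion rate per traffic source for one landing page:

```python
# Hypothetical visit and conversion counts for a single landing page.
traffic = {
    "organic": {"visits": 4200, "conversions": 189},
    "paid_ads": {"visits": 3100, "conversions": 62},
}

for channel, stats in traffic.items():
    rate = stats["conversions"] / stats["visits"] * 100
    print(f"{channel}: {rate:.1f}% conversion rate")

# If organic converts at ~4.5% while paid converts at ~2.0%, the page
# may not be the right destination for your ads budget.
```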


Where Are The Self Driving Robotaxis Of India?

When it comes to self-driving in India, there are only a handful of startups. Amongst these startups, those genuinely working on fundamental research are even fewer. According to Sanjeev Sharma, founder of Swaayatt Robots, solving self-driving problems requires fundamental research in the fields of theoretical computer science and applied mathematics. Although there are over 300 startups globally, most of the companies are working on DMS and ADAS (advanced driver assistance systems). This is only one small part of the autonomous driving problem. There are three bigger problems to solve — perception, planning, and localisation. If one tries to solve the problem very accurately, which is what most companies are doing, the challenge becomes minimising the computation time. ... The ugly truth is that self-driving technology is a tough nut to crack. We are at least five years away from even witnessing level 3 autonomy on roads. India has some of the toughest roads in the world. Models that work well on the relatively empty roads of the United States will falter on the crowded roads of Bengaluru or Delhi. Even so, this problem is not exclusive to India; the world is yet to figure out self-driving tech.


Why Banks’ Digital Sales Efforts Still Aren’t Working

While the industry earned many kudos for pushing through so many Paycheck Protection Program loans as quickly as it did, D’Acierno says that experience also underscores the lack of digital readiness at most institutions. PPP was a relatively cookie-cutter program, but getting applications completed and processed remotely took tremendous handholding and manual labor in many institutions, he explains. Few business owners interested in PPP assistance could find an Amazon-style customer experience, D’Acierno says. “Ideally, digital should be an easier channel,” says D’Acierno, “but the downside of digital is that the customer is just one click away from giving up and saying, ‘You’ve just made this too hard for me’.” Finding another potential bank or credit union is as close as a quick Google search, he points out. Solving the digital sales challenge is a practical matter, not an academic one. While they tend to have narrower product lines, direct banks and fintechs routinely and seamlessly operate where many mainstream banks haven’t been able to go. The problem: Consumers and businesses can obtain extensive online services from these newcomers and from nonfinancial companies, so the bar is higher for digital sales.


Data-driven 2021: Predictions for a new year in data, analytics and AI

George Fraser, CEO of Fivetran, says "I think 2021 will reveal the need for data lakes in the modern data stack is shrinking." He adds that "...there are no longer new technical reasons for adopting data lakes because data warehouses that separate compute from storage have emerged." If that's not categorical enough for you, Fraser sums things up thus: "In the world of the modern data stack, data lakes are not the optimal solution. They are becoming legacy technology." Data lake supporters are even more ardent. In a prediction he titled "The Data Lake Can Do What Data Warehouses Do and Much More", Tomer Shiran, co-founder of Dremio, says "data warehouses have historically had...advantages over data lakes. But that's now changing with the latest open source innovations in the data tier." He mentions Apache Parquet and Delta Lake as two such innovations, as well as the lesser-known projects Apache Iceberg and Nessie. Together, these projects allow data to be stored in open, columnar formats across file systems, versioned, and processed with transactional consistency. Martin Casado, General Partner of Andreessen Horowitz, put it this way: if you look at the use cases for data lakes vs. data analytics, it's very different.
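As a small illustration of the open, columnar storage these projects build on (table layers such as Delta Lake and Iceberg add the versioning and transactions on top), the sketch below writes and reads a Parquet file with pandas; the file name and columns are made up, and pandas plus pyarrow are assumed to be installed:

```python
import pandas as pd

# Write a small table to an open, columnar Parquet file on the data tier.
df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "region": ["EU", "US", "APAC"],
    "amount": [120.00, 75.50, 210.25],
})
df.to_parquet("orders.parquet", engine="pyarrow")

# Any engine that speaks Parquet (warehouse, lake query engine, notebook)
# can read the same file back without vendor lock-in.
print(pd.read_parquet("orders.parquet", engine="pyarrow"))
```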


Now AI is Knocking On The Doors of Luxurious Hotels

AI in hospitality and tourism is still a new development that has prospects for new earning models. Though chatbots already exist, they can be taken to a new level, and high-grade chatbots can effectively reduce the cost of hiring personnel. Combining AI with the right data mining and acquisition tools is essential for hotels to learn as much as possible about tourists and vacationers. This way, hoteliers can tailor their experiences to meet specific individual needs. AI will be able to sort through big data faster and automate actions based on the inferences it draws. Hoteliers can incorporate mobile booking and hotel recommender engines with several other event booking software packages. This provides a “one-stop-shop” where event attendees can book events, get hotel recommendations, and book spaces, all within the same application. This solution will drive up booking numbers in no time and will bring mobile bookings closer to those who need them most. Ultimately, the task of collecting and analyzing data will be streamlined by technology that is smart enough to make well-planned choices based on guest behavior and characteristics. Incorporating artificial intelligence to solve user demands in the hospitality industry is a quantum leap forward in terms of implementable technologies.


How Will 5G Influence Healthcare Cybersecurity?

While 5G is generally accepted to be more secure than the 4G we use now, the technology still poses a few notable risks. In November 2019, a joint research initiative between security researchers at Purdue University and the University of Iowa revealed an incredible 11 significant vulnerabilities in the 5G networks studied. The study noted that these security lapses could allow bad actors to surveil and disrupt device operations — or even launch falsified emergency alerts. These findings are troubling both for the risks they highlight and because they prove that the vulnerabilities 5G was meant to resolve are still an ongoing problem. Equally problematic is the ease with which these security holes can be abused. As a writer for TechCrunch noted in an article on the study, researchers “claimed that all the attacks could be exploited by an adversary with a practical knowledge of 5G and 4G networks and a low-cost software-defined radio.” All this said, cybersecurity in the 5G era does warrant some optimism. Because next-gen wireless tech is designed with network slicing in mind (i.e., organizing several isolated virtual networks within an overarching physical infrastructure), it will be harder for bad actors to access the broader system. Slicing also allows for better privacy, because information isn’t shared across isolated “slices,” and for better tailoring, because organizations can apply different policies across varying inner networks.


Resilience As A Competitive Advantage

The growing focus on resilience will likely follow the same trajectory we saw with security and privacy. In the 1980s and ’90s, computer security was an occasional irritant. Attacks, however, became more frequent, sophisticated and devastating, to the point where commerce froze and real money was stolen. Security became centralized and automated, and users became more vigilant. Similarly, privacy was initially treated as a concept that would blow over. “You have zero privacy anyway. Get over it,” joked Sun Microsystems co-founder and CEO Scott McNealy in 1999. In 2020, privacy became one of the top concerns of consumers, investors, employees and regulators — and a difficult challenge for some of the top companies in health and technology. The increasing damage inflicted by extreme weather, and actions such as forced power outages in California, has begun to compel us to confront our relative lack of preparedness. Covid-19 has further underscored this and made the idea of investing for unforeseen risks less of a sunk cost and more of a necessity — it has given shape, substance and urgency to worst-case-scenario planning. Three of the primary technologies for improving resilience will likely be AI, IoT and 5G.


Farewell to Flash

As the standardisation of HTML5 and supported media formats grew, the advantages of Flash for providing video declined, until it was primarily used for interactive games and some interactive applications. However, Flash suffered from the same issues that had kept the JVM from taking off in browsers a decade earlier: constant updates for security vulnerabilities meant that Adobe Flash was a primary source of CVEs and infections in web browsers. To be fair to both Flash and the JVM: downloading programs from the internet is always going to be a vector for vulnerabilities, and the security of a remote system is always going to be as good or bad as the implementation – and as the complexity of those runtimes grew, particularly in unmanaged languages like C++, the danger was real. Even today, bugs in image rendering pipelines or font decoding are a primary cause of vulnerabilities in browsers. Flash's demise started with Steve Jobs' 2010 post "Thoughts on Flash"; Jobs had launched the iPhone, with its 'always on' internet connectivity, in 2007.



Quote for the day:

"Authority without wisdom is like a heavy axe without an edge, fitter to bruise than polish." -- Anne Bradstreet

Daily Tech Digest - January 02, 2021

What the hell is an AI factory?

Here’s how the AI factory works. Quality data obtained from internal and external sources trains machine learning algorithms to make predictions on specific tasks. In some cases, such as diagnosis and treatment of diseases, these predictions can help human experts in their decisions. In others, such as content recommendation, machine learning algorithms can automate tasks with little or no human intervention. The algorithm- and data-driven model of the AI factory allows organizations to test new hypotheses and make changes that improve their system. This could be new features added to an existing product or new products built on top of what the company already owns. These changes in turn allow the company to obtain new data, improve AI algorithms, and again find new ways to increase performance, create new services and products, grow, and move across markets. “In its essence, the AI factory creates a virtuous cycle between user engagement, data collection, algorithm design, prediction, and improvement,” Iansiti and Lakhani write in Competing in the Age of AI. The idea of building, measuring, learning, and improving is not new. It has been discussed and practiced by entrepreneurs and startups for many years.
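A minimal sketch of that virtuous cycle follows; every class and function here is a hypothetical stand-in (not from the book), trimmed down just to show the loop of engagement, data collection, training, prediction, and improvement:

```python
import random

class DataStore:
    """Stand-in for data collected from internal and external sources."""
    def __init__(self):
        self.rows = [random.random() for _ in range(100)]
    def collect(self):
        return self.rows
    def append(self, new_rows):
        self.rows.extend(new_rows)

class Model:
    """Trivial stand-in for the machine learning component."""
    def __init__(self):
        self.threshold = 0.0
    def train(self, data):
        self.threshold = sum(data) / len(data)
    def predict(self, data):
        return [x > self.threshold for x in data]

def deploy_and_measure(predictions):
    # Shipping changes and observing user engagement yields fresh data.
    return [random.random() for _ in predictions[:10]]

model, store = Model(), DataStore()
for _ in range(3):  # each pass: data -> train -> predict -> new data
    data = store.collect()
    model.train(data)
    store.append(deploy_and_measure(model.predict(data)))
print(f"Learned threshold after 3 cycles: {model.threshold:.3f}")
```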


SaaS: The Dirty Secret No Tech Company Talks About

The dirty little secret I have found is that, in most cases, this promised state just doesn’t materialize. The more SaaS companies I’ve seen, the more I’ve witnessed great companies forced to become service businesses to scale. Having a services team isn’t bad; it can even produce a lot of benefits for customers. But many times it ends up being a necessity in SaaS. As with all things that involve consultants, it’s going to take longer and cost more to get your product(s) live. Put frankly, this process sucks, and it’s not the SaaS dream. Especially today, when organizations need to do more with less, adding heads just to get your product live seems like another problem to deal with, not a solution. SaaS products were supposed to be delivered via the cloud almost instantly. The same SaaS product was going to work for every customer, and once we built a brand, it was gonna be glorious. WTF happened?! I grew just as frustrated as some of you likely are. As part of the founding team at Behance, I lived this myself. We built a beautiful portfolio-sharing platform employed by millions of people, which we eventually sold to Adobe. Our platform became the engine that powered portfolios for design institutions including the Rhode Island School of Design (RISD), Savannah College of Art and Design (SCAD), School of Visual Arts (SVA), and the American Institute of Graphic Arts (AIGA), among others.


Top 7 NLP Trends To Look Forward To In 2021

With advances in NLP and increasing demands in customer service, one can expect major strides towards next-gen bots that can hold complex conversations, self-improve, and learn how to carry out tasks they have not previously been trained on. Due to the rise in remote working in 2020, there has also been a tremendous increase in customer support tickets across industries. It has become a major task to deal with the increased ticket volume and provide quick responses to urgent queries. One can expect the integration of NLP tools with help desk software to perform tasks such as tagging and routing of customer support requests, reserving human intervention for just the higher-value tasks. The success of automated machine learning, or autoML, in effectively dealing with real-world problems has prompted researchers to develop more automation and no-code tools and platforms. One such area is automation in natural language processing. With AutoNLP, users can build models such as sentiment analysis with just a few basic lines of code. This encourages wider participation in the machine learning community, which was earlier thought to be restricted to developers and engineers.
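To make “a few basic lines of code” concrete, here is one hedged example using Hugging Face’s transformers library (one popular option; AutoNLP-style tools hide even these steps behind a UI or a one-line call):

```python
# Requires: pip install transformers (plus a backend such as PyTorch).
from transformers import pipeline

# Downloads a default pretrained sentiment model on first use.
classifier = pipeline("sentiment-analysis")

print(classifier("The support team resolved my ticket quickly!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.9998...}]
```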


AI Is Reengineering All Aspects Of Our Human Experience: What Are The Implications?

We have come together to fight Covid-19, and AI was a key enabler in bringing vaccines to market in unprecedented clinical trial R&D timeframes, to eradicate this virus and help us get back to a more interactive global community where we can freely travel, visit our favourite restaurants and shop with more access in our local retail stores. This is an excellent example of AI being used for good. However, many of the large global data sets behind AI are full of the inequalities, incumbencies and biases of the innovators designing it, and these have a direct impact on how the technology guides human information, perception and action. As AI leads society towards the next phase of human evolution, it is becoming increasingly evident that we need to acutely increase our knowledge of AI ethics and reflect on the future world we want to create; otherwise, we will be creating AI models that are sub-optimally aligned with our values. Can we create an intelligence that is unconstrained by the limitations and prejudices of its creators, so that AI serves all of humanity, or will it become the latest and most powerful tool for perpetuating and magnifying racism and inequality?


SMBs: How to find the right MSP for your cybersecurity needs

Outsourcing cybersecurity appears to be the wisest choice for most SMB owners. "Small- to medium-sized businesses are aware of the importance of IT security, but they don't always have the same resources or technical ability to deal with them as larger enterprises do," says Adam Lloyd, president and CEO of North American MSP Pioneer Technology, in the Channel Futures article. "As a result, they expect their managed service provider (MSP) to act as a true security partner to point them in the right direction and ensure the technology they have in place will protect them and their data." Courchesne explains what to look for when determining which is the best MSP for providing cybersecurity services. The first step is to look at the service provider's strengths and weaknesses. "If providers work only with cloud services ('born in the cloud' MSPs) or look to speed deployment to new customers and easily manage all clients through a single console, they will work best with cybersecurity delivered as-a-service that can be overseen through a cloud-hosted console," he writes. Then there are service providers that have developed their own cybersecurity platform; this allows the provider to focus on customers who have a more complex IT infrastructure.


How to Transform Your Cybersecurity Posture

Traditionally, cybersecurity has been seen as the department that says “no.” Cyberfolks are known for insisting on extra testing, identifying last-minute vulnerabilities, and causing cost overruns and delays. However, this reputation isn’t altogether fair. Rather, it results from the fact that cyber experts are excluded from the early stages of a project. If, on the other hand, you include these experts at the outset, design and development can be accomplished in a way that’s both more secure and more profitable. According to primary research from the Boston Consulting Group (BCG), whose cybersecurity practice I lead, such early involvement cuts the amount of rework by up to 62%. Such savings reduce not only development time and cost, but also time to market. What’s more, in gaining a seat at the table, cyber experts become pathfinders who shine a light on the quickest, most cost-effective, and most secure routes. They’re no longer curmudgeons who say “no,” but collaborators who are invested in getting to “yes” — and sooner rather than later. The Cloud - For companies in the midst of a cloud journey, the benefits of security by design are dramatic. Because so much of the infrastructure in cloud-based systems is created with software code, that “infrastructure as code” can be reused by hundreds of apps and checked continuously by automated “audit-robots.”
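As a toy illustration of what such an automated “audit-robot” check on infrastructure as code might look like (the JSON shape, names, and rule are all hypothetical; real policy-as-code tools are far more thorough):

```python
import json

# Hypothetical excerpt of a JSON-rendered infrastructure definition.
plan = json.loads("""
{
  "security_groups": [
    {"name": "web", "ingress_cidr": "0.0.0.0/0", "port": 443},
    {"name": "db",  "ingress_cidr": "0.0.0.0/0", "port": 5432}
  ]
}
""")

# Flag anything world-open that is not a standard public web port.
ALLOWED_PUBLIC_PORTS = {80, 443}
for sg in plan["security_groups"]:
    if sg["ingress_cidr"] == "0.0.0.0/0" and sg["port"] not in ALLOWED_PUBLIC_PORTS:
        print(f"FINDING: security group '{sg['name']}' exposes port {sg['port']} to the world")
```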


7 Trends Influencing DevOps/DevSecOps Adoption

Moving away from massive, inflexible systems that limit compatibility, the new trend of concise, compatible software has increased the adoption of DevOps and DevSecOps substantially. With architectures such as containers becoming mainstream, it has become easier than ever for teams to code, debug, and deploy faster. Computerized, immutable logging has made work transparent, and lightweight tooling has freed projects to be developed on any platform and kept in sync via the internet. Adopting a microservice architecture makes systems easier to install, run, and maintain. ... Stemming from the microservices trend, mobile-first, cloud-first development has worked wonders for data transportation, security, and collaboration. On all grounds — efficiency, safety, transparency, collaboration — cloud-first adoption has made development perpetual, seamless, and efficient. In many ways, adopting a cloud-first architecture directly integrates part of the DevOps work cycle into the company, promoting DevOps/DevSecOps adoption in technology organizations. ... In IT, infrastructure is the foundation that encompasses the software, hardware, networking resources, systems, and tools that allow companies to operate and manage their production processes.


The evolution of digital banking post COVID-19

As more and more people and businesses rely on digital apps for their banking services, the number of online transactions continues to grow, putting a strain on existing IT computing resources. The massive increase in the number of queries is resulting in bottlenecks that can degrade the performance of applications and affect customer service levels. When customers wait too long to complete a transaction or receive approval for a loan, or if they realize that they can receive better terms from another bank, they are more likely to switch. Thus, banks are faced with the need to scale up their expensive legacy infrastructure to provide the expected quality of user experience, or to find modern solutions that can elastically scale to manage this data at the required speeds, with an optimized TCO. In many cases large financial services organizations are limited by tangled and archaic systems that are too complex to optimally manage, process and analyze their huge amounts of data from different sources. This was revealed recently in a BIAN survey in which over 60 percent of respondents expressed concerns that banks will struggle to open up their APIs because of the “current state of banks’ core architecture.”


Don’t Do Agile and DevOps by the Book

That’s the short version, and there’s a huge range of books and frameworks out there to read so that anyone, anywhere–apparently–can just start doing it. The danger is that if you follow them too closely, processes can actually become too rigid, so you end up losing the agility you’re striving for. I always get suspicious when theories in books are read and regurgitated wholesale without thinking about the actual situation on the ground. I’d much rather have a conversation, write up our notes, try it out and see how it can be improved. Clearly, I’m not saying that you shouldn’t have boundaries and rules. I worked for a company that moved from no processes at all to adopting Agile methodologies. It needed to put a framework in place to guide people in the right direction, particularly initially. As companies mature, though, they need to look at what works best for their particular situation–otherwise the danger is that common practice masks common sense. You end up following processes, such as reviews every two weeks, that don’t necessarily match your needs–why wait two weeks for a review, for example, if something obviously needs fixing today? Where did Agile go? The best place to start is to define Agile for your organization.


Europe has a unique opportunity to lead in the democratisation of artificial intelligence

The issue as such is less whether AI will be diffused and democratised than what the different scenarios for its potential diffusion will be; whether democratisation can work in favour of collective value creation or entrench existing market power; whether there will be empowering, enabling, and inclusive standards or extractive institutions and practices; whether democratisation can empower a new generation of firms and citizens or whether it will establish the second digital divide. These questions compound. Responsible democratisation means that human-centric and user-centric standards need to be broader, to consider what happens when a multitude of such standards interact with one another, and when AI applications interact and compete inter-culturally and internationally. Indeed, there are no value-neutral AI applications. We cannot expect the divisions to be clear; rather, they will be murky, mixed between exceptionally novel solutions for public value and highly extractive institutional frameworks, with both corporate and government uses of such technologies. The focus should be to look beyond ethics, towards the political economy, which determines which ethical approaches will succeed or not.



Quote for the day:

“Knowledge has to be improved, challenged, and increased constantly, or it vanishes” -- Peter F. Drucker

Daily Tech Digest - January 01, 2021

The Financial Services Industry Is About To Feel The Multiplier Effect Of Emerging Technologies

Think about a world where retail banks could send cross-border payments directly to a counterparty without navigating through intermediaries. Instead, you could use a service dedicated to carrying out “Know Your Customer” processes on behalf of the financial services community. The same principle could apply for other transactions. Maybe a single, global fund transfer network is in our future, where any kind of transaction could flow autonomously while sharing only the minimum information necessary, maintaining the privacy of all other personal financial data. ... The technology now exists to massively increase computational power for a range of specific problems, such as simulation and machine learning, by trying all possibilities at once and linking events together. It’s more like the physical phenomena of nature versus the on-or-off switches of ordinary computer calculations. As a result, for instance, an investment bank may no longer have to choose between accuracy and speed when deciding how to allocate collateral across multiple trading desks. It could also give banks a more accurate way to determine how much capital to keep on hand to meet regulations.


The patching conundrum: When is good enough good enough?

Clearly some adjustment is needed on an unknown number of Windows machines. And therein lies the big problem with the Windows ecosystem: Even though we have had Windows for years, it’s still a vast and messy ecosystem of hardware vendors, multiple drivers, and software vendors that often build their solutions on something undocumented. Microsoft over the years has clamped down on this “wild west” approach and mandated certain developer requirements. It’s one of the main reasons I strongly recommend that if you want to be in the Insider program or install feature releases on the very first day they are released, you use Windows Defender as your antivirus, and not something from a third party. While Microsoft will often follow up with a fix for a patch problem, typically — unlike this issue — it is not released in the same fashion as the original update. Case in point: in November, Microsoft released an update that caused Kerberos authentication and ticket renewal issues. Later that month, on Nov. 19, it released an out-of-band update for the issue. The update was not released to the Windows Update release channel, nor to the Windows Server Update Services (WSUS) release channel; instead IT administrators had to manually seek it out and download it or insert it into their WSUS servers.


Building a SQL Database Audit System using Kafka, MongoDB and Maxwell's Daemon

Compliance and auditing: Auditors need the data in a meaningful and contextual manner from their perspective. DB audit logs are suitable for DBA teams but not for auditors. The ability to generate critical alerts in case of a security breach is a basic requirement of any large-scale software, and audit logs can be used for this purpose. You must be able to answer a variety of questions, such as who accessed the data, what the earlier state of the data was, what was modified when it was updated, and whether internal users are abusing their privileges. It’s important to note that since audit trails help identify infiltrators, they promote deterrence among "insiders." People who know their actions are scrutinized are less likely to access unauthorized databases or tamper with specific data. All kinds of industries - from finance and energy to foodservice and public works - need to analyze data access and produce detailed reports regularly for various government agencies. Consider the Health Insurance Portability and Accountability Act (HIPAA) regulations. HIPAA requires that healthcare providers deliver audit trails about anyone and everyone who touches any data in their records.
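In the Maxwell's Daemon pipeline, each MySQL row change is tailed from the binlog and published to Kafka as a JSON event. A minimal sketch of the consumer side, archiving those change events into a MongoDB audit collection, might look like the following; the topic name matches Maxwell's default, but the connection strings and collection layout are assumptions:

```python
# Requires: pip install kafka-python pymongo
import json
from kafka import KafkaConsumer
from pymongo import MongoClient

# Maxwell publishes row-change JSON to the "maxwell" topic by default.
consumer = KafkaConsumer(
    "maxwell",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
audit = MongoClient("mongodb://localhost:27017")["audit_db"]["audit_trail"]

for message in consumer:
    event = message.value
    # A Maxwell event carries the database, table, operation type,
    # timestamp, the new row values ("data"), and prior values ("old").
    audit.insert_one({
        "database": event.get("database"),
        "table": event.get("table"),
        "operation": event.get("type"),   # insert / update / delete
        "timestamp": event.get("ts"),
        "new_values": event.get("data"),
        "old_values": event.get("old"),   # present on updates
    })
```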


How Skillate leverages deep learning to make hiring intelligent

Skillate can work both as a standalone ATS that takes care of the end-to-end recruitment needs of your organization and as an intelligent system that integrates with your existing ATS to make your recruitment easy, fast, and transparent. It does this by banking on cutting-edge technology and the power of AI, integrating with existing platforms such as traditional ATSs like Workday, SuccessFactors, etc. to solve some real pain points of the industry. However, for AI to work in a complex industry like recruitment, we need to consider the human element involved. Take for instance the words Skillate and Skillate.com — both refer to the same company but will be treated as different words by a machine. Moreover, new company and institute names come up every day, and thus it is almost impossible to keep the software’s vocabulary updated. To illustrate further, consider the following two statements: 'Currently working as a Data Scientist at <Amazon>' and 'Worked on a project for the client Amazon.' In the first statement, “Amazon” will be tagged as a company, as the statement is about working in the organization. But in the latter, “Amazon” should be considered a normal word and not a company. Hence the same word can have different meanings based on its usage.
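This is the classic named-entity recognition (NER) problem. As a hedged sketch of how context-driven tagging works in practice, here is the same pair of sentences run through spaCy (one common NLP library; whether its small English model actually distinguishes these two cases is model-dependent):

```python
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

for text in (
    "Currently working as a Data Scientist at Amazon.",
    "Worked on a project for the client Amazon.",
):
    doc = nlp(text)
    # The statistical model tags entities from the surrounding context,
    # so "Amazon" may surface as ORG in one sentence and not the other.
    print(text, "->", [(ent.text, ent.label_) for ent in doc.ents])
```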


How to Build Cyber Resilience in a Dangerous Atmosphere

The first step to achieving cyber resilience is to start with a fundamental paradigm shift: Expect to be breached, and expect it to happen sooner than later. You are not "too small to be of interest," what you do is not "irrelevant for an attacker," it doesn't matter that there is a "bigger fish in the pond to go after." Your business is interconnected to all the others; it will happen to you. Embrace the shift. Step away from a one-size-fits-all cybersecurity approach. Ask yourself: What parts of the business and which processes are generating substantial value? Which must continue working, even when suffering an attack, to stay in business? Make plans to provide adequate protection — but also for how to stay operational if the digital assets in your critical processes become unavailable. Know your most important assets, and share this information among stakeholders. If your security admin discovers a vulnerability on a server with IP address 172.32.100.100 but doesn't know the value of that asset within your business processes, how can IT security properly communicate the threat? Would a department head fully understand the implications of a remote code execution (RCE) attack on that system? 


A New Product Aims To Disrupt Free Credit Scores With Blockchain Technology

The foundation of Zoracles Protocol that differentiates the project from other decentralized finance projects is its use of cutting-edge privacy technologies centered around zero-knowledge proofs. Those familiar with these privacy-preserving techniques were most likely introduced to them by the team at Electric Coin Company, who are responsible for the zero-knowledge proofs developed for the privacy cryptocurrency Zcash. Zoracles will build zk-SNARKs that are activated when pulling consumer credit scores, hiding their values as they are brought onto the blockchain. This is accomplished with a verification proof derived from the ZoKrates toolbox. Keeping the data confidential is critical to ensuring users are confident having their data available on-chain. It can be compared to using https (SSL) to transmit credit card data, which allowed eCommerce to flourish. A very interesting long-term goal of Zora.cc is to eventually use credit score verification to prove identity. The implications are enormous for the usefulness of their protocol if it can become the market leader in decentralized identity. The team is focused on building the underlying API infrastructure as well as a front-end user experience. If executed successfully, it is very similar to the product offering of Twilio. The “Platform as a Service” could go well with Zoracles’ “Snarks as a Service.” One should watch this project closely.


Refactoring is a Development Technique, Not a Project

One of the more puzzling misconceptions that I hear pertains to the topic of refactoring. I consult on a lot of legacy rescue efforts that will need to involve refactoring, and people in and around those efforts tend to think of “refactor” as “massive cleanup effort.” I suspect this is one of those conflations that happens subconsciously. If you actually asked some of these folks whether “refactor” and “massive cleanup effort” were synonyms, they would say no, but they never conceive of the terms in any other way during their day-to-day activities. Let’s be clear. Here is the actual definition of refactoring, per Wikipedia: Code refactoring is the process of restructuring existing computer code – changing the factoring – without changing its external behavior. Significantly, this definition mentions nothing about the scope of the effort. Refactoring is changing the code without changing the application’s behavior. This means the following would be examples of refactoring, provided they changed nothing about the way the system interacted with external forces: renaming variables in a single method; adding whitespace to a class for readability; eliminating dead code; deleting code that has been commented out; and breaking a large method apart into a few smaller ones.
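A tiny sketch of refactoring in that strict sense, assuming Python: the rewritten version renames variables and extracts a method, yet any caller sees identical behavior (the assertion at the end is the whole point):

```python
# Before: terse names, one long function.
def p(l):
    t = 0
    for x in l:
        t += x["qty"] * x["unit_price"]
    return t

# After: same external behavior, restructured for readability.
def line_item_cost(item):
    return item["qty"] * item["unit_price"]

def order_total(line_items):
    return sum(line_item_cost(item) for item in line_items)

items = [{"qty": 2, "unit_price": 9.99}, {"qty": 1, "unit_price": 4.50}]
assert p(items) == order_total(items)  # behavior unchanged
```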


Automation nation: 9 robotics predictions for 2021

"Autonomous robots took on more expansive roles in stores and warehouses during the pandemic," says Rowland, "which is expected to gain momentum in 2021. Data-collecting robots shared real-time inventory updates and accurate product location data with mobile shopping apps, online order pickers and curbside pickup services along with in-store shoppers and employees." That's especially key in large retail environments, with hundreds of thousands of items, where the ability to pinpoint products is a major productivity booster. Walmart recently cut its contract with robotic shelf scanning company Bossa Nova, but Rowland believes the future is bright for the technology category. Heretofore, automation solutions have largely been task-specific. That could be a thing of the past, according to Rowland. "Autonomous robots can easily handle different duties, often referred to as 'payloads,' which are programmed to address varying requirements, including but not limited to, inventory management, hazard detection, security checks, surface disinfectants, etc. In the future, retailers will have increased options for mixing/matching automated workflows to meet specific operational needs." Remember running out of toilet paper? So do retailers and manufacturers, and it was a major wake up call.


Data for development: Revisiting the non-personal data governance framework

The framework needs to be reimagined from multiple perspectives. From the ground up, people — individuals and communities — must control their data, and it should not be considered merely a resource to fuel “innovation.” More specifically, data sharing of any sort needs to be anchored in individual data protection and privacy. The purpose for data sharing must be clear from the outset, and data should only be collected to answer clear, pre-defined questions. Further, individuals must be able to consent dynamically to the collection/use of their data, and to grant and withdraw consent as needed. At the moment, the role of the individual is limited to consenting to the anonymisation of their personal data, which is seen as a sufficient condition for subsequent data sharing without consent. Collectives have a significant role to play in negotiating better rights in the data economy. Bottom-up instruments such as data cooperatives, unions, and trusts that allow individual users to pool their data rights must be actively encouraged. There is also a need to create provisions for collectives — employees, public transport users, social media networks — to sign on to these instruments to enable collective bargaining on data rights.


3 things you need to know as an experienced software engineer

When we are in a coding competition where the clock is ticking, all we care about is efficiency. We will use variable names such as a, b, c, or index names such as j, k, l. Paying less attention to naming can save us a lot of time, and we will probably throw the code away right after the submission passes all the test sets. This is called “throw-away code”: it is short and, as the name suggests, won’t be kept for long. In a real-life software engineering project, however, our code will likely be reused and modified, and the person doing so may be someone other than ourselves, or ourselves six months later after working on a different module. ... Readability is so important that sometimes we even sacrifice efficiency for it. We will probably choose the less readable but extremely efficient lines of code when working on projects that must be optimized within several CPU cycles and limited memory space, such as a control system running on a microprocessor. However, in many real-life scenarios we care much less about that millisecond difference on a modern computer, while writing more readable code will cause much less trouble for our teammates.
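A before/after sketch of the same logic makes the trade concrete (all names here are purely illustrative):

```python
# Competition-style "throw-away" code: fast to type, hard to revisit.
def f(a, b):
    c = []
    for j in range(len(a)):
        if a[j] > b:
            c.append(a[j])
    return c

# The same logic written for the teammate who reads it in six months.
def scores_above_threshold(scores, threshold):
    """Return every score that exceeds the given threshold."""
    return [score for score in scores if score > threshold]

assert f([3, 9, 5], 4) == scores_above_threshold([3, 9, 5], 4)
```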



Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson

Daily Tech Digest - December 31, 2020

5 priorities for CIOs in 2021

2020 was undeniably the year of digital. Organizations that had never dreamed of digitizing their operations were forced to completely transform their approach. And automation was a big part of that shift, enabling companies to mitigate person-to-person contact and optimize costs while ensuring uninterrupted operations. In 2021, hyperautomation seems to be the name of the game. According to Gartner, “Hyperautomation is the idea that anything that can be automated in an organization should be automated.” Especially for companies that implemented point solutions to adapt and survive in 2020, now is the time to intelligently automate repeatable, end-to-end processes by leveraging bots. With hyperautomation, CIOs can implement new-age technologies such as business process management, robotic process automation, and artificial intelligence (AI) to drive end-to-end automation and deliver superior customer experience. A steadily growing customer experience trend is to “be where the customer is.” Over the past decade, forward-thinking organizations have been working to engage customers according to their preferences of when, where, and how.


Reducing the Risk of Third-Party SaaS Apps to Your Organization

It's vital first to understand the risk of third-party applications. In an ideal world, each potential application or extension is thoroughly evaluated before it's introduced into your environment. With most employees still working remotely, though, and you and your administrators having limited control over their online activity, that may not be a reality today. Still, reducing the risk of potential data loss even after an app has been installed is critically important. The reality is that in most cases, threats from third-party applications come from two directions. First, the third-party application may try to leak your data or contain malicious code. And second, it may be a legitimate app that is poorly written (causing security gaps). Poorly coded applications can introduce vulnerabilities that lead to data compromise. While Google does have a screening process for developers (as its disclaimer mentions), users are solely responsible for compromised or lost data (it sort of tries to protect you … sort of). Businesses must take hard and fast ownership of screening third-party apps for security best practices. What are the best practices that Google outlines for third-party application security?


3 Trends That Will Define Digital Services in 2021

Cloud native environments and applications such as mobile, serverless and Kubernetes are constantly changing, and traditional approaches to app security can’t keep up. Despite having many tools to manage threats, organizations still have blind spots and uncertainty about exposures and their impact on apps. At the same time, siloed security practices are bogging down teams in manual processes, imprecise analyses, fixing things that don’t need fixing, and missing the things that should be fixed. This is building more pressure on developers to address vulnerabilities in pre-production. In 2021, we’ll increasingly see organizations adopt DevSecOps processes — integrating security practices into their DevOps workflows. That integration, within a holistic observability platform that helps manage dynamic, multicloud environments, will deliver continuous, automatic runtime analysis so that teams can focus on what matters, understand vulnerabilities in context, and resolve them proactively. All this amounts to faster, more secure release cycles, greater confidence in the security of production as well as pre-production environments, and renewed confidence in the idea that securing applications doesn’t have to come at the expense of innovation and faster release cycles.


Meeting the Challenges of Disrupted Operations: Sustained Adaptability for Organizational Resilience

One might argue that an Agile approach to software development is the same as resilience, since at its core it is about iteration and adaptation. However, Agile methods do not guarantee resilience or adaptive capacity by themselves. Instead, a key characteristic of resilience lies in an organization’s capacity to put the ability to adapt into play across ongoing activities in real time; in other words, to engineer resilience into their system by way of adaptive processes, practices, and coordinative networks that support people in making necessary adaptations. Adaptability, as a function of day-to-day work, means revising assessments, replanning, dynamically reconfiguring activities, and reallocating and redeploying resources as conditions and demands change. Each of these "re" activities reflects an orientation towards change as a continuous state. This seems self-evident - the world is always changing, and the faster the speed and greater the scale, the more likely changes are going to impact your plans and activities. However, many organizations do not recognize the pace of change until it’s too late. Late-stage changes are more costly - both financially at the macro level and attentionally for individuals at the micro level.


You don’t code? Do machine learning straight from Microsoft Excel

To most people, MS Excel is a spreadsheet application that stores data in tabular format and performs very basic mathematical operations. But in reality, Excel is a powerful computation tool that can solve complicated problems. Excel also has many features that allow you to create machine learning models directly in your workbooks. While I’ve been using Excel’s mathematical tools for years, I didn’t come to appreciate its use for learning and applying data science and machine learning until I picked up Learn Data Mining Through Excel: A Step-by-Step Approach for Understanding Machine Learning Methods by Hong Zhou. Learn Data Mining Through Excel takes you through the basics of machine learning step by step and shows how you can implement many algorithms using basic Excel functions and a few of the application’s advanced tools. While Excel will in no way replace Python machine learning, it is a great window into learning the basics of AI and solving many basic problems without writing a line of code. ... Beyond regression models, you can use Excel for other machine learning algorithms. Learn Data Mining Through Excel provides a rich roster of supervised and unsupervised machine learning algorithms, including k-means clustering, k-nearest neighbor, naive Bayes classification, and decision trees.
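For instance, the regression fitting behind Excel's LINEST and TREND functions is ordinary least squares, which in matrix form solves the following (where X holds the predictor columns plus a column of ones for the intercept, and y is the response column):

```latex
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y
```

The spreadsheet functions hide this linear algebra entirely, which is exactly what makes them a gentle entry point.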


Four ways to improve the relationship between security and IT

For too long in too many organizations, IT and security have viewed themselves as two different disciplines with fundamentally different missions that have been forced to work together. In companies where this tension exists, the disconnect stems from the CIO’s focus on delivery and availability of digital services for competitive advantage and customer satisfaction – as quickly as possible – while the CISO is devoted to finding security and privacy risks in those same services. The IT pros tend to think of the security teams as the “Department of No.” Security pros view the IT teams as always putting speed ahead of safety. Adding to the strain, CISOs are catching up to CIOs in carving out an enhanced role as business strategists, not merely technology specialists. The CIO’s main role was once to deliver IT reliably and cost-effectively across the organization, but while optimizing infrastructure remains a big part of the job, today’s CIO is expected to be a key player in leading digital transformation initiatives and driving revenue-generating innovation. The CISO is rapidly growing into a business leader as well. 


Key cyber security trends to look out for in 2021

Working from home means many of us are now living online for between 10 and 12 hours a day, getting very little respite, with no gaps between meetings and no commute. We’ll see more human errors causing cyber security issues, driven purely by employee fatigue or complacency. This means businesses need to think about a whole new level of IT security education programme, including ensuring people step away and take a break, with training to recognise signs of fatigue. When you make a cyber security mistake at the office, it’s easy to go down and speak to a friendly member of your IT security team. This is much harder to do at home without direct access to your usual go-to person, and it requires far more confidence to confess. Businesses need to take this human error factor into consideration and ensure consistent edge security, no matter what the connection. You can no longer assume that because core business apps route back through the corporate VPN, all is as it should be. ... It took most companies years to get their personally identifiable information (PII) ready for GDPR when it came into force in 2018. With the urgent shift to cloud and collaboration tools driven by the lockdown this year, GDPR compliance was challenged.


Ransomware 2020: A Year of Many Changes

The tactic of adding a layer of data extraction and then a threat to make the stolen information public if the victim refuses to pay the ransom became the go-to tactic for many ransomware groups in 2020. This technique first appeared in late 2019 when the Maze ransomware gang attacked Allied Universal, a California-based security services firm, Malwarebytes reported. "Advanced tools enable stealthier attacks, allowing ransomware operators to target sensitive data before they are detected, and encrypt systems. So-called 'double extortion' ransomware attacks are now standard operating procedures - Canon, LG, Xerox and Ubisoft are just some examples of organizations falling victim to such attacks," Cummings says. This exploded this year as both Maze and other gangs saw extortion as a way to strong-arm even those who prepared for a ransomware attack by properly backing up their files but could not risk the data being exposed, says Stefano De Blasi, threat researcher at Digital Shadows. "This 'monkey see, monkey do' approach has been extremely common in 2020, with threat actors constantly seeking to expand their offensive toolkit by mimicking successful techniques employed by other criminal groups," De Blasi says.


Finding the balance between edge AI vs. cloud AI

Most experts see edge and cloud approaches as complementary parts of a larger strategy. Nebolsky said that cloud AI is more amenable to batch learning techniques that can process large data sets to build smarter algorithms and gain maximum accuracy quickly and at scale. Edge AI can execute those models, and cloud services can learn from the performance of those models and apply what they learn to the base data, creating a continual learning loop. Fyusion's Miller recommends striking the right balance -- if you commit entirely to edge AI, you've lost the ability to continuously improve your model. Without new data streams coming in, you have nothing to leverage. However, if you commit entirely to cloud AI, you risk compromising the quality of your data -- due to the tradeoffs necessary to make it uploadable, and the lack of feedback to guide the user to capture better data -- or the quantity of data. "Edge AI complements cloud AI in providing access to immediate decisions when they are needed and utilizing the cloud for deeper insights or ones that require a broader or more longitudinal data set to drive a solution," said Tracy Ring, managing director at Deloitte. For example, in a connected vehicle, sensors on the car provide a stream of real-time data that is processed constantly and can make decisions, like applying the brakes or adjusting the steering wheel.
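One common pattern for striking that balance is to let a compact on-device model make immediate calls and defer only low-confidence or aggregate cases to the cloud. The sketch below is purely illustrative; every function and threshold is a hypothetical stand-in:

```python
import random

CONFIDENCE_THRESHOLD = 0.85

def edge_model_predict(sensor_reading):
    # Stand-in for a compact on-device model: returns (action, confidence).
    return ("brake", random.uniform(0.5, 1.0))

def send_to_cloud(sensor_reading):
    # Stand-in for uploading hard cases for deeper analysis and for
    # retraining the next model version (the continual learning loop).
    print("deferred to cloud:", sensor_reading)

for reading in ({"speed": 42.0, "gap_m": 7.1}, {"speed": 18.5, "gap_m": 1.2}):
    action, confidence = edge_model_predict(reading)
    if confidence >= CONFIDENCE_THRESHOLD:
        print("edge decision:", action)   # immediate, local
    else:
        send_to_cloud(reading)            # feeds the learning loop
```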


Experiences from Testing Stochastic Data Science Models

We can ensure the quality of testing by: making sure we have enough information about a new client requirement and that the team understands it; validating results, including results which are stochastic; making sure the results make sense; making sure the product does not break; making sure no repeat bugs are found, in other words, that a bug has been fixed properly; pairing up with developers and data scientists to understand a feature better; if you have a front-end dashboard showcasing your results, making sure the details all make sense, and doing some accessibility testing on it too; and testing the performance of the runs, whether they take longer under certain configurations or not. ... As mentioned above, I learned that having thresholds was a good option for a model that delivers results to optimise a client’s requests. If a model is stochastic, then certain parts will have results which may look wrong but are actually not. For instance, 5 + 3 = 8 for all of us, but the model may output 8.0003, which is not wrong; with a stochastic model, what was useful was adding thresholds for what we could and couldn’t accept. I would definitely recommend trying to add thresholds.
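Those thresholds map directly onto tolerance-based assertions in test code. A minimal sketch, assuming Python and a made-up model function:

```python
import math

def model_add(a, b):
    # Stand-in for a stochastic model whose output jitters slightly.
    return a + b + 0.0003

# Exact equality would fail, even though the result is "right enough".
assert model_add(5, 3) != 8

# A threshold encodes what the team has agreed to accept.
assert math.isclose(model_add(5, 3), 8, abs_tol=1e-3)
```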



Quote for the day:

“Failure is the opportunity to begin again more intelligently.” -- Henry Ford

Daily Tech Digest - December 30, 2020

Are EU Privacy Regulators Starting to Find GDPR Consensus?

Even though GDPR enforcement is more than two years old, attorney Rocco Panetta, the founder and managing partner of Panetta & Associates in Rome, predicts that it will take at least two more years - if not more - for enforcement and sanctions efforts to gain greater consistency, not just between EU member states but also inside any given country. Such consistency would also provide more predictability for organizations facing sanctions. "The EU regulation gives a range of values without imposing any standardization," says Panetta, who's also on the board of directors of the International Association of Privacy Professionals. "If anything, the issue is mostly about the difficulty facing companies that try to predict the potential consequences of a GDPR breach," he tells Information Security Media Group. "As a data protection officer and legal consultant for local and multinational enterprises and groups of companies, I'm getting to witness such difficulty more and more frequently." One major change brought about by GDPR is that it made data protection a law. Previously, EU member states were subject only to a data protection directive - specifically, Directive 1995/46/CE - that each nation transposed as it saw fit into its own national law.


Key Sprint Metrics to Increase Team Dependability

With Sprint Flow, you can track how your work is flowing throughout the Sprint and easily spot any delays or bottlenecks emerging that may potentially put your commitments at risk. One of the most common pitfalls we see with scrum teams is work being signed off at the end of the sprint. This delay backloads risk and makes it difficult for teams to address any feedback by the end of the sprint, which in turn cannibalises capacity in the next sprint and can create a nasty snowball effect. A feature of successful sprints is a tight feedback loop between the user (often represented by the PO) and the team, with work being signed off as early as possible. Some of the most common challenges to this feedback loop are: work is slow to start at the beginning of the sprint, a backlog forms with the QA team, and the PO has limited availability to review and sign off work. It’s essential that all members of the team, but particularly the Scrum Master, have visibility of these potential delays, which is why Sprint Flow is an incredibly powerful metric to review in daily stand-ups and retros. ... The challenge we see with this approach is that burn-downs and burn-ups are binary in their analysis; they only differentiate between incomplete and complete.


Key Application Metrics and Monitoring for Developers

As a developer, it's all too easy to fall into the habit of what I call reactive firefighting, or responding only or primarily to reported issues or bugs. These issues are easy to prioritize, as it is clear that a user is already experiencing an issue or downtime. However, while you are busy fighting the fire, the fire continues to cause application downtime or some other damage. Proactive monitoring is the best way to reduce the number of fires that start in the first place. In essence, you need to monitor your metrics on a regular basis. This is proactive because it requires you or your team to look at your application response time, error rate, and slow transactions before users report any issues. By reviewing your metrics on a regular basis, you can identify issues before users report them, and you can proactively address errors or bottlenecks that could become larger problems. Additionally, monitoring your metrics regularly will give you a sense of what is “normal” and what is abnormal for your application. As mentioned earlier, metrics are the most helpful in illuminating relative rather than absolute performance.
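A bare-bones sketch of capturing two of those metrics (response time and error rate) around any handler, assuming Python and a hypothetical handler name:

```python
import time
from collections import defaultdict

stats = defaultdict(lambda: {"calls": 0, "errors": 0, "total_ms": 0.0})

def monitored(fn):
    """Record call count, error count, and latency for a handler."""
    def wrapper(*args, **kwargs):
        s = stats[fn.__name__]
        s["calls"] += 1
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        except Exception:
            s["errors"] += 1
            raise
        finally:
            s["total_ms"] += (time.perf_counter() - start) * 1000
    return wrapper

@monitored
def get_orders():
    time.sleep(0.01)  # stand-in for real work
    return []

get_orders()
s = stats["get_orders"]
print(f"avg {s['total_ms'] / s['calls']:.1f} ms, error rate {s['errors'] / s['calls']:.0%}")
```

Reviewing numbers like these week over week is what turns "normal" and "abnormal" from gut feel into a baseline.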


2021 will overburden already stressed infosec teams

Dis- and misinformation impact businesses and the public at large in myriad ways. False or misleading claims can have a major impact on a business’s bottom line, not to mention turning the tide of public opinion. In 2021, every organization and individual will face three challenges: the need to discern what is real from what is fake; the need to determine which sources are credible; and the need to verify information. Disinformation becomes a cybersecurity issue because cybercriminals thrive on uncertainty. According to OpenText research, at least one in five people have received a COVID-19 related phishing email as of this fall. That number will surely grow. We’ve also seen spikes in phishing campaigns around fake COVID-19 stimulus offers, fake streaming media links, etc. We can expect trends specific to COVID-19 to continue. More generally, as trust in media and institutions is threatened, cybercriminals will have more opportunities to exploit the resulting uncertainty. The good news: cybersecurity teams are used to dealing with a level of disinformation. If you think about it, what is a phishing campaign if not an active disinformation attack?


Use predictive analytics in manufacturing to gain insight

Predictive analytics in manufacturing relies on collecting sensor data across the manufacturing process, Leone said. With that data in hand, manufacturers can uncover trends, forecast outcomes, improve and ensure product quality, and optimize asset allocation and capacity utilization. In addition, a predictive manufacturing system doesn't just come in handy during one step of the manufacturing process; it has many roles in a factory, said Forrester analyst Paul Miller. For example, it can help when a manufacturer is striving for reliability or searching for the most cost-effective energy mix or the ideal materials. "Potentially, for a complex industrial asset, there could be thousands and thousands of combinations to consider," Miller said. However, adopting a predictive manufacturing system doesn't come without growing pains. A key barrier to adoption is internal resistance, Miller said. "People will say, 'I have been running this operation for 30 years and I know how to do it,'" Miller said. However, some companies have been pleasantly surprised by the results, he said. "Siemens found that with their gas turbines, they were able to get significantly better performance than their best engineers because the computer can try so many options all at once," Miller said.
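As a loose illustration of the sensor-driven early-warning idea described here, the minimal Python sketch below flags readings that drift above a rolling baseline; the readings, window size, and tolerance are all hypothetical, and production systems use far richer models:

```python
# Minimal sketch: flag sensor readings that drift outside a rolling
# baseline -- the kind of early-warning signal a predictive
# manufacturing system builds on. Data and thresholds are hypothetical.
readings = [70.1, 70.3, 69.9, 70.2, 70.4, 71.8, 73.5, 74.9]  # e.g. bearing temp (C)
window = 4

for i in range(window, len(readings)):
    baseline = sum(readings[i - window:i]) / window
    drift = readings[i] - baseline
    if drift > 1.5:  # tolerance would be chosen per asset
        print(f"reading {i}: {readings[i]} C is {drift:.1f} C above baseline "
              f"{baseline:.1f} C -- schedule inspection")
```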


DDoS Attacks Spiked, Became More Complex in 2020

Threat actors launched more DDoS attacks this year than ever before. Much of the increase was tied to the large-scale shift to remote work during the global pandemic. Adversaries perceived more opportunities to attack organizations that were suddenly forced to support large distributed workforces, with employees logging in from weakly protected home networks. "As a result of the pandemic, we saw an unprecedented number of systems going online, with corporate resources now in less-secure home environments, and a massive increase in the use of VPN technology," says Richard Hummel, threat intelligence lead at Netscout. Netscout's current projections forecast more than 10 million DDoS attacks in 2020, the most ever in a single year. In May 2020 alone, Netscout observed some 929,000 DDoS attacks, the most ever recorded in a 31-day period. During the height of the pandemic-related lockdown between March and June, the frequency of DDoS attacks increased 25% compared with the previous three-month period. The attacks consumed huge amounts of network throughput and bandwidth, raising costs for both Internet service providers and enterprises.
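For readers curious what volumetric detection looks like at its very simplest, here is a minimal Python sketch of a sliding-window request counter; the window size and threshold are illustrative, not tuned values, and real DDoS mitigation involves far more than this:

```python
from collections import deque

WINDOW_SECONDS = 10
THRESHOLD = 500  # requests per window treated as anomalous (illustrative)

timestamps = deque()

def on_request(now: float) -> bool:
    """Record a request; return True if the current rate looks like a flood."""
    timestamps.append(now)
    # Drop timestamps that have aged out of the sliding window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    return len(timestamps) > THRESHOLD

# Usage sketch: if on_request(time.time()) returns True, a mitigation
# layer might rate-limit, challenge, or divert the traffic.
```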


Enterprise architecture tools could be acquisition targets

The adoption of enterprise architecture tools is increasing as organizations advance their business models to meet changing customer needs, pursue digital transformation and build a "composable enterprise" out of interchangeable building blocks. Jhawar said some EA teams mistakenly limit their scope to cataloging their existing IT systems, applications and technologies when they should take the opportunity to capture their organization's business architecture and strategy. Gartner's latest Magic Quadrant on EA tools reflects a shift to features that let users drag and drop objects and classes, get context-sensitive help, and use collaboration tools such as Microsoft Teams and Slack. Other key capabilities in many of the EA tools featured in the report include guided navigation, autogenerated views, smart search and virtual assistant support, Jhawar said. Avolution, Bizzdesign, Mega International and Software AG "sustained excellence in both execution and vision" to hold their positions as leaders in the 2020 Magic Quadrant for EA tools, just as they did in 2019, according to the Gartner report. But Gartner now lists a larger group of challengers behind them: BOC Group, LeanIX, Orbus Software and QualiWare.


5 Emerging DevOps Trends to Watch in 2021

As Infosec continues to evolve, security is coming to the forefront for all teams. No company wants to deal with the business and financial effects of a breach, and many companies are now working hard to secure their digital systems. As companies educate their teams on how to keep information and systems secure, DevOps is no exception. As the core team responsible for deploying and maintaining infrastructure, as well as configuring and storing the secrets for the applications that are deployed, DevOps teams must continue to keep security top of mind. We see DevOps not as an obstacle to secure workloads but as an enabler. Both security and DevOps teams want clear processes in place and tight configurations on servers and cloud access. These teams will continue to work together to make sure infrastructure and applications are deployed in an increasingly automated, auditable, and secure manner. This focus on security may shift how DevOps teams operate, striking more of a balance between process and speed. Many companies may evaluate whether they really need to deploy software many times per day and whether doing so is compatible with their security and compliance goals.
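One small, concrete habit implied by the secrets-handling responsibility above is keeping credentials out of source code and injecting them at deploy time. A minimal Python sketch, with a hypothetical variable name:

```python
import os

def require_secret(name: str) -> str:
    """Read a secret from the environment, failing fast if it is missing.

    Injecting secrets at deploy time (rather than committing them to a
    repo) keeps them out of source control and makes their use auditable.
    """
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"missing required secret: {name}")
    return value

# Hypothetical name; a deploy pipeline or secrets manager would set it.
db_password = require_secret("APP_DB_PASSWORD")
```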


Businesses to boost collaboration spending in ‘21 as remote work continues

With real uncertainty over when the pandemic might end and a sustained global economic recovery could begin, many IT departments are likely to take a wait-and-see approach when setting budgets for collaboration investments in 2021. “What I hear a lot of is, ‘I’ve got three budgets for next year,’” said Lazar. Those budgets include spending based on a worst-case scenario, where financial markets tank and spending is “cut to the bone”; a “keep everything steady” budget; and, finally, an optimistic budget where economies are booming, and companies undertake a “massive expansion” in spending, he said. ... “The need to empower collaboration in the enterprise has clearly been a lesson learned in 2020,” said Wayne Kurtzman, research director for collaboration at IDC. “2021 is the time to improve it, often through software integrations and making [collaboration software] part of the core IT stack and enabling all workers.” A 451 Research survey report (Voice of the Enterprise: Workforce Productivity & Collaboration Technology Ecosystems 2020), which tracks planned corporate technology purchasing in the first half of 2021, paints a similar picture, with collaboration spending largely protected as businesses tighten other areas of their IT budgets.


How Will Biden Administration Tackle Cybersecurity?

While Biden and his transition team have not yet released specific cybersecurity policies, he recently noted that it "may take billions of dollars to secure our cyberspace" over the next several years, and that those responsible for the SolarWinds hack "can be assured that we will respond and probably respond in kind." Besides SolarWinds, the new Biden administration will face a host of other issues, including the state of CISA, which has seen its leadership hollowed out following the post-election firing of former Director Christopher Krebs. Cybersecurity experts and analysts agree that how Biden and his administration address these issues in the first critical weeks is likely to set the tone for the next four years, as the nation faces security obstacles both foreign and domestic. Their suggestions range from filling key leadership posts, such as the CISA director, to confronting overseas adversaries to building deeper relationships with the private sector. As the events surrounding the SolarWinds breach continue to unfold, how the Biden administration responds during its first few weeks will likely shape a large portion of the White House's cybersecurity policy going forward, says Phil Reitinger.


Quote for the day:

"Every day is a NEW beginning, take a deep breath and START AGAIN." -- Unknown