Daily Tech Digest - January 06, 2021

Making CI/CD Work for DevOps Teams

The most fundamental people-related issue is having a culture that enables CI/CD success. "The success of CI/CD [at] HealthJoy depends on cultivating a culture where CI/CD is not just a collection of tools and technologies for DevOps engineers but a set of principles and practices that are fully embraced by everyone in engineering to continually improve delivery throughput and operational stability," said HealthJoy's Dam. At HealthJoy, the integration of CI/CD throughout the SDLC requires the rest of engineering to closely collaborate with DevOps engineers to continually transform the build, testing, deployment and monitoring activities into a repeatable set of CI/CD process steps. For example, they've shifted quality controls left and automated the process using DevOps principles, practices and tools. Component provider Infragistics changed its hiring approach. Specifically, instead of hiring experts in one area, the company now looks for people with skill sets that meld well with the team. "All of a sudden, you've got HR involved and marketing involved because if we don't include marketing in every aspect of software delivery, how are they going to know what to market?" said Jason Beres, SVP of developer tools at Infragistics.


How DNS Attack Dynamics Evolved During the Pandemic

The complexity of the DNS threat landscape has grown in the wake of COVID. According to Neustar’s “Online Traffic and Cyber Attacks During COVID-19” report, there was a dramatic escalation in the number of attacks and their severity across virtually every measurable metric from March to mid-May 2020 – particularly DNS-related attacks. That’s not surprising given the sharp rise in DNS queries from employees working from home. Whereas business networks tend to be relatively secure and protected by experienced security professionals, home routers are often set up by less security-savvy employees, and are therefore more vulnerable to DNS exploits. Hackers are taking advantage of this vulnerability using a technique called DNS hijacking. They gain access to unsecured home routers and change the devices’ DNS settings. Users are then redirected to malicious sites and unwittingly give away sensitive information like credentials, or permit attackers to remotely access their company’s infrastructure. Neustar has seen a dramatic rise in this type of attack since the onset of the pandemic. Given that many home networks remain exposed, this problematic trend is poised to continue well into 2021. Similar, simpler techniques are also becoming more prevalent.
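For illustration, a rough detection sketch (not from the Neustar report; the watch list and trusted resolver are assumptions) is to resolve the same names through the router-assigned DNS server and through a known-good public resolver, and flag disagreements:

```python
# Compare the router-assigned resolver's answers against a trusted public
# resolver; persistent disagreement on sensitive domains can indicate that
# the router's DNS settings were hijacked. Requires: pip install dnspython
import dns.resolver

TRUSTED = "1.1.1.1"                        # known-good public resolver
DOMAINS = ["example.com", "example.org"]   # hypothetical watch list

def answers(domain, nameserver=None):
    resolver = dns.resolver.Resolver()     # default: DHCP/router-assigned DNS
    if nameserver:
        resolver.nameservers = [nameserver]
    return {rr.to_text() for rr in resolver.resolve(domain, "A")}

for domain in DOMAINS:
    local, trusted = answers(domain), answers(domain, TRUSTED)
    # Note: CDNs can legitimately return different IPs per resolver, so a
    # mismatch is a signal to investigate, not proof of compromise.
    if local != trusted:
        print(f"WARNING: {domain} resolves differently: {local} vs {trusted}")
```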


Top 12 IoT App Trends to Expect in 2021

Automation requirements are everywhere, including in industry, and IoT is catering well to all of them. In industry, IoT has mainly been collecting and analyzing data and work routines from various devices and systems, and automating their operation. Initially, the role of this technology was limited to increasing overall work efficiency and operations management through rationalization, automation, and applicable system maintenance in the manufacturing sectors, mainly within a smart factory environment. Going forward, IoT is touted to cross $123 billion in its industrial vertical alone. The technology is set to help industries with the optimization of work procedures, intelligent manufacturing and smart industry, asset performance management, industrial control, and the move towards an on-demand service model, amongst others, even for cross-industry scenarios in the coming times. It is also set to revamp the ways services are provided to customers and to create newer revenue models. It has been actively promoting and enhancing industrial digital transformation.


The dawn of ‘Fintech 3.0’?

“What we’re seeing is ecommerce moving up and down the value chain,” says Brear. “I don’t really know which one of the three credit cards I have is linked to Amazon. But I know, when I press that Amazon button, all of the fulfilment is done really well. Amazon is moving down that stack into the financial services space, and giving me three-to-four per cent cashback. Why would I not do that? “Universal banking as a principle was predicated on cross- and upselling, where banks were relying on the primacy of their customer relationship, and selling them 2.3 or 2.4 products, on average, to make the system work, from a profitability perspective. But, we’re now seeing that customer ‘ownership’ being unbundled and shared between other providers, whether Amazon or players like Snoop. They’re provoking customers into moving, and making it really easy for them to do so. “That’s the really scary thing. We’ve seen this play out in other industries – mobile network operators are a great example, because the consumer doesn’t care what that logo in the corner of the iPhone is now, they just care that it’s an iPhone. The networks have commoditised themselves into providing them with data and coverage, which every one of them does, so it doesn’t really matter [who they go with].


Why you should make cyber risk a business gain, not a loss

In a progressive approach to risk, compliance specialists come together with IT security and operations to improve posture and compliance across the organization. In theory, that means gathering and analyzing data on the regulatory environment, security and privacy, and configuration management at one time. Only through that deep level of operational alignment can true technology risk management take place. To do that effectively, we have to start by thinking of risk as something to gain, not to lose. In this view, risk becomes a window through which organizations can assess their health as it relates to operations, security and regulatory status—a view of the organization over time. ...  Many IT teams start their risk assessments by making decisions based on data from multiple products and discrete tasks. Unfortunately, this can result in a time-consuming process of reconciling these systems. ... Once data is gathered, it’s analyzed and categorized into various risk categories. Ideally, this is done continuously, not as a once-a-year effort. Infrequent assessments will fail to provide a clear and current picture of the organization’s risk posture. ... Once analysis is signed off, organizations should be well positioned to recommend or perform remediation actions to mitigate their risks.


What is a DataOps Engineer?

DataOps engineers’ holistic approach to the data development environment separates them from other technical team members. At CHOP, data engineers mostly work on ETL tasks while analysts serve on subject matter teams within the hospital. Mirizio, on the other hand, works on building infrastructure for data development. Some of his major projects have included building a metric platform to standardize calculations, creating an adaptor that allows data engineers to layer tests on top of their pipelines, and crafting a GitHub-integrated metadata catalogue to track document sources. On a day-to-day basis, he provides data engineers with guidance and design support around workflows and pipelines, conducts code reviews through GitHub, and helps select the tools the team will use. Prior to the creation of his position, CHOP’s data team relied on human beings to manually check Excel spreadsheets to ensure everything looked okay, engineers emailed proposed changes to code and metadata back and forth, and the lack of shared definitions meant different pipelines delivered conflicting data. Now, thanks to Mirizio, much of that process is automated and tools like Jira, GitHub, and Airflow help the team maintain continuous, high-quality integration and development.
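To make the pattern concrete, here is a hypothetical Airflow sketch (the DAG name, tasks and the check itself are invented, not Mirizio's actual code) of layering an automated quality test on top of a pipeline, in place of manual spreadsheet checks:

```python
# A minimal Airflow 2.x DAG: the quality check runs automatically after
# every load, failing the run (and alerting) instead of waiting for a
# human to spot a broken extract in a spreadsheet.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    ...  # the ETL step the data engineers already own

def check_row_counts():
    # Stand-in for a real test; in practice this would query the warehouse.
    loaded_rows = 1000
    assert loaded_rows > 0, "extract produced no rows"

with DAG("patient_metrics",            # hypothetical pipeline name
         start_date=datetime(2021, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:
    load = PythonOperator(task_id="extract_and_load",
                          python_callable=extract_and_load)
    test = PythonOperator(task_id="quality_check",
                          python_callable=check_row_counts)
    load >> test  # tests are layered on top of the pipeline
```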


Unlocking Your DevOps Automation Mindset

Today, enterprises are shifting from waterfall to agile, with weekly and daily releases. My belief is that every enterprise needs to adopt a 100% agile methodology, just like BMW did. Testing and continuous integration/continuous delivery (CI/CD) are key for deploying code in small chunks and reducing merge issues and refactoring effort. Ultimately, this increases developer velocity and decreases lead time. The shift from a partial to a 100% agile model requires more than simply senior leadership’s resolve. It needs a dedicated pool of certified DevOps automation consultants, coaches and subject matter experts with experience in SAFe, LeSS, Scrum and Kanban frameworks. Best-in-class enterprise and OSS toolchains that cater to DevSecOps, service meshes and omnichannel apps are essential. Simultaneously, agile-based delivery coaching, audits and continuous support for existing and new delivery teams are a must. While DORA metrics can serve as a good measure of an enterprise’s DevOps performance, businesses will need tools to assess DevOps maturity, improve developer productivity and provide specific recommendations for improvement. Data will play an important role in decision making and aid every developer’s performance, more than at any time in the past.
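For readers unfamiliar with DORA metrics, here is a minimal sketch (the deploy records are invented for illustration) of computing two of them, deployment frequency and lead time for changes:

```python
# Two of the four DORA metrics from a list of deploy records: how often we
# ship, and how long a change takes to travel from commit to production.
from datetime import datetime
from statistics import mean

deploys = [  # hypothetical records: commit time -> production time
    {"committed": datetime(2021, 1, 4, 9, 0),  "deployed": datetime(2021, 1, 4, 15, 0)},
    {"committed": datetime(2021, 1, 5, 10, 0), "deployed": datetime(2021, 1, 5, 11, 30)},
    {"committed": datetime(2021, 1, 5, 13, 0), "deployed": datetime(2021, 1, 6, 9, 0)},
]

window_days = 7
frequency = len(deploys) / window_days  # deployment frequency, per day
lead_hours = mean(
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deploys
)  # lead time for changes

print(f"Deployment frequency: {frequency:.2f} deploys/day")
print(f"Average lead time: {lead_hours:.1f} hours")
```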


5G, behavioural analytics & cyber security: the biggest tech considerations in 2021

With transmission speeds reaching ten gigabits per second, and latency four to five times lower than 4G's, 5G will first and foremost revolutionise IoT and innovative new edge computing services. With this comes the potential for the wider adoption of driverless cars and the remote control of complex industrial machinery, to name but two applications. These examples, however, are just the headlines. Behind the scenes, 5G holds huge potential for businesses across all sectors looking to ramp up their digital capabilities. Lower latency and greater bandwidth mean that the finance and retail industries can perform data analytics in real time, paving the way for AI to power bespoke customer service experiences. Similar applications will be seen in the manufacturing and transportation sectors, where faster information gathering and enhanced IoT offer both safer and faster execution of services. An even bigger area of flux is the relationship between IT and the workplace. Last year’s shift to remote working was one of the biggest occupational overhauls in recent memory, and as it stands, more than four-fifths of the global workforce is ruling out a full-time return to the office, creating new priorities for CIOs.


Top Considerations When Auditing Cloud Computing Systems

Securing data in your cloud environments comes with unique challenges and raises a new set of questions. What’s the appropriate governance structure for an organization’s cloud environments and the data that resides within them? How should cloud services be configured for security? Who is responsible for security, the cloud service provider or the user of that cloud service? Cloud compliance is becoming front of mind for organizations of all sizes. Smaller companies with limited staff and resources tend to rely more on cloud vendors to run their businesses and to address security risks (we’ll get into why this is a bad idea later in this article). Often roles will overlap, with team members wearing many hats in smaller operations. Larger enterprises frequently keep more security and compliance duties in-house, using vast resources to create individual teams for threat hunting, risk management, and compliance/governance programs. Regardless of size, the challenge of balancing security and business objectives looms large for all companies. Security must be built around the business, and Jacques accurately describes the nature of the relationship: “Security is always a support function around your business.”


Every CIO now needs this one secret 'superpower'

"Emotional intelligence is something we define as self-awareness, self-management and relationship management," Rob O'Donohue, senior director analyst at Gartner, who worked on the report, told ZDNet. "With emotional dexterity, it's the next level. You have the ability to adapt and adjust to challenges from a soft-skills, emotional perspective." Historically, said O'Donohue, CIO roles have tended to focus on technical skills rather than emotional ones. But as the COVID-19 pandemic swept through the world, forcing entire organizations to switch to remote working overnight, IT teams were in the spotlight as they worked relentlessly to keep businesses afloat. "This put CIOs in a position where they needed to keep a hands-on, door-open policy, and show themselves as a leader that is willing to listen," said O'Donohue. This is where emotional skills came in handy – not only to support employees, but first and foremost to better manage the crisis from a personal point of view. O'Donohue's research, which surveyed CIOs working directly throughout the crisis, showed that those who self-scored above average on performance metrics over the past year were also more likely to cite daily commitments to self-improvement and self-control practices that helped them weather the crisis.



Quote for the day:

"Your first and foremost job as a leader is to take charge of your own energy and then help to orchestrate the energy of those around you." -- Peter F. Drucker

Daily Tech Digest - January 05, 2021

IoT adds smarts to IT asset monitoring

The market for IoT tools that can monitor IT assets (as well as many other devices) has attracted major technology vendors including Cisco, Dell, HPE, Huawei, IBM, Microsoft, Oracle, SAP, and Schneider Electric, along with IoT specialists including Digi, Gemalto, Jasper, Particle, Pegasystems, Telit, and Verizon. IoT is often deployed in existing physical systems to increase the contextual understanding of the status of those systems, says Ian Hughes, senior analyst covering IoT at research firm 451 Research. "Compute resources tend to already have lots of instrumentation built in that is used to manage them, such as in data centers," he says. Companies can use IoT to provide additional information about the physical infrastructure of a building such as heating, ventilation, and air conditioning (HVAC) systems, Hughes says. Data centers would tend to need building- and environmental-related IoT equipment, to measure environmental conditions and possible security threats, he says. As with any IoT rollout, preparation is key. "Some approaches yield too much data, or non-useful content," Hughes says. "So understanding the context for measurement is important."


Three ways formal methods can scale for software security

FM is a type of mathematical modelling where the system design and code are the subjects of the model. By applying mathematical reasoning, FM tools can answer security questions with mathematical certainty. For example, FM tools can determine whether a design has lurking security issues before implementation begins; show that an implementation matches the system design; and prove that the implementation is free of introduced defects such as low-level memory errors. That certainty distinguishes FM from other security technologies: unlike testing and fuzzing, which can only trigger a fraction of all system executions, an FM model can examine every possible system behavior. Like machine learning, the roots of formal methods lie in the 1970s, and also like machine learning, recent years have seen rapid adoption of FM technologies. Modern FM tools have been refined by global-scale companies like Microsoft, Facebook, and Amazon. As a result, these tools reflect the engineering practices of these companies: rapid pace of iteration, low cost of entry, and interoperability between many complementary tools.
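As a small, self-contained taste of that certainty (an illustrative example, not one from the article), the Z3 SMT solver's Python bindings (pip install z3-solver) can prove a property for every one of a program's 2^32 possible inputs, which no amount of testing or fuzzing can match:

```python
# Prove that a branchless absolute-value trick matches its obvious
# specification for ALL 2^32 signed 32-bit inputs, not a sampled fraction.
from z3 import BitVec, If, prove

x = BitVec("x", 32)          # a symbolic 32-bit integer
mask = x >> 31               # arithmetic shift: all 1s if negative, else 0

spec = If(x >= 0, x, -x)     # the specification: plain absolute value
impl = (x ^ mask) - mask     # the branchless implementation under test

# prove() reports "proved" only if impl == spec holds for every input;
# unlike testing, no system behavior is left unexamined.
prove(impl == spec)
```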


Why IATA is banking on cloud to help the airline industry weather the Covid-19 crisis

“The Covid-19 crisis is impacting the way we are responding and means we have to adjust our resources to what we can afford at the moment,” he says. “Our team understands that we need to change the way we are working to avoid wasting time and resources.” By his own admission, running an airline in 2020 is a very different business to what it was in 2019, and this, in turn, has created an additional need for new, artificial intelligence-based predictive models that factor in the impact of the pandemic. “Now our airlines are asking us [if we can] use the data for the last month to tell us what will happen in the next three months, and that means we have to build new predictive models,” he says. “We have to use technology like artificial intelligence, we have to use a lot of innovation and we need an environment that will allow us to do that.” It is worth noting that when this body of work began, around 60% of the organisation’s IT footprint was already in the Amazon Web Services (AWS) cloud, but there was definite room for improvement with regard to how that environment was being managed and used, says Buchner. “The way we were using AWS in the past is different from the way that we want to use it today. ...”


Modern Operations Best Practices From Engineering Leaders at New Relic and Tenable

Beyond the technical challenges of creating RCAs, there is a human layer as well. Many organizations use these documents to communicate about incidents to the customers involved. However, this may require adding a layer of obfuscation. Nic shares, “The RCA process is a little bit of a bad word inside of New Relic. We see those letters most often accompanied by ‘Customer X wants an RCA.’ Engineers hate it because they are already embarrassed about the failure and now they need to write about it in a way that can pass Legal review.” Dheeraj agrees, and believes that RCAs should have value to the customers reading them. “Today, the industry has become more tolerant of accepting the fact that if you have a vendor, either a SaaS shop or otherwise, it is okay for them to have technical failures. The one caveat is that you are being very transparent with the customer. That means that you are publishing your community pages, and you have enough meat in your status page or updates.” If Legal has strict rules about what is publishable, RCAs can still be valuable. “We try to run a meaningful process internally. I use those customer requests as leverage to get engineering teams to really think through what's happened.”


What the critics get wrong about serverless costs

There are a few main areas where people misunderstand serverless costs. They often exclude the total cost of running services on the web. This includes the personnel requirements and the direct payments to the cloud provider I just discussed. Other times, they build bad serverless architectures. Serverless, like cloud, is not a panacea. It requires knowledge and experience about what works and what doesn't -- and why. If you use serverless correctly, it shifts significant costs to the cloud provider. They keep your services running, scaling up and down, and recovering from hardware, software and patching failures. Most companies that run mission-critical web applications and/or APIs have operations staff who do exactly this. This is not to say that adopting serverless means putting people out of work. Charity Majors, co-founder and CTO of Honeycomb, wrote a great article on how operations jobs are changing rather than going away. But if you can hand off patching operating system and software vulnerabilities to a cloud provider, then the people on your staff who previously handled those tasks become available for more strategic and differentiated tasks for your organization. There also seems to be a shocking number of people who try to build something with serverless without fully understanding the technology first.


Hack your APIs: interview with Corey Ball - API security expert

In Corey’s opinion, because most APIs are primarily used/consumed by developers and machines, they often get overlooked during security assessments. Compounding this problem, many organizations would struggle to actually list all the APIs they have on their systems. Worse still, because APIs are so varied, they’re difficult to scan. Even within a single organization, similar-looking endpoints could have completely different specifications from one another. Corey points out that many vulnerability scanners lack the features to properly test APIs, and are consequently bad at detecting API vulnerabilities. If your API security testing is limited to running one of these scanners, and it comes back with no results, then you run the risk of accepting false negatives. You can see the results of this in the news. The 2018 USPS incident (above) happened because security was simply not taken into consideration during an API’s design. A researcher was able to compromise the USPS application’s security using trivial methods, despite a vulnerability assessment having been carried out a month beforehand. The assessment had failed to spot the glaring issue. ... You can define business logic vulnerabilities as “deliberately designed application functionality that can be used against the application to compromise its security”.
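To make the scanner blind spot concrete, consider a hypothetical check for broken object level authorization, one of the most common API business logic flaws (the endpoint, IDs and token below are invented):

```python
# If user A's token can fetch an order that belongs to user B, the endpoint
# trusts the identifier instead of checking ownership. The responses look
# perfectly "valid" to a generic scanner, which is why such flaws slip by.
import requests

BASE = "https://api.example.com"      # hypothetical API under test
USER_A_TOKEN = "token-for-user-a"     # credentials for user A only

def can_read_foreign_order(own_id: int, foreign_id: int) -> bool:
    headers = {"Authorization": f"Bearer {USER_A_TOKEN}"}
    own = requests.get(f"{BASE}/orders/{own_id}", headers=headers)
    foreign = requests.get(f"{BASE}/orders/{foreign_id}", headers=headers)
    return own.status_code == 200 and foreign.status_code == 200

if can_read_foreign_order(1001, 1002):
    print("Possible broken object level authorization (BOLA)")
```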


2021 Cybersecurity Trends: Bigger Budgets, Endpoint Emphasis and Cloud

Upheaval in staffing needs and continued dependence on a remote workforce will create a fertile attack vector for criminals looking to exploit insider threats. Forrester researchers believe the remote-workforce trend will drive an uptick in insider threats. They explain that 25 percent of data breaches are already tied to insider threats, and in 2021 that percentage is expected to jump to 33 percent. Forcepoint warns of the growth in 2021 of an “insider-as-a-service” model, which it describes as the organized recruitment of infiltrators, who offer bad actors highly targeted means of becoming trusted employees in order to gather sensitive IP. “These ‘bad actors,’ literally, will become deep undercover agents who fly through the interview process and pass all the hurdles your HR and security teams have in place to stop them,” said Myrna Soto, chief strategy and trust officer for Forcepoint. Endpoint security issues are among the most challenging today, and will remain so tomorrow. Inboxes are the chink in the armor of the security front lines, often the perfect vector for ransomware attacks, business email compromise scams and malware infection, according to a CrowdStrike analysis of the challenges. Moving forward, researchers warn that enterprises should expect a “major increase” in spear phishing attacks in 2021, due to automation.


How the CTO can drive the enterprise’s shift to the cloud

The rapid rise of technology means that the CTO is no longer seen as just running a business cost centre, but as someone with the potential to generate increased revenue. One key ally for the CTO can be the CFO — to help them understand the difference in moving from a capex model to an opex one. The cloud and related services certainly make an attractive business case, with fewer sunk costs and investments into expensive hardware. However, billing in the cloud space isn’t always as transparent as many CFOs might imagine, and re-structuring budgets and reporting will take time. For CTOs, all of the above will often require a mindset shift and a change in responsibility. ... In a global business environment, there’s an expectation that you can replicate, launch and relaunch your business anywhere on the planet. The reality is often far from this. CTOs need to be actively aware of potential pitfalls in plans to operate around the world, and of the limitations of the cloud. These can range from data regulations preventing part of your app from working, to barriers that stop your services operating at an acceptable speed, to regional technology skills gaps that mean your onboarding costs will be excruciatingly high.


What experts say to expect from 5G in 2021

5G and open networking will likely be a successful pair, Nolle wrote in a CIMI blog post, because operators are guaranteed to deploy 5G even though it is unlikely to provide much revenue for them in 2021. As a result, 5G and any technology associated with it could have a sufficient financial life span. If operators want to head in the direction of open networking, they can pair their 5G timeline with their open network plans to ensure those plans get funding in the future. "When you're looking at operator technology initiatives, it's not the brilliance of the technology that matters, but how well the technology is funded," Nolle wrote. "Nobody questions 5G funding credibility for 2021, period. That makes 5G almost unique, and that makes things that are tied to 5G automatic concept winners." However, the potential for open models also forces operators to consider 3rd Generation Partnership Project standards for radio access and core networks, so operators don't start to deploy an open 5G network and, for any reason, have to reverse it or not fully deploy the open model. If operators conform to official standards, they can gradually implement an open model on a per-element basis, Nolle wrote. This could provide more flexibility and potentially lead to more widespread use of open networking models.


How the pandemic has affected women in the tech sector

It is important that employers understand the difference between remote and flexible working, and enable the latter to happen, points out Merici Vinton, founder and CEO of Ada’s List, a global community for women in tech. “It’s about the perception that people aren’t doing the work if they’re doing different hours, when really the important thing is outcomes and that the work gets done,” she says. “Enabling effective flexible working is about understanding the full picture of the employee experience while working at home.” Another problem relates to the risk of unconscious bias being compounded if people operate remotely, which can have a negative impact on their chances of career progression. A key challenge here, according to Rebecca George, president of BCS, is that “it’s easier for discriminatory behaviour to go unnoticed, or unchecked”. “Research has highlighted that managers often give ground to those who look like themselves, and with networking opportunities thin on the ground, it’s possible that without care and special attention, some people may have to work twice as hard to receive the opportunities and recognition they deserve,” she says.



Quote for the day:

"Great leaders go forward without stopping, remain firm without tiring and remain enthusiastic while growing" -- Reed Markham

Daily Tech Digest - January 03, 2021

Recommendations By Artificial Intelligence Vs. Humans: Who Will Win?

When pitted against recommendations by humans, AI does not necessarily always come out on top. It is true that data-driven recommendations are generally preferred; however, the willingness to accept recommendations from humans versus artificial intelligence differs by situation and use case. It all stems from the ‘word-of-machine effect.’ Recently, an article on “When Do We Trust AI’s Recommendations More Than People’s?” by University of Virginia’s Darden School of Business Professor Luca Cian and Boston University’s Questrom School of Business Professor Chiara Longoni was published in the Harvard Business Review. In the article, they explained this phenomenon as a widespread belief that AI systems are more competent than humans in dispensing advice when utilitarian qualities are desired, and less competent when hedonic qualities are desired. The authors clarify that this doesn’t imply that artificial intelligence is actually more competent than humans at assessing and evaluating hedonic attributes, nor that humans are in the case of utilitarian attributes. As per their experiment results, if someone is focused on utilitarian and functional qualities then, from a marketer’s perspective, the word of a machine is more effective than the word of human recommenders.


5 Unusual SEO Tactics That Will Boost Your Performance

Conversions occur when a visitor to your site completes a desired action or goal. That could be anything from making a purchase to signing up for your newsletter – you get to set the parameters for your conversion goals. When building out your pages, it’s important to keep these goals in mind in conjunction with your SEO strategy. Conversion goals and strategy should vary between organic and ad landing pages. However, you can learn from both marketing strategies. When building out a landing page, be sure to tailor it to a specific purpose. If you intend to use it for ads, it’s important to clearly display the information you advertised would be there. Likewise, if you’re optimizing a landing page for organic traffic, be sure that your content matches what you signal to search engines is there. Then, compare results! A landing page can serve ads and SEO in tandem, but only if you do it right. If you start noticing that your SEO traffic is converting much better than your ads, then maybe it’s not the ideal landing page for your ads budget. But if the landing page is meant to serve both ads and SEO and SEO isn’t converting well at all, rethink your strategy. Why? Aside from the fact that you need to know where high-converting traffic comes from, Google is already aware of your stats.


Where Are The Self Driving Robotaxis Of India

When it comes to self-driving in India, there are only a handful of startups. Amongst these startups, those genuinely working on fundamental research are even fewer. According to Sanjeev Sharma, founder of Swaayatt Robots, solving self-driving requires fundamental research in the fields of theoretical computer science and applied mathematics. Although there are over 300 startups globally, most of the companies are working on DMS and ADAS (advanced driver assistance systems). This is only one small piece of the autonomous driving problem. There are actually three bigger problems to solve — perception, planning, and localisation. If one tries to solve the problem very accurately, which is what most companies are doing, the challenge would be to minimise the computation time. ... The ugly truth is that self-driving technology is a tough nut to crack. We are at least five years away from even witnessing level 3 autonomy on roads. India has some of the toughest roads in the world. Models that work well on the relatively empty roads of the United States will falter on the crowded roads of Bengaluru or Delhi. So, this is not a problem exclusive to India. The world is yet to figure out self-driving tech.


Why Banks’ Digital Sales Efforts Still Aren’t Working

While the industry earned many kudos for pushing through so many Paycheck Protection Program loans as quickly as it did, D’Acierno says that experience also underscores the lack of digital readiness at most institutions. PPP was a relatively cookie-cutter program, but getting applications completed and processed remotely took tremendous handholding and manual labor at many institutions, he explains. Few business owners interested in PPP assistance could find an Amazon-style customer experience, D’Acierno says. “Ideally, digital should be an easier channel,” says D’Acierno, “but the downside of digital is that the customer is just one click away from giving up and saying, ‘You’ve just made this too hard for me’.” Finding another potential bank or credit union is as close as a quick Google search, he points out. Solving the digital sales challenge is a practical matter, not an academic one. While they tend to have narrower product lines, direct banks and fintechs routinely operate, seamlessly, where many mainstream banks haven’t been able to go. The problem: consumers and businesses can obtain extensive online services from these newcomers and from nonfinancial companies, so the bar is higher for digital sales.


Data-driven 2021: Predictions for a new year in data, analytics and AI

George Fraser, CEO of Fivetran, says "I think 2021 will reveal the need for data lakes in the modern data stack is shrinking," adding that "...there are no longer new technical reasons for adopting data lakes because data warehouses that separate compute from storage have emerged." If that's not categorical enough for you, Fraser sums things up thus: "In the world of the modern data stack, data lakes are not the optimal solution. They are becoming legacy technology." Data lake supporters are even more ardent. In a prediction he titled "The Data Lake Can Do What Data Warehouses Do and Much More," Tomer Shiran, co-founder of Dremio, says "data warehouses have historically had...advantages over data lakes. But that's now changing with the latest open source innovations in the data tier." He mentions Apache Parquet and Delta Lake as two such innovations, as well as the lesser-known projects Apache Iceberg and Nessie. Together, these projects allow data to be stored in open, columnar formats across file systems, versioned, and processed with transactional consistency. Martin Casado, General Partner of Andreessen Horowitz, put it this way: if you look at the use cases for data lakes vs. data analytics, it's very different.


Now AI is Knocking On The Doors of Luxurious Hotels

AI in hospitality and tourism is still a new development that holds prospects for new earning models. Though chatbots exist, they can be taken to a new level. High-grade chatbots can effectively reduce the cost of hiring personnel. Combining AI with the right data mining and acquisition tools is essential for hotels to learn as much information about tourists and vacationers as possible. This way, hoteliers can tailor their experiences to meet specific individual needs. AI will be able to sort through big data faster and automate actions based on deduced inference. Hoteliers can integrate mobile booking and hotel recommender engines with several other event booking software packages. This idea provides a “one-stop shop” where event attendees can book events, get hotel recommendations and book spaces, all within the same application. This solution will drive up booking numbers in no time and will bring mobile bookings closer to those who need them the most. Ultimately, the task of collecting and analyzing data will be streamlined by technology that is smart enough to make well-planned choices about guest behavior and characteristics. Incorporating artificial intelligence to solve user demands in the hospitality industry is a quantum leap forward in terms of implementable technologies.


How Will 5G Influence Healthcare Cybersecurity

While 5G is generally accepted to be more secure than the 4G we use now, the technology still poses a few notable risks. In November 2019, a joint research initiative between security researchers at Purdue University and the University of Iowa revealed an incredible 11 significant vulnerabilities in studied 5G networks. The study noted that these security lapses could allow bad actors to surveil and disrupt device operations — or even launch falsified emergency alerts. These findings are troubling both for the risks they highlight and because they prove that the vulnerabilities 5G was meant to resolve are still an ongoing problem. Equally problematic is the ease with which these security holes can be abused. As a writer for TechCrunch noted in an article on the study, researchers “claimed that all the attacks could be exploited by an adversary with a practical knowledge of 5G and 4G networks and a low-cost software-defined radio.” All this said, cybersecurity in the 5G era does warrant some optimism. Because next-gen wireless tech is designed with network slicing in mind (i.e., organizing several isolated virtual networks within an overarching physical infrastructure), it will be harder for bad actors to access the broader system. Slicing also allows for better privacy, because information isn’t shared across isolated “slices,” and for better tailoring, because organizations can apply different policies across varying inner networks.


Resilience As A Competitive Advantage

The growing focus on resilience will likely follow the same trajectory we saw with security and privacy. In the 1980s and ’90s, computer security was an occasional irritant. Attacks, however, became more frequent, sophisticated and devastating, to the point where commerce froze and real money was stolen. Security became centralized and automated, and users became more vigilant. Similarly, privacy was initially treated as a concept that would blow over. “You have zero privacy anyway. Get over it,” joked Sun Microsystems co-founder and CEO Scott McNealy in 1999. By 2020, privacy had become one of the top concerns of consumers, investors, employees and regulators — and a difficult challenge for some of the top companies in health and technology. The increasing damage being inflicted by extreme weather, and actions such as forced power outages in California, have begun to compel us to confront our relative lack of preparedness. Covid-19 has further underscored this and made the idea of investing for unforeseen risks less of a sunk cost and more of a necessity — it has given shape, substance and urgency to worst-case-scenario planning. Three of the primary technologies for improving resilience will likely be AI, IoT and 5G.


Farewell to Flash

As the standardisation of HTML5 and supported media formats grew, the advantages of Flash for providing video declined, until it was primarily used for interactive games and some interactive applications. However, Flash suffered from the same issues that had meant the JVM didn't take off in browsers a decade earlier; constant updates for security vulnerabilities meant that Adobe Flash was the primary cause of CVEs and infections in web browsers. To be fair to both Flash and the JVM, downloading programs from the internet is always going to be a vector for vulnerabilities, and the security of a remote system is always going to be as good or bad as the implementation – and as the complexity of those runtimes grew, particularly in unmanaged languages like C++, the danger was real. Even today, bugs in image rendering pipelines or font decoding are the primary cause of vulnerabilities in browsers. Flash's demise started with Steve Jobs' 2010 post "Thoughts on Flash" (web archive link); Jobs had launched the iPhone in 2007 with 'always on' internet connectivity.



Quote for the day:

"Authority without wisdom is like a heavy axe without an edge, fitter to bruise than polish." -- Anne Bradstreet

Daily Tech Digest - January 02, 2021

What the hell is an AI factory?

Here’s how the AI factory works. Quality data obtained from internal and external sources train machine learning algorithms to make predictions on specific tasks. In some cases, such as diagnosis and treatment of diseases, these predictions can help human experts in their decisions. In others, such as content recommendation, machine learning algorithms can automate tasks with little or no human intervention. The algorithm– and data-driven model of the AI factory allows organizations to test new hypotheses and make changes that improve their system. This could be new features added to an existing product or new products built on top of what the company already owns. These changes in turn allow the company to obtain new data, improve AI algorithms, and again find new ways to increase performance, create new services and product, grow, and move across markets. “In its essence, the AI factory creates a virtuous cycle between user engagement, data collection, algorithm design, prediction, and improvement,” Iansiti and Lakhani write in Competing in the Age of AI. The idea of building, measuring, learning, and improving is not new. It has been discussed and practiced by entrepreneurs and startups for many years.
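As a toy illustration of that virtuous cycle (synthetic data, not an example from the book), the sketch below retrains a model on a data pool that doubles each cycle, the way improved predictions are meant to attract the engagement that yields more data:

```python
# Each cycle: train on the data collected so far, measure prediction
# quality, then grow the pool. This is the AI-factory feedback loop in
# miniature, with entirely synthetic "user" data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(10_000, 5))
y_pool = (X_pool @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) > 0).astype(int)

n = 100  # seed data from the first users
for cycle in range(5):
    model = LogisticRegression().fit(X_pool[:n], y_pool[:n])  # retrain
    acc = model.score(X_pool, y_pool)                         # predict, measure
    print(f"cycle {cycle}: accuracy {acc:.3f} with {n} samples")
    n *= 2  # better product brings more engagement, hence more data
```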


SaaS: The Dirty Secret No Tech Company Talks About

The dirty little secret I have found is that, in most cases, this promised state just isn’t the case. The more SaaS companies I’ve seen, the more I’ve witnessed great companies forced to become service businesses to scale. Having a services team isn’t bad; it can even produce a lot of benefits for customers. But many times it ends up being necessary in SaaS. As with all things that involve consultants, it’s going to take longer and cost more to get your product(s) live. Put frankly, this process sucks, and it’s not the SaaS dream. Especially today, when organizations need to do more with less, adding heads just to get your product live seems like another problem to deal with, not a solution. SaaS products were supposed to be delivered via the cloud almost instantly. The same SaaS product was going to work for every customer, and once we built a brand, it was gonna be glorious. WTF happened?! I grew just as frustrated as some of you likely are. As part of the founding team at Behance, I lived this myself. We built a beautiful portfolio-sharing platform employed by millions of people, which we eventually sold to Adobe. Our platform became the engine that powered portfolios for design institutions including the Rhode Island School of Design (RISD), Savannah College of Art and Design (SCAD), School of Visual Arts (SVA), and the American Institute of Graphic Arts (AIGA), among others.


Top 7 NLP Trends To Look Forward To In 2021

With advances in NLP and increasing demands in customer service, one can expect major strides towards next-gen bots that can hold complex conversations, self-improve, and learn how to carry out tasks they have not previously been trained on. Due to the rise in remote working in 2020, there has also been a tremendous increase in customer support tickets across industries. Dealing with the increased ticket volume and providing quick responses to urgent queries has become a major task. One can expect the integration of NLP tools with help desk software to perform tasks such as tagging and routing of customer support requests, reserving human intervention for higher-value tasks. The success of automated machine learning, or AutoML, in effectively dealing with real-world problems has prompted researchers to develop more automation and no-code tools and platforms. One such area is automation in natural language processing. With AutoNLP, users can build models like sentiment analysis with just a few basic lines of code. This encourages wider participation in the machine learning community, earlier thought to be restricted to just developers and engineers.
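While AutoNLP tooling varies, the low-code spirit is easy to demonstrate with Hugging Face's pipeline API (an analogous open-source route, not AutoNLP itself): a working sentiment classifier in a few lines.

```python
# Requires: pip install transformers (plus a backend such as PyTorch).
# pipeline() downloads a default pretrained sentiment model on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("The support team resolved my issue in minutes!"))
# Expected shape of output: [{'label': 'POSITIVE', 'score': 0.99...}]
```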


AI Is Reengineering All Aspects Of Our Human Experience: What Are The Implications?

We have come together to fight Covid-19, and AI was a key enabler in bringing vaccines to market, in unprecedented clinical trial R&D timeframes, to eradicate this virus and help us get back to a more interactive global community where we can freely travel, visit our favourite restaurants and shop with more access in our local retail stores. This is an excellent example of AI being used for good. However, many of the large global data sets behind AI are full of the inequalities, incumbencies and biases of the innovators designing it, which have a direct impact on how the technology guides human information, perception and action. As AI leads society towards the next phase of human evolution, it is becoming increasingly evident that we need to deepen our knowledge of AI ethics and reflect on the future world we want to create; otherwise, we will be creating AI models that are sub-optimally aligned with our values. Can we create an intelligence that is unconstrained by the limitations and prejudices of its creators, so that AI serves all of humanity, or will it become the latest and most powerful tool for perpetuating and magnifying racism and inequality?


SMBs: How to find the right MSP for your cybersecurity needs

Outsourcing cybersecurity appears to be the wisest choice for most SMB owners. "Small- to medium-sized businesses are aware of the importance of IT security, but they don't always have the same resources or technical ability to deal with them as larger enterprises do," says Adam Lloyd, president and CEO of North American MSP Pioneer Technology, in the Channel Futures article. "As a result, they expect their managed service provider (MSP) to act as a true security partner to point them in the right direction and ensure the technology they have in place will protect them and their data." Courchesne explains what to look for when determining which is the best MSP for providing cybersecurity services. The first step is to look at the service provider's strengths and weaknesses. "If providers work only with cloud services ('born in the cloud' MSPs) or look to speed deployment to new customers and easily manage all clients through a single console, they will work best with cybersecurity delivered as-a-service that can be overseen through a cloud-hosted console," he writes. Then there are service providers that have developed their own cybersecurity platform; this allows the provider to focus on customers who have a more complex IT infrastructure.


How to Transform Your Cybersecurity Posture

Traditionally, cybersecurity has been seen as the department that says “no.” Cyberfolks are known for insisting on extra testing, identifying last-minute vulnerabilities, and causing cost overruns and delays. However, this reputation isn’t altogether fair. Rather, it results from the fact that cyber experts are excluded from the early stages of a project. On the other hand, if you include these experts at the outset, design and development can be accomplished in a way that’s both more secure and more profitable. According to primary research from the Boston Consulting Group (BCG), whose cybersecurity practice I lead, such early involvement cuts the amount of rework by up to 62%. Such savings reduce not only development time and cost, but also time to market. What’s more, in gaining a seat at the table, cyber experts become pathfinders who shine a light on the quickest, most cost-effective, and most secure routes. They’re no longer curmudgeons who say “no,” but collaborators who are invested in getting to “yes” — and sooner rather than later. The Cloud - For companies in the midst of a cloud journey, the benefits of security by design are dramatic. Because so much of the infrastructure in cloud-based systems is created with software code, that “infrastructure as code” can be reused by hundreds of apps and checked continuously by automated “audit-robots.”
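As a sketch of what such an "audit-robot" might look like (a simplified assumption, not BCG's actual tooling; it checks a single misconfiguration), this scans a Terraform plan export for publicly readable S3 buckets:

```python
# Run `terraform show -json plan.out > plan.json` first, then point this
# script at plan.json; it flags S3 buckets with a public-read ACL.
import json
import sys

def find_public_buckets(plan):
    resources = (plan.get("planned_values", {})
                     .get("root_module", {})
                     .get("resources", []))
    return [r.get("address", "<unknown>")
            for r in resources
            if r.get("type") == "aws_s3_bucket"
            and r.get("values", {}).get("acl") == "public-read"]

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        plan = json.load(f)
    for address in find_public_buckets(plan):
        print(f"FAIL: {address} is publicly readable")
```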


7 Trends Influencing DevOps/DevSecOps Adoption

The shift from massive, inflexible systems that limit compatibility to concise, compatible software has substantially increased the adoption of DevOps and DevSecOps. With architectures such as containers becoming mainstream, it has become easier than ever for teams to code, debug, and deploy faster. Computerized, un-editable logging has made work transparent. The lightweight approach has freed projects for development on any platform, kept in sync via the internet. Endorsing microservice architecture yields systems that are easy to install, run, and maintain. ... Stemming off the microservices trend, mobile-first, cloud-first development has worked wonders for data transportation, security, and collaboration. On all grounds—efficiency, safety, transparency, collaboration—cloud-first adoption has made development perpetual, seamless, and efficient. In many ways, adopting the cloud-first architecture directly integrates a part of the DevOps work cycle into the company, making it promotive of DevOps/DevSecOps adoption in technology organizations. ... In IT, infrastructure is the foundation comprising the software, hardware, networking resources, systems, and tools that allow companies to operate and manage their production processes.


The evolution of digital banking post COVID-19

As more and more people and businesses rely on digital apps for their banking services, the number of online transactions continues to grow, putting a strain on existing IT computing resources. The massive increase in the number of queries is resulting in bottlenecks that can degrade the performance of applications and affect customer service levels. When customers wait too long to complete a transaction or receive approval for a loan, or if they understand that they can receive better conditions from another bank, they are more likely to switch. Thus, banks are faced with the need to scale up their expensive legacy infrastructure to provide the expected user quality of experience, or to find modern solutions that can elastically scale to manage this data at the required speeds, with an optimized TCO. In many cases, large financial services organizations are limited by tangled and archaic systems that are too complex to optimally manage, process and analyze their huge amounts of data from different sources. This was revealed recently in a BIAN survey in which over 60 percent of respondents expressed concerns that banks will struggle to open up their APIs because of the “current state of banks’ core architecture.”


Don’t Do Agile and DevOps by the Book

That’s the short version, and there’s a huge range of books and frameworks out there to read so that anyone, anywhere–apparently–can just start doing it. The danger is that if you follow them too closely, processes can actually become too rigid, so you end up losing the agility you’re striving for. I always get suspicious when theories in books are read and regurgitated wholesale without thinking about the actual situation on the ground. I’d much rather have a conversation, write up our notes, try it out and see how it can be improved. Clearly, I’m not saying that you shouldn’t have boundaries and rules. I worked for a company that moved from no processes at all to adopting Agile methodologies. It needed to put in place a framework to guide people in the right direction, particularly initially. As companies mature, though, they need to look at what works best for their particular situation–otherwise the danger is that common practice masks common sense. You end up following processes, such as two-weekly reviews, that don’t necessarily match your needs–why wait two weeks for a review, for example, if something obviously needs fixing today? Where did Agile go? The best place to start is to define Agile for your organization.


Europe has a unique opportunity to lead in the democratisation of artificial intelligence

The issue as such is less whether AI will be diffused and democratised than what the different scenarios for its potential diffusion will be; whether democratisation can work in favour of collective value creation or entrench existing market power; whether there will be empowering, enabling, and inclusive standards or extractive institutions and practices; whether democratisation can empower a new generation of firms and citizens or whether it will establish the second digital divide. This question compounds. Responsible democratisation means that human-centric and user-centric standards need to be broader, to consider what happens when a multitude of such standards interact with one another, when AI applications interact and compete inter-culturally and internationally. Indeed, there are no value-neutral AI applications. We cannot expect the divisions to be clear; rather, they will be murky, mixed between exceptionally novel solutions for public value and highly extractive institutional frameworks, with both corporate and government uses of such technologies. The focus should be to look beyond ethics, towards the political economy, which determines which ethical approaches will succeed or not.



Quote for the day:

“Knowledge has to be improved, challenged, and increased constantly, or it vanishes” -- Peter F. Drucker

Daily Tech Digest - January 01, 2021

The Financial Services Industry Is About To Feel The Multiplier Effect Of Emerging Technologies

Think about a world where retail banks could send cross-border payments directly to a counterparty without navigating through intermediaries. Instead, you could use a service dedicated to carrying out “Know Your Customer” processes on behalf of the financial services community. The same principle could apply for other transactions. Maybe a single, global fund transfer network is in our future, where any kind of transaction could flow autonomously while sharing only the minimum information necessary, maintaining the privacy of all other personal financial data. ... The technology now exists to massively increase computational power for a range of specific problems, such as simulation and machine learning, by trying all possibilities at once and linking events together. It’s more like the physical phenomena of nature versus the on-or-off switches of ordinary computer calculations. As a result, for instance, an investment bank may no longer have to choose between accuracy and speed when deciding how to allocate collateral across multiple trading desks. It could also give banks a more accurate way to determine how much capital to keep on hand to meet regulations.


The patching conundrum: When is good enough good enough?

Clearly some adjustment is needed on an unknown number of Windows machines. And therein lies the big problem with the Windows ecosystem: even though we have had Windows for years, it’s still a very vast and messy ecosystem of hardware vendors, multiple drivers, and software vendors that often build their solutions on something undocumented. Microsoft over the years has clamped down on this “wild west” approach and mandated certain developer requirements. It’s one of the main reasons I strongly recommend that if you want to be in the Insider program or install feature releases on the very first day they are released, you use Windows Defender as your antivirus, and not something from a third party. While Microsoft will often follow up with a fix for a patch problem, typically — unlike this issue — it is not released in the same fashion as the original update. Case in point: in November, Microsoft released an update that caused Kerberos authentication and ticket renewal issues. Later that month, on Nov. 19, it released an out-of-band update for the issue. The update was not released to the Windows Update release channel, nor on the Windows Server Update Services release channel; instead, IT administrators had to manually seek it out and download it or insert it into their WSUS servers.


Building a SQL Database Audit System using Kafka, MongoDB and Maxwell's Daemon

Compliance and auditing: Auditors need the data in a meaningful and contextual manner from their perspective. DB audit logs are suitable for DBA teams but not for auditors. The ability to generate critical alerts in case of a security breach is a basic requirement of any large-scale software. Audit logs can be used for this purpose. You must be able to answer a variety of questions, such as who accessed the data, what was the earlier state of the data, what was modified when it was updated, and whether internal users are abusing their privileges. It’s important to note that since audit trails help identify infiltrators, they promote deterrence among "insiders." People who know their actions are scrutinized are less likely to access unauthorized databases or tamper with specific data. All kinds of industries - from finance and energy to foodservice and public works - need to analyze data access and produce detailed reports regularly for various government agencies. Consider the Health Insurance Portability and Accountability Act (HIPAA) regulations. HIPAA requires that healthcare providers deliver audit trails about anyone and everyone who touches any data in their records.
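As a rough sketch of the plumbing the title describes (the topic name is Maxwell's default, the field names follow Maxwell's documented JSON event format, and the connection details are assumptions), a small consumer can turn raw change events into the contextual audit trail auditors need:

```python
# Consume MySQL change events that Maxwell's Daemon streams into Kafka and
# persist them to MongoDB as an audit trail recording who-changed-what.
# Requires: pip install kafka-python pymongo
import json

from kafka import KafkaConsumer
from pymongo import MongoClient

consumer = KafkaConsumer(
    "maxwell",                                  # Maxwell's default topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
audit = MongoClient("mongodb://localhost:27017")["audit"]["trail"]

for msg in consumer:
    event = msg.value
    # Maxwell events carry the table, the operation type (insert/update/
    # delete), the new row values in "data", and, for updates, the prior
    # values in "old" (answering "what was the earlier state of the data").
    audit.insert_one({
        "table": event.get("table"),
        "operation": event.get("type"),
        "at": event.get("ts"),
        "new_state": event.get("data"),
        "prior_state": event.get("old"),
    })
```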


How Skillate leverages deep learning to make hiring intelligent

Skillate can work both as a standalone ATS that takes care of the end-to-end recruitment needs of your organization and as an intelligent system that integrates with your existing ATS to make your recruitment easy, fast, and transparent. It does this by banking on cutting-edge technology and the power of AI, integrating with existing platforms such as traditional ATSs like Workday, SuccessFactors, etc. to solve some real pain points of the industry. However, for AI to work in a complex industry like recruitment, we need to consider the human element involved. Take for instance the words Skillate and Skillate.com — both these words refer to the same company but will be treated as different words by a machine. Moreover, every day new company and institute names come up, and thus it is almost impossible to keep the software’s vocabulary updated. To illustrate further, consider the following two statements: 'Currently working as a Data Scientist at <Amazon>’ and ‘Worked on a project for the client Amazon.’ In the first statement, “Amazon” will be tagged as a company, as the statement is about working in the organization. But in the latter, “Amazon” should be treated as a normal word and not as a company. Hence the same word can have different meanings based on its usage.
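The same-word, different-entity problem is easy to reproduce with an off-the-shelf NER model (spaCy here, standing in for Skillate's proprietary models; exact output varies by model version):

```python
# Requires: pip install spacy && python -m spacy download en_core_web_sm
# The same surface word "Amazon" should be tagged ORG in one context and
# ideally not in the other; context decides, not the word itself.
import spacy

nlp = spacy.load("en_core_web_sm")

for text in [
    "Currently working as a Data Scientist at Amazon.",
    "Worked on a project for the client Amazon.",
]:
    doc = nlp(text)
    print(text, "->", [(ent.text, ent.label_) for ent in doc.ents])
```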


How to Build Cyber Resilience in a Dangerous Atmosphere

The first step to achieving cyber resilience is a fundamental paradigm shift: expect to be breached, and expect it to happen sooner rather than later. You are not "too small to be of interest," what you do is not "irrelevant for an attacker," and it doesn't matter that there is a "bigger fish in the pond to go after." Your business is interconnected with all the others; it will happen to you. Embrace the shift. Step away from a one-size-fits-all cybersecurity approach. Ask yourself: Which parts of the business and which processes generate substantial value? Which must continue working, even while suffering an attack, for you to stay in business? Make plans to provide adequate protection, but also plan how to stay operational if the digital assets in your critical processes become unavailable. Know your most important assets, and share this information among stakeholders. If your security admin discovers a vulnerability on a server with IP address 172.32.100.100 but doesn't know the value of that asset within your business processes, how can IT security properly communicate the threat? Would a department head fully understand the implications of a remote code execution (RCE) attack on that system?


A New Product Aims To Disrupt Free Credit Scores With Blockchain Technology

The foundation of Zoracles Protocol, and what differentiates the project from other decentralized finance projects, is its use of cutting-edge privacy technologies centered around zero-knowledge proofs. Those familiar with these privacy-preserving techniques were most likely introduced to them by the team at Electric Coin Company, who are responsible for the zero-knowledge proofs developed for the privacy cryptocurrency Zcash. Zoracles will build zk-SNARKs that are activated when pulling consumer credit scores, hiding their values as they are brought onto the blockchain. This is accomplished with a verification proof derived from the ZoKrates toolbox. Keeping the data confidential is critical to giving users the confidence to make their data available on-chain; it can be compared to the way HTTPS (SSL) made it safe to transmit credit card data and allowed eCommerce to flourish. A very interesting long-term goal of Zora.cc is to eventually use credit score verification to prove identity. The implications for the usefulness of the protocol are enormous if it can become the market leader in decentralized identity. The team is focused on building the underlying API infrastructure as well as a front-end user experience. If executed successfully, the offering would be very similar to Twilio's: a "Platform as a Service" that could pair well with Zoracles' "Snarks as a Service." One should watch this project closely.
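
For intuition only, here is a toy Python sketch of the hiding half of the idea: a hash commitment keeps the raw score off-chain while still binding the holder to it. This is emphatically not a zk-SNARK. A real SNARK, such as one compiled with ZoKrates, would prove a statement like "score >= 650" without the reveal step this toy version requires, and every name below is hypothetical.

```python
# Toy hash commitment illustrating the "hide the value on-chain" idea.
# NOT a zk-SNARK: opening the commitment reveals the score, whereas a
# SNARK would prove "score >= threshold" with no reveal at all.
import hashlib
import secrets

def commit(score: int) -> tuple[bytes, bytes]:
    """Return (commitment, blinding nonce) for a credit score."""
    nonce = secrets.token_bytes(32)  # blinding keeps equal scores unlinkable
    digest = hashlib.sha256(nonce + score.to_bytes(2, "big")).digest()
    return digest, nonce

def open_commitment(commitment: bytes, nonce: bytes, claimed: int) -> bool:
    """Verify that the commitment was made to the claimed score."""
    expected = hashlib.sha256(nonce + claimed.to_bytes(2, "big")).digest()
    return secrets.compare_digest(commitment, expected)

# Only the digest would go on-chain; the score itself stays private.
commitment, nonce = commit(720)
assert open_commitment(commitment, nonce, 720)
assert not open_commitment(commitment, nonce, 650)
```

A production system like the one described would replace the reveal-based check with a ZoKrates-generated proof that the hidden score satisfies the lender's threshold.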


Refactoring is a Development Technique, Not a Project

One of the more puzzling misconceptions that I hear pertains to the topic of refactoring. I consult on a lot of legacy rescue efforts that will need to involve refactoring, and people in and around those efforts tend to think of "refactor" as "massive cleanup effort." I suspect this is one of those conflations that happens subconsciously. If you actually asked some of these folks whether "refactor" and "massive cleanup effort" were synonyms, they would say no, but they never conceive of the terms in any other way during their day-to-day activities. Let's be clear. Here is the actual definition of refactoring, per Wikipedia: code refactoring is the process of restructuring existing computer code, changing the factoring, without changing its external behavior. Significantly, this definition mentions nothing about the scope of the effort. Refactoring is changing the code without changing the application's behavior. This means the following would all be examples of refactoring, provided they changed nothing about the way the system interacts with external forces: renaming variables in a single method; adding whitespace to a class for readability; eliminating dead code; deleting code that has been commented out; and breaking a large method apart into a few smaller ones.
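
As a concrete, hypothetical illustration (not code from the article), both of the changes below are refactorings in this sense, no matter how small: the external behavior is identical before and after.

```python
# Before: works, but names and structure obscure intent.
def calc(l):
    t = 0
    for x in l:
        if x > 0:
            t += x
    return t

# After: the same external behavior, restructured for readability.
# Renaming variables and simplifying the loop are both refactorings,
# regardless of how small the change is.
def sum_of_positives(numbers):
    return sum(n for n in numbers if n > 0)

assert calc([1, -2, 3]) == sum_of_positives([1, -2, 3]) == 4
```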


Automation nation: 9 robotics predictions for 2021

"Autonomous robots took on more expansive roles in stores and warehouses during the pandemic," says Rowland, "which is expected to gain momentum in 2021. Data-collecting robots shared real-time inventory updates and accurate product location data with mobile shopping apps, online order pickers and curbside pickup services along with in-store shoppers and employees." That's especially key in large retail environments, with hundreds of thousands of items, where the ability to pinpoint products is a major productivity booster. Walmart recently cut its contract with robotic shelf scanning company Bossa Nova, but Rowland believes the future is bright for the technology category. Heretofore, automation solutions have largely been task-specific. That could be a thing of the past, according to Rowland. "Autonomous robots can easily handle different duties, often referred to as 'payloads,' which are programmed to address varying requirements, including but not limited to, inventory management, hazard detection, security checks, surface disinfectants, etc. In the future, retailers will have increased options for mixing/matching automated workflows to meet specific operational needs." Remember running out of toilet paper? So do retailers and manufacturers, and it was a major wake up call.


Data for development: Revisiting the non-personal data governance framework

The framework needs to be reimagined from multiple perspectives. From the ground up, people, both individuals and communities, must control their data; it should not just be considered a resource to fuel "innovation." More specifically, data sharing of any sort needs to be anchored in individual data protection and privacy. The purpose of data sharing must be clear from the outset, and data should only be collected to answer clear, pre-defined questions. Further, individuals must be able to consent dynamically to the collection and use of their data, and to grant and withdraw consent as needed. At the moment, the role of the individual is limited to consenting to the anonymisation of their personal data, which is seen as a sufficient condition for subsequent data sharing without consent. Collectives have a significant role to play in negotiating better rights in the data economy. Bottom-up instruments such as data cooperatives, unions, and trusts that allow individual users to pool their data rights must be actively encouraged. There is also a need to create provisions for collectives, such as employees, public transport users, and social media networks, to sign on to these instruments and enable collective bargaining on data rights.


3 things you need to know as an experienced software engineer

When we are in a coding competition where the clock is ticking, all we care about is efficiency. We will use variable names such as a, b, c, or index names such as j, k, l. Paying less attention to naming saves a lot of time, and we will probably throw the code away right after the submission passes all the test sets. This is called "throw-away code": it is short and, as the name suggests, won't be kept for long. In a real-life software engineering project, however, our code will likely be reused and modified, and the person doing that may be someone other than ourselves, or ourselves after six months of working on a different module. ... Readability is so important that we sometimes even sacrifice efficiency for it. We might choose the less readable but extremely efficient lines of code when working on projects that must be optimized down to a few CPU cycles and limited memory, such as a control system running on a microprocessor. In many real-life scenarios, however, we care much less about that millisecond difference on a modern computer, and writing more readable code causes much less trouble for our teammates.
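
As a hypothetical before-and-after (not from the article), the same logic reads very differently with competition-style names versus descriptive ones:

```python
# Competition-style "throw-away" code: fast to type, hard to revisit.
def f(a, b):
    c = 0
    for j in range(len(a)):
        if a[j] >= b:
            c += 1
    return c

# The same logic, written for the teammate who inherits it in six months.
def count_scores_at_or_above(scores, passing_grade):
    """Count how many scores meet or exceed the passing grade."""
    return sum(1 for score in scores if score >= passing_grade)

assert f([55, 70, 90], 60) == count_scores_at_or_above([55, 70, 90], 60) == 2
```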



Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson