
Daily Tech Digest - September 27, 2023

CISOs are struggling to get cybersecurity budgets: Report

"Across industries, the decline in budget growth was most prominent in tech firms, which dropped from 30% to 5% growth YoY," IANS said in a report on the study. "More than a third of organizations froze or cut their cybersecurity budgets." Budget growth was the lowest in sectors that are relatively mature in cybersecurity, such as retail, tech, finance, and healthcare, added the report. ... Of the CISOs whose companies did increase cybersecurity budgets, 80% indicated extreme circumstances, such as a security incident or a major industry disruption, drove the budget increase. While companies impacted by a cybersecurity breach added 18% to their budget on average, other industry disruptions contributed to a 27% budget boost. "I think there has always been a component of security spending that is forced to be reactive: be it incidents, updated regulatory or vendor controls or shifting business priorities," Steffen said. "To some degree, technology spending in general has always been like this, and will always likely be this way."


Lifelong Machine Learning: Machines Teaching Other Machines

Lifelong learning is a relatively new field in machine learning, where AI agents learn continually as they come across new tasks. The goal of LL is for agents to acquire knowledge of novel tasks without forgetting how to perform previous ones. This approach is different from the typical “train-then-deploy” machine learning, where agents cannot learn progressively without “catastrophic interference” (also called catastrophic forgetting) happening in future tasks, where the AI abruptly and drastically forgets previously learned information upon learning new information. According to the team, their work represents a potentially new direction in the field of lifelong machine learning, as current work in LL involves getting a single AI agent to learn tasks one step at a time in a sequential way. In contrast, SKILL involves a multitude of AI agents all learning at the same time in a parallel way, thus significantly accelerating the learning process. The team’s findings demonstrate that when SKILL is used, the time required to learn all 102 tasks is reduced by a factor of 101.5.


Is Your Organization Vulnerable to Shadow AI?

Perhaps the biggest danger associated with unaddressed shadow AI is that sensitive enterprise data could fall into the wrong hands. This poses a significant risk to privacy and confidentiality, cautions Larry Kinkaid, a consulting manager at BARR Advisory, a cybersecurity and compliance solutions provider. “The data could be used to train AI models that are commingled, or worse, public, giving bad actors access to sensitive information that could be used to compromise your company’s network or services.” There could also be serious financial repercussions if the data is subject to legal, statutory, or regulatory protections, he adds. Organizations dedicated to responsible AI deployment and use follow strong, explainable, ethical, and auditable practices, Zoldi says. “Together, such practices form the basis for a responsible AI governance framework.” Shadow AI occurs out of sight and beyond AI governance guardrails. When used to make decisions or impact business processes, it usually doesn’t meet even basic governance standards. “Such AI is ungoverned, which could make its use unethical, unstable, and unsafe, creating unknown risks,” he warns.


Been there, doing that: How corporate and investment banks are tackling gen AI

In new product development, banks are using gen AI to accelerate software delivery using so-called code assistants. These tools can help with code translation (for example, .NET to Java), and bug detection and repair. They can also improve legacy code, rewriting it to make it more readable and testable; they can also document the results. Plenty of financial institutions could benefit. Exchanges and information providers, payments companies, and hedge funds regularly release code; in our experience, these heavy users could cut time to market in half for many code releases. For many banks that have long been pondering an overhaul of their technology stack, the new speed and productivity afforded by gen AI means the economics have changed. Consider securities services, where low margins have meant that legacy technology has been more neglected than loved; now, tech stack upgrades could be in the cards. Even in critical domains such as clearing systems, gen AI could yield significant reductions in time and rework efforts.


Microsoft’s data centers are going nuclear

The software giant is already working with at least one third-party nuclear energy provider in an effort to reduce its carbon footprint. The ad, though, signals an effort to make nuclear energy an important part of its energy strategy. The posting said that the new nuclear expert “will maintain a clear and adaptable roadmap for the technology’s integration,” and have “experience in the energy industry and a deep understanding of nuclear technologies and regulatory affairs.” Microsoft has made no public statement on the specific goals of its nuclear energy program, but the obvious possibility — particularly in the wake of its third-party nuclear energy deal — is a concern for environmental issues. Although nuclear power has long been plagued by serious concerns about its safety and role in nuclear weapons proliferation, the rapidly worsening climate situation makes it a comparatively attractive alternative to fossil fuels, given the relatively large amount of energy that can be generated without producing atmospheric emissions.


The pitfalls of neglecting security ownership at the design stage

Without clear ownership of security during the design stage, many problems can quickly arise. Security should never be an afterthought, or a ‘bolted on’ mechanism after a product is created. Development teams primarily focus on creating functional and efficient software and hardware, whereas security teams specialize in identifying and mitigating potential risks. Without collaboration, or more ideally integration, between the two, security may be overlooked or not adequately addressed, leaving a heightened risk of cyber vulnerabilities. A good example is a privacy shutter for cameras in laptop computers. Ever see a sticky note on someone’s PC covering the camera? A design team may focus on the quality and placement of the camera as primary factors for the user experience. However, security professionals know that many users want a physical solution to guarantee the camera cannot capture images when they don’t want it to, and on/off indicator lights are not good enough.


Enterprise Architecture Must Adapt for Continuous Business Change

Continuous business change is an agile enterprise mindset that begins with the realization that change is constant and that business needs to be organized to support this continual change. This change is delivered as a constant flow of activity directed by distributed teams and democratized processes. It is orchestrated by the transparency of information and includes automated monitoring and workflows. This continuous business change requires EA, as a discipline, to evolve to match the new mindset. Change processes need to be adapted and updated to deliver faster time to value and quicker iteration of business ideas. These adaptations require the democratization of design, away from a traditional centralized approach, to allow for a quicker and more efficient change process. These change processes recognize autonomous business areas that deliver their own change. One example of this is moving away from being project-focused to being product-focused. Product-based companies organize their teams around autonomous products which may also be known as value streams or bounded domains.


A history of online payment security

Google was the first site to use two-factor authentication. They made it so that those requesting access were required to have not only a password, but access to the phone number used when creating the account. Since then, many companies have taken this system to the next level by providing their users with a multitude of ways to ensure the security of their online payments. They have implemented multiple ways to ensure the safety of their clients’ transactions, including password security, a six-digit PIN, account security tokens and SMS validation. Other than a DNA match, you can’t get much more verified than this. Privacy and confidentiality of information, especially when it concerns financial data, is critical to customer satisfaction. There are millions of financial transactions done online on a daily basis involving payments to online shopping websites or merchant stores, bill payments or bank transactions. Security of cashless transactions done on a virtual platform requires an element of bankability and trust that can only be generated by the best and most reputable brands and leaders in the industry.
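The layered verification described above — password, PIN, one-time SMS code — can be sketched as a set of independent checks that must all pass before a transaction clears. The following is a minimal, hypothetical illustration; the function and field names are invented for this sketch and do not reflect any vendor's actual API:

```python
import hashlib
import hmac

def hash_secret(secret: str, salt: str = "demo-salt") -> str:
    """Derive a comparison hash so raw secrets are never stored."""
    return hashlib.sha256((salt + secret).encode()).hexdigest()

# Hypothetical stored credentials for one account.
STORED = {
    "password_hash": hash_secret("correct horse battery staple"),
    "pin_hash": hash_secret("493021"),
}

def verify_factors(password: str, pin: str, sms_code: str, expected_sms: str) -> bool:
    """Approve only if every factor passes its own independent check."""
    checks = [
        hmac.compare_digest(hash_secret(password), STORED["password_hash"]),
        hmac.compare_digest(hash_secret(pin), STORED["pin_hash"]),
        # One-time code sent to the phone number on the account.
        hmac.compare_digest(sms_code, expected_sms),
    ]
    return all(checks)

print(verify_factors("correct horse battery staple", "493021", "774201", "774201"))  # True
print(verify_factors("correct horse battery staple", "000000", "774201", "774201"))  # False
```

The point of the structure is that compromising any single factor (a leaked password, a stolen phone) is not enough on its own.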


Rediscovering the value of information

In the corporate sector, the value destroyed by poor information management practices is often measured in fines and lawsuit payouts. But before such catastrophes come to light, what metrics do we use — or should we use — to determine whether a publicly traded company has its information management house in order? Who manages information more effectively — P&G or Unilever; Coke or Pepsi; GM or Ford; McDonald’s or Chipotle; Marriott or Hilton? When interviewing a potential new hire, how should we ascertain whether they are a skilled and responsible information manager? Business historians tell us that it was about 10 years before the turn of the century that “information” — previously thought to be a universal “good thing” — started being perceived as a problem. About 20 years after the invention of the personal computer, the general population started to feel overwhelmed by the amount of information being generated. We thrive on information, we depend on information, and yet we can also choke on it. We have available to us more information than one person could ever hope to process.


Software Delivery Enablement, Not Developer Productivity

Software delivery enablement and 2023’s trend of platform engineering won’t succeed by focusing solely on people and technology. At most companies, processes need an overhaul too. A team has “either a domain that they’re working in or they have a piece of functionality that they have to deliver,” she said. “Are they working together to deliver that thing? And, if not, what do we have to do to improve that?” Developer enablement should be concentrated at the team outcome level, says Daugherty, which can be positively influenced by four key capabilities: continuous integration and continuous delivery (CI/CD); automation and Infrastructure as Code (IaC); integrated testing and security; and immediate feedback. “Accelerate,” the iconic, metrics-centric guide to DevOps and scaling high-performing teams, identifies decisions that are proven to help teams speed up delivery. One is that empowering teams to choose their own tools improves performance.



Quote for the day:

“Success is actually a short race - a sprint fueled by discipline just long enough for habit to kick in and take over.” -- Gary W. Keller

Daily Tech Digest - September 26, 2023

How to Future-Proof Your IT Organization

Effective future-proofing begins with strong leadership support and investments in essential technologies, such as the cloud and artificial intelligence (AI). Leaders should encourage an agile mindset across all business segments to improve processes and embrace potentially useful new technologies, says Bess Healy ... Important technology advancements frequently emerge from various expert ecosystems, utilizing the knowledge possessed by academic, entrepreneurial, and business startup organizations, Velasquez observes. “Successful IT leaders encourage team members to operate as active participants in these ecosystems, helping reveal where the business value really is while learning how new technology could play a role in their enterprises.” It’s important to educate both yourself and your teams on how technologies are evolving, says Chip Kleinheksel, a principal at business consultancy Deloitte. “Educating your organization about transformational changes while simultaneously upskilling for AI and other relevant technical skillsets, will arm team members with the correct resources and knowledge ahead of inevitable change.”


How one CSO secured his environment from generative AI risks

"We always try to stay ahead of things at Navan; it’s just the nature of our business. When the company decided to adopt this technology, as a security team we had to do a holistic risk assessment.... So I sat down with my leadership team to do that. The way my leadership team is structured is, I have a leader who runs product platform security, which is on the engineering side; then we have SecOps, which is a combination of enterprise security, DLP – detection and response; then there’s a governance, risk and compliance and trust function, and that’s responsible for risk management, compliance and all of that. "So, we sat down and did a risk assessment for every avenue of the application of this technology. ... "The way we do DLP here is it’s based on context. We don’t do blanket blocking. We always catch things and we run in it like an incident. It could be insider risk or external, then we involve legal and HR counterparts. This is part and parcel with running a security team. We’re here to identify threats and build protections against them."


Governor at Fed Cautiously Optimistic About Generative AI

The adverse impact of AI on jobs will only be borne by a small set of people, in contrast to the many workers throughout the economy who will benefit from it, she said. "When the world switched from horse-drawn transport to motor vehicles, jobs for stable hands disappeared, but jobs for auto mechanics took their place." And it goes beyond just creating and eliminating positions. Economists encourage a perception of work in terms of tasks, not jobs, Cook said. This will require humans to obtain skills to adapt themselves to the new world. "As firms rethink their product lines and how they produce their goods and services in response to technical change, the composition of the tasks that need to be performed changes. Here, the portfolio of skills that workers have to offer is crucial." AI's benefits to society will depend on how workers adapt their skills to the changing requirements, how well their companies retrain or redeploy them, and how policymakers support those that are hardest hit by these changes, she said.


6 IT rules worth breaking — and how to get away with it

Automation, particularly when incorporating artificial intelligence, presents many benefits, including enhanced productivity, efficiency, and cost savings. It should be, and usually is, a top IT priority. That is, unless an organization is dealing with a complex or novel task that requires a nuanced human touch, says Hamza Farooq, a startup founder and an adjunct professor at UCLA and Stanford. Breaking a blanket commitment to automation prioritization can be justified when tasks involve creative problem-solving, ethical considerations, or situations in which AI’s understanding of a particular activity or process may be limited. “For instance, handling delicate customer complaints that demand empathy and emotional intelligence might be better suited for human interaction,” Farooq says. While sidelining automation may, in some situations, lead to more ethical outcomes and improved customer satisfaction, there’s also a risk of hampering a key organization process. “Overreliance on manual intervention could impact scalability and efficiency in routine tasks,” Farooq warns, noting that it’s important to establish clear guidelines for identifying cases in which an automation process should be bypassed.


Introduction to Azure Infrastructure as Code

One of the core benefits of IaC is that it allows you to check infrastructure code files into source control, just like you would with software code. This means that you can version and manage your infrastructure code like any other codebase, which is important for ensuring consistency and enabling collaboration among team members. In early project work, IaC allows for quick iteration on potential configuration options through automated deployments instead of a manual "hunt and peck" approach. Templates can be parameterized to reuse code assets, making it easy to deploy repeatable environments such as dev, test and production. During the lifecycle of a system, IaC serves as an effective change-control mechanism. All changes to the infrastructure are first reflected in the code, which is then checked in as files in source control. The changes are then applied to each environment based on current CI/CD processes and pipelines, ensuring consistency and reducing the risk of human error.
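The parameterization idea above — one template, many environments — can be sketched in a few lines. In Azure this would typically be an ARM or Bicep template with a parameters file; the Python below just illustrates the concept, and all resource names, SKUs and fields are hypothetical:

```python
import json

# A single shared template; per-environment values are supplied as parameters.
TEMPLATE = {
    "resource_type": "app_service",
    "name_pattern": "myapp-{env}",
}

ENVIRONMENTS = {
    "dev":  {"sku": "B1",   "instance_count": 1},
    "test": {"sku": "S1",   "instance_count": 2},
    "prod": {"sku": "P1v3", "instance_count": 4},
}

def render(env: str) -> dict:
    """Fill the shared template with environment-specific parameters."""
    params = ENVIRONMENTS[env]
    return {
        "resource_type": TEMPLATE["resource_type"],
        "name": TEMPLATE["name_pattern"].format(env=env),
        "sku": params["sku"],
        "instance_count": params["instance_count"],
    }

# The same versioned template yields a repeatable definition per environment.
for env in ENVIRONMENTS:
    print(json.dumps(render(env)))
```

Because the template and the parameter sets both live in source control, the diff between dev and prod is explicit and reviewable rather than buried in portal click-history.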


National Cybersecurity Strategy: What Businesses Need to Know

Defending critical infrastructure, including systems and assets, is vital for national security, public safety, and economic prosperity. The NCS will standardize cybersecurity standards for critical infrastructure—for example, mandatory penetration tests and formal vulnerability scans—and make it easier to report cybersecurity incidents and breaches. ... Once the national infrastructure is protected and secured, the NCS will go bullish in efforts to neutralize threat actors that can compromise the cyber economy. This effort will rely upon global cooperation and intelligence-sharing to deal with rampant cybersecurity campaigns and lend support to businesses by using national resources to tactically disrupt adversaries. ... As the world’s largest economy, the U.S. has sufficient resources to lead the charge in future-proofing cybersecurity and driving confidence and resilience in the software sector. The goal is to make it possible for private firms to trust the ecosystem, build innovative systems, ensure minimal damage, and provide stability to the market during catastrophic events.


Preparing for the post-quantum cryptography environment today

"Post-quantum cryptography is about proactively developing and building capabilities to secure critical information and systems from being compromised through the use of quantum computers," Rob Joyce, Director of NSA Cybersecurity, writes in the guide. "The transition to a secured quantum computing era is a long-term intensive community effort that will require extensive collaboration between government and industry. The key is to be on this journey today and not wait until the last minute." This perfectly aligns with Baloo's thinking that now is the time to engage, and not to wait until it becomes an urgent situation. The guide notes how the first set of post-quantum cryptographic (PQC) standards will be released in early 2024 "to protect against future, potentially adversarial, cryptanalytically-relevant quantum computer (CRQC) capabilities. A CRQC would have the potential to break public-key systems (sometimes referred to as asymmetric cryptography) that are used to protect information systems today."


Future of payments technology

Embedded finance requires technology to build into products and services the capability to move money in certain circumstances, such as paying a toll on a motorway. The idea is to embed finance into the consumer journey so that consumers don’t have to actually pay, but are charged based on a contract or agreement made in advance. Consumers pay without consciously having to dig out their debit card. One example is Uber, where we widely use the service without having to make an actual payment upfront. This is sometimes referred to as “contextual payments” – where the context of the situation allows for payment to be frictionlessly executed. ... Artificial intelligence is already being used in payments to improve the customer journey and also how products are delivered. So far, this has been machine learning. Generative AI, where the AI itself is able to make decisions, will be the next generational jump and have a huge impact on payments, especially when it comes to protection against fraud. The problem is that artificial intelligence could be a positive or a negative, depending on who exploits it first, for good or ill.
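The contextual-payment flow described above — agree in advance, then let an event such as passing a toll gantry trigger the charge — can be sketched as an event handler checking a pre-authorized mandate. This is a purely illustrative sketch; the mandate fields and event shape are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Mandate:
    """A pre-agreed authorization to charge the customer, with a cap."""
    customer_id: str
    max_amount: float
    active: bool = True

def charge_on_event(mandate: Mandate, event: dict) -> dict:
    """Execute a frictionless charge if the pre-agreed mandate covers it."""
    if not mandate.active:
        return {"status": "declined", "reason": "no active mandate"}
    if event["amount"] > mandate.max_amount:
        return {"status": "declined", "reason": "exceeds agreed cap"}
    # In a real system this step would call the payment processor.
    return {"status": "charged", "amount": event["amount"]}

mandate = Mandate(customer_id="c42", max_amount=25.00)
print(charge_on_event(mandate, {"type": "toll", "amount": 3.50}))
# {'status': 'charged', 'amount': 3.5}
```

The consumer-facing consequence is exactly the one the article describes: the checkout step disappears, and the agreement plus the context does the work.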


Hiring revolutionised: Tackling skill demands with agile recruitment

Tech-enabled smart assessment frameworks not only provide scalability and objectivity in talent assessment but also help build a perception of fairness amongst candidates and internal stakeholders. L&T uses virtual assessments at the entry level, and Venkat believes in their tremendous scope for mid-level and leadership assessments too. Apurva shared that when infusing technology, many companies make the mistake of merely making things fancy without actually creating a winning EVP. The key to tech success is balancing personalised training with broader skill requirements. HR must develop a very good funnel by inculcating thought leadership around the quality of employees and must also focus on how these prospective employees absorb the culture of the organisation. This is a huge change exercise that entails identifying the skill gap, restructuring job responsibilities, mapping specific roles to specific skills, assessing a person’s personality traits, and offering a very personalised onboarding so that people are productive from day one.


Designing Databases for Distributed Systems

As the name suggests, this pattern proposes that each microservice manages its own data. This implies that no other microservice can directly access or manipulate the data managed by another microservice. Any exchange or manipulation of data can be done only by using a set of well-defined APIs. The figure below shows an example of a database-per-service pattern. At face value, this pattern seems quite simple. It can be implemented relatively easily when we are starting with a brand-new application. However, when we are migrating an existing monolithic application to a microservices architecture, the demarcation between services is not so clear. ... In the command query responsibility segregation (CQRS) pattern, an application listens to domain events from other microservices and updates a separate database for supporting views and queries. We can then serve complex aggregation queries from this separate database while optimizing the performance and scaling it up as needed.
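The CQRS flow described above — domain events from the write side projected into a separate, query-optimized read model — can be sketched in a few lines. The event names and in-memory stores here are stand-ins for a real event bus and databases, invented for this illustration:

```python
# Write side: owned by the "orders" service (database-per-service).
write_store = []
# Read side: a separate projection optimized for aggregation queries.
read_model = {}

def handle_command(order_id: str, amount: float) -> None:
    """Write side: record the fact and emit a domain event."""
    event = {"type": "OrderPlaced", "order_id": order_id, "amount": amount}
    write_store.append(event)
    project(event)  # in practice delivered asynchronously via a message broker

def project(event: dict) -> None:
    """Read side: fold each event into an aggregate view."""
    if event["type"] == "OrderPlaced":
        totals = read_model.setdefault("totals", {"count": 0, "sum": 0.0})
        totals["count"] += 1
        totals["sum"] += event["amount"]

handle_command("o-1", 10.0)
handle_command("o-2", 2.5)
# The aggregation query is served from the read model, not the write store.
print(read_model["totals"])  # {'count': 2, 'sum': 12.5}
```

The design choice this buys you is independence: the read model can live in a different database engine, be reindexed, or be scaled out without touching the service that owns the writes.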



Quote for the day:

"Wisdom equals knowledge plus courage. You have to not only know what to do and when to do it, but you have to also be brave enough to follow through." -- Jarod Kintz

Daily Tech Digest - February 19, 2022

CIO Strategy for Mergers & Acquisitions

The success of a merger between two organizations relies on multiple factors, such as economic certainty, accurate valuations, proper identification of targets, strong due diligence processes and technology integration. The most prominent factor among these, however, is technology integration, i.e. merging the two IT systems. The IT systems of each organization consist of a set of applications, IT infrastructure, databases, licenses, technologies and their complexities. After integration, one set of systems and its infrastructure becomes redundant. The greater the amount of duplication, the higher the redundancy, leading to an increase in the costs and complexity of an integration. The role of the CIO and Information Technology (IT) in M&A has become increasingly important, as the need for quick turnaround time is the primary factor. CIOs need to be involved during the deal preparation, assessment, and due diligence phases of M&A. In addition, the CIO’s team needs to identify the key IT processes, IT risks, costs and synergies of the organization.


Eight countries jointly propose principles for mutual recognition of digital IDs

There are 11 principles in total, all contained in a report [PDF] about digital identity in a COVID-19 environment, that the DIWG envisions would be used by all governments when building digital identity frameworks. The principles are openness, transparency, reusability, user-centricity, inclusion and accessibility, multilingualism, security and privacy, technology neutrality and data portability, administrative simplicity, preservation of information, and effectiveness and efficiency. According to the DIWG, the principles aim to allow for a common understanding to guide future discussions on both mutual recognition and interoperability of digital identities and infrastructure. In providing the principles, the DIWG noted that mutual recognition and interoperability of digital identities between countries is still several years away, with the group saying there are foundational activities that need to be undertaken before it can be achieved. These foundational activities include creating a definition of a common language and definitions across digital identities, assessing and aligning respective legal and policy frameworks, and creating interoperable technical models and infrastructure.


Joel Spolsky on Structuring the Web with the Block Protocol

The Block Protocol is not the first attempt, however, at bringing structure to data presented on the web. The problem, says Spolsky, is that previous attempts — such as Schema.org or Dublin Core — have included that structure as an afterthought, as homework that could be left undone without any consequence to the creator. At the same time, the primary benefit of doing that homework was often to game search engine optimization (SEO) algorithms, rather than to provide structured data to the web at large. Search engines quickly caught on to that and began ignoring the content entirely, which led to web content creators abandoning these attempts at structure. Spolsky said this led them to ask one simple question: “What’s a way we can make it so that the web can be better structured, in a way that’s actually easier to write for a web developer than if they [had] left out the structure in the first place?” ... The basic building blocks of the web — HTML and CSS — describe content and how it should be displayed in a human-readable format, “but it doesn’t describe anything about that type of data or what the data is or what it does,” said Spolsky. 


Avoiding the Achilles Heel of Non-European Cybersecurity

US-based organizations are beholden to regulations such as the CLOUD Act and the US PATRIOT Act, which pose a risk to data belonging to any other region. Any application or solution built in the US — be it concerned with cybersecurity, hosting or collaboration — is required to have a backdoor built in, allowing third parties to access the data within, often without the owner ever knowing — particularly if they’re foreign. Moreover, on his last full day in office and following the large-scale SolarWinds attack, former President Trump signed an executive order decreeing that American IaaS cloud providers must keep a wealth of sensitive information on their foreign clients — names, physical and email addresses, national identification numbers, sources of payment, phone numbers and IP addresses — in order to help US authorities track down cyber-criminals. As these services include “destination” cloud networks, such as AWS, Microsoft Azure, and Google Cloud, it impacts many citizens and companies worldwide. 


5 Questions for Evaluating DBMS Features and Capabilities

Among RDBMSs, both SQL Server and Snowflake use a kind of umbrella data type, VARIANT, to store data of virtually any type. The labor-saving dimension of typing is much less important here. For example, in the case of the VARIANT type, the database must usually be told what to do with this data. The emphasis in this definition of data type goes to the issue of convenience: BLOB and similar types are primarily useful as a means to store data in the RDBMS irrespective of the data’s structure. Google Cloud’s implementation of a JSON “data type” in BigQuery ticks both these boxes. First, it is labor-saving, in that BigQuery knows what to do with JSON data according to its type. Second, it is convenient, in that it gives customers a means to preserve and perform operations on data serialized in JSON objects. The implementation permits an organization to ingest JSON-formatted messages into the RDBMS (BigQuery) and to preserve them intact. Access to raw JSON data could be valuable for future use cases. It also makes it much easier for users to access and manipulate this data.
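The distinction the article draws — an opaque BLOB-style value the engine merely stores versus a JSON-aware type the engine can operate on — can be illustrated with Python stand-ins. This is not BigQuery syntax; it is a sketch of why a typed JSON column is both labor-saving and convenient:

```python
import json

raw_message = '{"user": "ada", "items": [1, 2, 3]}'

# BLOB-style storage: the database preserves the bytes intact but
# cannot answer questions about their structure without outside help.
blob_column = raw_message.encode()

# JSON-typed storage: the value is understood as structured data, so
# the engine can extract fields and traverse nested arrays directly.
json_column = json.loads(raw_message)

print(json_column["user"])        # field extraction
print(len(json_column["items"]))  # operating on nested structure
```

In BigQuery the analogous operations are field access and functions over the JSON type, while the original message remains preserved intact for future use cases, as the article notes.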


Digital payments: How banks can stave off fintech challengers

To safeguard their payments business, banks must pursue two main objectives: replace their existing legacy systems and improve the payment services and functionality they offer to retail and corporate customers. In this way, banks can ensure that their provision of payment services remains intact. Some banks have tried to solve this problem by acquiring a fintech challenger. Others have sought to build their own technology from scratch – although this has been shown to carry risks. However, one of the best options for banks is to find new partners, both in terms of technology and services, which they can work with to create a more loosely defined infrastructure for payment services. This, in turn, will help them become more agile in the payments sphere, according to Frank. “Banks like JP Morgan are a standard bearer here and commit huge sums to tech investment annually,” says Frank. “The key is to target a more agile tech stack both in terms of infrastructure – that is in terms of cloud adoption, enhanced security, devices and networks, as well as applications – whether it is delivered as a Software-as-a-Service (SaaS) or a white-labelled service.”


Cloud Data Management Disrupts Storage Silos and Team Silos Too

In the context of enterprise data storage, unstructured data management has been a practice for many years, although it originated in storage vendor platforms. Now that enterprises are using many different storage technologies — block storage for database and virtualization, NAS for user and application workloads, backup solutions in the data center or in the cloud — a storage-centric approach to data management no longer fits the bill. That’s because, among other reasons, storage vendor data management solutions don’t solve the problem of managing silos of data stored on different platforms. Silos hamper visibility and governance, leading to higher costs and poor utilization. As more workloads and data move to the cloud to save money and enable flexibility and innovation, cloud data management has become a growing practice. Cloud data management (CDM) goes beyond storage to meet the ever-changing needs for data mobility and access, cost management, security and, increasingly, data monetization. 


Executive Q&A: Data Management and the Cloud

Understanding which type of cloud database is the right fit is often the biggest challenge. It’s helpful to think of cloud-native databases as being in one of two categories: platform-native systems (i.e., offerings by cloud providers themselves) or in-cloud systems offered by third-party vendors. Platform-native solutions include Azure Synapse, BigQuery, and Redshift. They offer deep integration with the provider’s cloud. Because they are highly optimized for their target infrastructure, they offer seamless and immediate interoperability with other native services. Platform-native systems are a great choice for enterprises that want to go all-in on a given cloud and are looking for simplicity of deployment and interoperability. In addition, these systems offer the considerable advantage of having to deal with a single vendor only. In contrast, in-cloud systems tout cloud independence. This seems like a great advantage at first. However, moving hundreds of terabytes between clouds has its own challenges. In addition, customers inevitably end up using other platform-native services that are only available on a given cloud, which further reduces the perceived advantage of cloud independence.


The metaverse is a new word for an old idea

These are good conversations to have. But we would be remiss if we didn’t take a step back to ask, not what the metaverse is or who will make it, but where it comes from—both in a literal sense and also in the ideas it embodies. Who invented it, if it was indeed invented? And what about earlier constructed, imagined, augmented, or virtual worlds? What can they tell us about how to enact the metaverse now, about its perils and its possibilities? There is an easy seductiveness to stories that cast a technology as brand-new, or at the very least that don’t belabor long, complicated histories. Seen this way, the future is a space of reinvention and possibility, rather than something intimately connected to our present and our past. But histories are more than just backstories. They are backbones and blueprints and maps to territories that have already been traversed. Knowing the history of a technology, or the ideas it embodies, can provide better questions, reveal potential pitfalls and lessons already learned, and open a window onto the lives of those who learned them. 


Slow Down!! Cloud is Not for Everyone

“Most often it’s not the main course but the desserts that bloat your bill.” In the cloud, it’s not only the cost of compute and memory, but the cost of lock-in. Assume you have an on-prem license for a database enterprise edition that can’t be ported to the cloud (because of incompatibility, contractual complications, or much higher cloud license costs) and you opt to move to a native database offered by your chosen cloud provider. What might appear to be a straightforward migration effort is actually a much deeper trap that locks you in with your cloud vendor. As the first step, you need to train your workforce; then, slowly, you will be forced to rewrite or replace all the homegrown and/or SaaS features of your product to be compatible with the new service. These efforts were never part of your earlier plan but have now become a critical necessity just to keep the lights on. Then, after a certain period, when you realize the cloud service is not a great fit and decide to shift back or move on to a better alternative, the insidious lock-in effect appears: moving onward is made particularly difficult, and you need to burn significant dollars to migrate out.
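The exit economics described above can be made concrete with a back-of-the-envelope sketch. The egress rate, rework hours and hourly rate below are purely illustrative assumptions, not any provider’s actual pricing:

```python
# Back-of-the-envelope exit-cost estimate for leaving a cloud-native database.
# All rates here are illustrative assumptions, not any provider's actual pricing.

def exit_cost_usd(data_tb: float,
                  egress_per_gb: float = 0.09,   # assumed data-egress rate, $/GB
                  rewrite_hours: float = 2000,   # assumed effort to re-port rewritten features
                  hourly_rate: float = 120) -> float:
    """Rough cost to move the data out plus re-port code written for the native service."""
    egress = data_tb * 1024 * egress_per_gb
    rework = rewrite_hours * hourly_rate
    return egress + rework

# Moving 500 TB out: the egress fee is sizeable, but the code rework dwarfs it.
print(round(exit_cost_usd(500)))
```

The point of the sketch is that the lock-in cost is dominated by the engineering effort to undo service-specific rewrites, not by the data transfer itself.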



Quote for the day:

"When people talk, listen completely. Most people never listen." -- Ernest Hemingway

Daily Tech Digest - July 19, 2020

Investor predictions on which tech startups will survive over the next 5 years

SaaS and cloud seem like the best place to be in business. We’ve already seen extreme winners and losers. Lockdown winners include video conferencing tools like Zoom, e-commerce platforms like Shopify, professional collaboration tools like Miro and cloud computing companies like Datadog. Lockdown losers are tied to the offline world: Marriott, United Airlines and Nordstrom, to name a few. Rory O’Driscoll from Scale Venture Partners gave a data-driven keynote which hit us with the stat that ‘the cloud is already 50% of the software market in 2020’ and it will grow until it eventually dominates the enterprise market completely. ... This brings us to the bad news: SaaS and cloud is the place to be — unless you’re an underperforming startup. Because of the prevalence of cloud platforms, new entrants will need to really stand out. Underperforming startups will be weeded out, creating acquisition opportunities for stronger players. Huge companies with cloud-based solutions like Microsoft, IBM and AWS already dominate the market, leaving little demand for new startups to exploit. Another of our investor speakers, David Skok from Matrix Partners, made this point. He said that if you are underperforming, it will be harder than ever to get funding.


How the Artificial Intelligence and Robotics industry is going to surge post-COVID-19

Technology has become an active part of our lives. The pandemic only quickened the pace of its growth and reach. The advancement of technology has accelerated rapidly. AI now has an influence on every industry and individual. Going forward, organisations across industries will need to acquire the skills and competency to begin their AI journey, as that is the only way forward. Soon it will be accessible to everyone, and we are set to see a world run by robots that makes people’s lives more convenient and safer. AI has played a hand in bringing people together the world over. A world that was barely connected through social media is now brought closer through video conferencing, even in professional environments. Everything is online, meaning everything is at least partially, if not completely, influenced by AI. However, one concern that could arise with this technology boom is the lack of personal interaction, which is already the world we see now. While this is a valid concern, automation is said to help and enhance the capabilities of the common man. The best option would be to transform the world into a “human-powered AI core” and, most importantly, create awareness and prepare people to absorb the impact of the inevitable change to our future.


OpenAI's GPT-3 may be the biggest thing since bitcoin

So there are lots of posts for GPT-3 to study and learn from. The forum also has many people I don’t like. I expect them to be disproportionately excited by the possibility of having a new poster that appears to be intelligent and relevant. I’ve been following the forum for years. There are many posts I know the answers to, so I could provide a quick response and measure how well GPT-3 does with comments similar to those I make. I posted about one interesting tech topic every day in May, alternating between using my own words and paraphrasing my previous post with GPT-3’s help. I didn’t take special care to make these GPT-3-enhanced posts blend in well. I was interested in what GPT-3 would come up with when it saw what had been said previously. The table below shows some results: My expectation was that, like PTB, GPT-3 would be mostly about the forum’s already existing memes and have trouble producing fresh ideas. This prediction seems to have been true. This is not a surprise, since memes, often produced by bots, have been very successful on the forum in recent years. Still, GPT-3 managed to repeatedly surprise me with its remarks, so I’m hoping there is a lot of room for improvement with this system and others like it.


Digital Payments and the Era of Cashless Travel

As cashless payment options become more popular in travel, is your organization ready to meet customer demands? At Aviation Festival Americas 2019, I saw firsthand that mobile payments are now in motion even during the flight itself. Payments for Wi-Fi, food, entertainment and more can be made during the journey from one’s seat through mobile devices, using tools such as a mobile wallet. A mobile wallet works without downloading an app and can push timely notifications that create upsell opportunities. Accordingly, airlines such as Delta are piloting free Wi-Fi to create a seamless mobile payment experience. Airlines can enable bidding and paying for flight upgrades ‘during’ the journey. Travelers’ mobile devices can be linked to the back-seat entertainment systems through Bluetooth. With these initial moves, airlines are now preparing to sell more experiences and ancillaries to the traveler during the journey to grow ancillary revenue. Virtual cards have opened a more secure door for corporate travel. A virtual card has a 16-digit number but no physical card. Payments made through virtual cards can be regulated by setting parameters. Any hotel, car or airline booking payment attempted outside these parameters will automatically be declined.
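The parameter-based declines described for virtual cards can be sketched as a simple rule check. The field names and limits below are hypothetical illustrations, not any card network’s actual API:

```python
# Hypothetical sketch of virtual-card authorization rules. The policy fields
# (amount ceiling, allowed merchant categories) are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class VirtualCardPolicy:
    max_amount: float                                      # per-transaction ceiling
    allowed_categories: set = field(default_factory=set)   # e.g. {"hotel", "airline"}

    def authorize(self, amount: float, category: str) -> bool:
        """Approve only payments within the parameters set on the card."""
        return amount <= self.max_amount and category in self.allowed_categories

# A card issued for a corporate trip: travel categories only, capped at $1,500.
policy = VirtualCardPolicy(max_amount=1500.0,
                           allowed_categories={"hotel", "airline", "car"})
print(policy.authorize(900.0, "hotel"))    # within parameters
print(policy.authorize(900.0, "retail"))   # outside the category list: declined
```

Because the decision is pure parameter matching, the issuer can decline out-of-policy bookings automatically, which is the behavior the excerpt describes.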


COVID-19 is driving IBM, IT industry to deliver faster ‘edge’ computing

As COVID-19 began to emerge as a threat in the United States and Europe, TBR analysts anticipated a potential acceleration in the use of telemedicine, brought on by doctors’ concern that they might infect their patients, an obstacle that seemed inconsequential prior to the pandemic. Since March, the predictions have proved true: telemedicine has quickly risen to the forefront, with healthcare workers becoming more efficient and avoiding the risk of overcrowded hospitals by urging more of their patients to wear devices that track basic vitals. Countries such as China have experimented with edge technology, deploying drones and robots and relying on their efficiency and accuracy to help identify and treat COVID-19 patients. In addition, while factories have always been one of the most compelling use cases for edge computing, this trend has accelerated given the population’s incredible reliance on e-commerce. In factories, edge technology not only enables efficiency and provides cost savings but also promotes safety, as sensors and devices can perform many of the tasks previously handled by people.
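The edge pattern underlying both examples (process readings locally, forward only what matters upstream) can be sketched minimally. The vital-sign ranges and record format here are illustrative assumptions:

```python
# Minimal sketch of edge-side filtering: a wearable or gateway evaluates each
# reading locally and forwards only out-of-range events to the clinic or cloud.
# The normal ranges below are illustrative assumptions, not clinical guidance.

NORMAL_RANGE = {"heart_rate": (50, 110), "spo2": (92, 100)}

def filter_at_edge(readings):
    """Return only the readings that fall outside their normal range."""
    alerts = []
    for r in readings:
        lo, hi = NORMAL_RANGE[r["vital"]]
        if not lo <= r["value"] <= hi:
            alerts.append(r)
    return alerts

stream = [{"vital": "heart_rate", "value": 72},
          {"vital": "spo2", "value": 88},        # abnormal: gets forwarded
          {"vital": "heart_rate", "value": 135}]  # abnormal: gets forwarded
print(filter_at_edge(stream))
```

The design point is bandwidth and latency: most readings never leave the device, so the upstream system only sees the handful of events that need attention.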


What the 1930s can teach us about dealing with Big Tech today

Regulations alone, however, are not enough. Policy should enable more than it prevents. In the 1920s and 1930s, US legislators put this principle into practice. Following the 1929 stock market crash, it was clear that banks were not accountable to their clients, and there were huge swaths of the country that banks didn’t serve. In addition to new regulations that constrained the banks, the 1934 Federal Credit Union Act turned a few local experiments in community finance into a government-insured system. Member-owned, member-governed credit unions proliferated. They held banks to higher standards and brought financial services to places where there had been none. In similar fashion, two years later, the Rural Electrification Act helped bring electricity to farm country, where investor-owned utilities hadn’t bothered to string lines. Low-interest loans through the Department of Agriculture enabled communities to organize cooperatives—nearly 900 of which still operate today. The loan program now earns more than it costs. Like the housing policies of the time that gave us the 30-year mortgage, it was a public policy that enabled widespread private ownership.


Verizon Weaves IBM’s Enterprise Legacy Into IoT

Verizon’s new enterprise IoT deal with IBM is the second cloud-centric arrangement it’s announced this week. The operator tapped Google Cloud to pilot AI-driven services for customer service, but hasn’t yet committed to releasing the service commercially. The operator’s cloud deals with IBM and Google, however, are significantly more narrow than its partnership with Amazon Web Services (AWS), which marries Verizon’s 5G network with the No. 1 cloud hyperscaler’s Wavelength edge compute service to create Verizon Edge. AWS and Verizon have a “special relationship” that revolves around the combination of the cloud provider’s “crown jewels,” the AWS Wavelength service, with Verizon’s 5G network, Sowmyanarayan said. “The combination of bringing network IP and edge creates something that you cannot replicate with any amount of money unless you have those assets.” Verizon will explore an expansion of edge computing features with other cloud providers, but AWS remains the operator’s preferred and exclusive mobile edge computing partner, he said.


Blockchain and Interoperability: key to mass adoption

It is easy to see why interoperability for blockchain is not only desirable, but above all critical, in a world where enterprises depend on ever-greater levels of collaboration and interaction. In fact, interoperability is crucial in any software system – it simply won’t work to its full potential if it can’t work with other software. It is the only way for enterprises to realise the full promise of blockchain and get the most out of their blockchain investments. Interoperability would enable smooth information sharing, easier execution of smart contracts, a more user-friendly experience, the opportunity to develop partnerships, and the sharing of solutions. Especially in areas where the value chain is important, such as supply chain, trade finance, healthcare and aviation, one blockchain network will simply be unable to provide for all the needs of any given transaction. This calls for multiple networks, each providing specific value, and proper communication, so that data from private networks can be routed to other relevant networks for transactions “without having to establish a one-to-one integration”. “Everyone is dependent on physical goods’ ability to move across all participants in the global supply chain with minimal friction. 
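The routing “without having to establish a one-to-one integration” that the passage describes resembles a hub-and-adapter pattern: each network registers one adapter with a shared router rather than integrating pairwise with every other network. A minimal sketch, with all names hypothetical and no real blockchain framework assumed:

```python
# Illustrative hub-and-adapter sketch (not any real interoperability framework's
# API): each network plugs a single adapter into a shared router, so N networks
# need N adapters instead of N*(N-1) one-to-one integrations.

class InteropRouter:
    def __init__(self):
        self.adapters = {}          # network name -> delivery callback

    def register(self, network, deliver):
        self.adapters[network] = deliver

    def route(self, payload, targets):
        """Forward a transaction payload only to the relevant registered networks."""
        return {n: self.adapters[n](payload) for n in targets if n in self.adapters}

router = InteropRouter()
router.register("trade_finance", lambda p: f"trade_finance accepted {p['id']}")
router.register("supply_chain", lambda p: f"supply_chain accepted {p['id']}")

print(router.route({"id": "tx-42"}, targets=["supply_chain"]))
```

Each network only needs to understand the shared payload format and its own adapter, which is what keeps the integration cost linear rather than quadratic as more networks join.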


11 Things to Know About Software Quality Assurance

Software development QA professionals are also contentious by their very nature, which can often irritate developers. On the flip side, however, it does keep them on the straight and narrow without the need for micromanagement. There is a growing belief in the industry that developers are probably best placed to also provide quality assurance for the software they have developed. However, this can be something of a false economy. As in any creative role, in any industry, it can sometimes be difficult to critically assess something that you have created yourself. Software developers are often, to put it another way, too focussed on the finer details to see the effect on the bigger picture: the final software product. The process of software development quality assurance, inclusive of testing, can be performed either by a dedicated individual or by a small team, and can be accomplished either in-house or outsourced to independent entities. For best results, QA teams should work closely with developers, as this tends to form a more productive working environment for all involved. This also allows face-to-face conversations that can yield some interesting resolutions for the myriad problems that something as complex as software development inevitably runs into.


It’s coming home – securing the remote workforce

Working from home was adopted precisely because businesses wanted to keep operating. As such, employees need access to the materials they require to do their jobs, even if this includes sensitive information. The challenge is that once employees are outside the network, the risk of an insider threat increases significantly. Both remote working and the economic climate are going to continue to drive an increased chance of insider threats. Organisations need to make sure that their existing risk mitigation processes are applied to their new IT environments. Steps to take include modelling normal activity patterns, so that changes from this baseline can be monitored. Abnormally large amounts of data being transferred on or off the network can be an early indicator of compromise. It is important to recognise that insider threats are as much a cultural problem as a technological one. Businesses need IT and HR teams to work cross-functionally and ask themselves whether they are doing a good job of understanding their employees’ needs, whether their employees are engaged, and how to identify those who aren’t so they can work with them to improve their work experience.
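The baselining step mentioned above can be sketched as a simple statistical check over per-user transfer volumes. The three-sigma threshold and the sample volumes are illustrative assumptions:

```python
# Minimal sketch of baseline monitoring: model a user's normal daily outbound
# transfer volume, then flag values that deviate sharply from that baseline.
# The 3-sigma threshold is an illustrative assumption, not a recommended tuning.

from statistics import mean, stdev

def flag_anomalies(history_gb, recent_gb, sigmas=3.0):
    """Return the recent transfer volumes that sit far outside the baseline."""
    mu, sd = mean(history_gb), stdev(history_gb)
    return [v for v in recent_gb if abs(v - mu) > sigmas * sd]

baseline = [1.2, 0.9, 1.4, 1.1, 1.0, 1.3, 0.8]   # normal daily outbound GB
print(flag_anomalies(baseline, [1.1, 45.0]))      # the 45 GB day stands out
```

Real deployments would baseline per user and per time-of-day rather than globally, but the principle is the same: the alert fires on deviation from that user’s own history, not on an absolute threshold.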




Quote for the day:

"There is no "one" way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer