Daily Tech Digest - March 21, 2021

CompTIA creates blockchain industry group to promote new use cases

"With growing interest in the deployment of blockchain technology in business applications, the time is right for us to expand our offerings related to this emerging technology," said Nancy Hammervik, executive vice president of industry relations and CEO of the CompTIA Tech Careers Academy. Members of the group will gain access to resources and forums where the community will be able to promote different use cases, share ideas, hold in-depth discussions and make connections to network with peers. CompTIA already has the Blockchain Advisory Council and the technology industry group will build on the organization's expertise in the space. The council is currently made up of industry leaders and innovators while also finding ways for "technology companies and their customers can leverage blockchain technology in their businesses," according to a statement from the organization. "The Blockchain Technology Interest Group is designed for the curious and the experienced to share ideas and discussions related to blockchain technology," said Kathleen Martin, senior manager for CompTIA's member communities and technology interest groups.


The State of Serverless Computing 2021

Organizations going serverless with their backend services are billed by their serverless vendor based on their compute consumption; they do not have to reserve and pay for a fixed amount of bandwidth or number of servers, because the service auto-scales with incoming demand. In the early days of web development, developers had to own or rent physical hardware to run and test their application code. Running production applications was another nightmare, because they then had to keep those servers running for the lifecycle of the application. Cloud computing and virtualization brought much-needed relief: developers could now rent virtual servers from a cloud vendor according to their needs. The problem was they still had to over-provision to keep up with traffic spikes or their application would break down, so much of the server capacity they were paying for went to waste. Cloud vendors introduced auto-scaling compute models to mitigate the problem. However, auto-scaling in response to an unsolicited spike in traffic (such as a DDoS attack) could turn quite expensive.
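The billing contrast described above can be sketched with a toy cost model. All rates and workload numbers here are invented placeholders, not any vendor's actual pricing:

```python
# Illustrative comparison of reserved-server vs pay-per-use serverless billing.
# Prices and workload figures are made-up placeholders.

def reserved_cost(servers: int, hourly_rate: float, hours: float) -> float:
    """Fixed cost: you pay for every provisioned server, busy or idle."""
    return servers * hourly_rate * hours

def serverless_cost(invocations: int, gb_seconds_per_call: float,
                    price_per_gb_second: float) -> float:
    """Usage-based cost: you pay only for compute actually consumed."""
    return invocations * gb_seconds_per_call * price_per_gb_second

# A spiky workload: one million requests a month, each using 0.5 GB-seconds.
month_hours = 730
print(round(reserved_cost(4, 0.10, month_hours), 2))         # paid even when idle
print(round(serverless_cost(1_000_000, 0.5, 0.0000167), 2))  # scales with demand
```

The reserved model charges for capacity whether or not it is used; the usage-based model's bill tracks actual invocations, which is why an unsolicited traffic spike can inflate it.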


GDPR: Transferring Data Outside The EU

If you want to transfer or store data in other countries, you must ensure that there is adequate protection of the data. You’ll need to complete a risk assessment covering the nature of the personal data being transferred, the adequacy of the other organisation’s controls and the protections offered by their local legal system. In most circumstances, contractual protections will be sufficient. The UK Information Commissioner’s Office has some guidance on how to approach this. The EU has helpfully provided standard contractual clauses in the form of model contracts (for both Controllers and Processors), but these had not been fully updated for GDPR at the time of writing. Large multinationals have the option of using ‘binding corporate rules’ for transfers within the organisation, as well as a certification mechanism. The law firm Allen & Overy has produced a useful guide to binding corporate rules. There are also some exemptions and exceptions, such as specific consent, a contract, substantial public interest, law enforcement and the exchange of airline passenger data. The UK Information Commissioner’s Office has some useful guidelines.


Blockchain Tech: Path Towards Scalable Businesses

If the past of the Indian IT sector is a great story of resilience and growth, the future can be no less bright, though the sectors of growth will have to be identified, capacity-building sustained and the regulatory framework kept supportive. That is where blockchain tech becomes a great unfolding story for India. A country that became the back office of global IT giants has a new tale to tell. Millions of skilled yet affordable software developers need only a nudge to churn out new solutions using shared-ledger architecture. Before we jump into this new wealth era, a quick introduction to blockchain is in order. In a simple sense, blockchains are ledgers, but with the key properties of decentralization, programmability and antifragility. That transforms ledger databases into protocols for new scalable business models. That is why blockchain is called an institutional technology, one that can turn a set of fragmented stakeholders into unified business partners. This new model can be merged with AI (artificial intelligence) and IoT (internet of things) to create mind-blowing businesses. India stands at the edge of this era, ideally prepared to embrace it.
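The "shared ledger" idea at the core of this can be sketched in a few lines: each block commits to the hash of its predecessor, so rewriting history invalidates every later block. This is only the data structure; real blockchains add consensus, signatures and peer-to-peer replication on top.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Canonical JSON so the same block always hashes the same way.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    # Each new block stores the hash of the previous one (or zeros for genesis).
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def is_valid(chain: list) -> bool:
    # Valid only if every link matches the recomputed hash of its predecessor.
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
append_block(chain, "A pays B 10")
append_block(chain, "B pays C 4")
print(is_valid(chain))               # True
chain[0]["data"] = "A pays B 1000"   # tamper with history
print(is_valid(chain))               # False: every later link now breaks
```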


Distributed Ledger Technology

When it comes specifically to MSME financing, DLT brings a holy trinity of value: trust, transparency and traceability. These benefits can make it easier for MSMEs to build a digital credit history and for banks to assess MSME creditworthiness. Some DLT projects, such as the Europe-based we.trade, are focused specifically on serving MSME firms, amplifying the benefits of DLT for smaller firms that can access solutions tailored more specifically to their needs. The increased transparency provided by DLT can also make it easier for Tier 2 and lower suppliers, which are often small businesses, to access finance. Common supply chain finance solutions are usually only available to established Tier 1 suppliers, which are able to convince their big corporate buyers of their trustworthiness. By enhancing visibility into deeper-tier suppliers, DLT can make their access to finance easier. Companies such as Linklogis and Skuchain are leveraging DLT to this effect. Another potential benefit of DLT for MSMEs is its ability to allow traditional processes or sources of finance to be bypassed.


Uncertainty around India's crypto policy is making blockchain firms anxious

Optimism started to rebuild, and surging Bitcoin prices began to lure millennials. When it comes to transferring Bitcoin and other digital assets, India has lately been providing more volume than China on popular peer-to-peer platforms. The risk that India would hit back with a new law making criminals out of crypto professionals and investors was always present. So practitioners tried to educate policymakers, appealing for sensible regulation starting with definitions: what is a utility token, which digital assets are to be viewed as securities, and which are to be treated as currencies. The trouble is with bureaucrats. They say they want blockchain, but not cryptocurrencies. It’s as silly as wanting airports with duty-free shops but no flights. From the Reuters story, it doesn’t appear that the final regulation will be much different from what a draft bill recommended in 2019. A government panel report, which provided the backdrop for the draft legislation, said that authorities would be fine with distributed ledger technologies for delivery of any services, or “for creating value,” without involving cryptocurrencies “for making or receiving payment.”


Technical debt is costing banks innovation and agility. Can cloud help?

The reason banks often do nothing to address it is that they convince themselves they would have to rewrite a 40-year-old platform, and then they’re looking at a hundred-million-dollar price tag. So they kick the can down the road. But the longer the debt persists, the greater the consequences. Banks become less agile, less able to innovate. They become more vulnerable to cybersecurity breaches. Addressing those breaches can drain the development budget, so that all the firm can afford to do is fix emergencies rather than deliver new capabilities, entrenching it in a vicious cycle. Right! That’s actually one of the biggest drains of technical debt: not money but people. Technical debt is a talent issue, too. The more antiquated code a company struggles to maintain, the more it inhibits the modern tooling and services that developers want to use to build applications. A good developer can work anywhere, and they won’t choose a place where the environment is dated when they can go work at an ultra-modern company with an engaging culture. Banks tend to have the mindset of, “It’s going to cost a boatload of money that I’ll never get approved if I have to replace or rewrite thousands of applications.” That’s where we come in and say: you don’t have to rewrite all of this.


What is Azure Blockchain Service?

Azure Blockchain Service is designed to support multiple ledger protocols. Currently, it supports the Ethereum Quorum ledger using the Istanbul Byzantine Fault Tolerance (IBFT) consensus mechanism. These capabilities require almost no administration and are provided at no additional cost, so you can focus on app development and business logic rather than allocating time and resources to managing virtual machines and infrastructure. In addition, you can continue to develop your application with the open-source tools and platform of your choice, delivering your solutions without having to learn new skills. Azure Blockchain Service is deployed through the Azure portal, the Azure CLI, or Visual Studio Code using the Azure Blockchain extension. Deployment is simplified, provisioning transaction and validator nodes, Azure Virtual Networks for security isolation, and service-managed storage. In addition, when deploying a new blockchain member, users also create or join a consortium. Consortiums enable multiple parties in different Azure subscriptions to communicate securely with one another on a shared blockchain.


Make Agile a stepping stone toward future fit adaptability

For many tech leaders, being adaptive is synonymous with business agility. However, the execution engine for adaptability is your software development and delivery capability: your business can't be adaptive if you don't have great software delivery capabilities. That's why "Agile" has become foundational to being adaptive and to achieving business agility. Forrester research shows that over 72% of enterprise development leaders had adopted Agile practices or were planning to become more Agile in 2019–2020. The Agile Manifesto, published almost 20 years ago, is still the cornerstone for any truly Agile organization. The first point is this: being truly Agile goes beyond the Webster dictionary definition of the adjective. Webster defines agile as "having a quick resourceful and adaptable character" or "marked by ready ability to move with quick easy grace." Agile as defined by the Agile Manifesto carries a broader meaning: the core values and the 12 principles. The Manifesto's definition treats the dictionary sense as a necessary condition, but not a sufficient one. So, what does it mean for enterprises in 2021 to be truly Agile and therefore adaptive? Start by going beyond merely adopting agile (lowercase), and develop your cultural DNA and organizational strategy around the values and principles established in the manifesto.


New Malware Hidden in Apple IDE Targets macOS Developers

The malware is executed when a developer using the Trojanized version of the TabBarInteraction Xcode project launches what is known as the build target in Xcode. The XcodeSpy malware contacts the attacker's command-and-control (C2) server and drops the EggShell backdoor on the development machine, SentinelOne said in a report this week. "An Xcode project is a repository for all the files, resources, and information required to build one or more software products," says SentinelOne researcher Phil Stokes. "A project contains all the elements used to build a product and maintain the relationships between those elements." Injecting malware into an Xcode project gives attackers a way to target developers and potentially backdoor the developers' apps and the customers of those apps, he says. With XcodeSpy itself, though, the attackers appear to be directly targeting only the developers themselves, according to SentinelOne. The security vendor said a sample of XcodeSpy was found on a US-based victim's Mac in late 2020. The company's report did not disclose the identity of the victim but described the organization as a frequent target of North Korean advanced persistent threat actors.



Quote for the day:

"Leadership has a harder job to do than just choose sides. It must bring sides together." -- Jesse Jackson

Daily Tech Digest - March 20, 2021

‘Black Mirror’ or better? The role of AI in the future of learning and development

An AI-assisted learning-development tool can search a variety of sources internally or externally to find content that is relevant to a particular learning or performance outcome. Digital marketers and online publishers have been using AI to generate content for simple stories for years now. Odds are you have read an online article or blog post created by a bot and didn’t even realize it. In the learning space, there are tools such as Emplay and IBM’s Watson that can support this. For example, let’s say a designer wants to create a quick microlearning on how a vacuum pump works. The designer could engage an AI bot to crawl internal or external networks for potential resources — including videos and images. The AI agent then analyzes them, aligning pieces to specific learning outcomes, prioritizing resources for relevance and tagging them by modality. Ultimately, this would free up the designer to focus more on learner-centric design and delivery. ... As you can see, there are many potential benefits to the adoption of AI in the learning space. However, before we invest in AI, it is important to first explore the risks and practical issues of adopting AI across the enterprise.


Facebook is making a bracelet that lets you control computers with your brain

The wristband, which looks like a clunky iPod on a strap, uses sensors to detect movements you intend to make. It uses electromyography (EMG) to interpret electrical activity from motor nerves as they send information from the brain to the hand. The company says the device, as yet unnamed, would let you navigate augmented-reality menus by just thinking about moving your finger to scroll. A quick refresher on augmented reality: It overlays information on your view of the real world, whether it’s data, maps, or other images. The most successful experiment in augmented reality was Pokémon Go, which took the world by storm in 2016 as players crisscrossed neighborhoods in search of elusive Pokémon characters. That initial promise has faded over the intervening years, however, as companies have struggled to translate the technology into something appealing, light, and usable. Google Glass and Snap Spectacles bombed, for example: people simply did not want to use them. Facebook thinks its wristband is more user friendly. Too soon to tell. The product is still in research and development at the company’s internal Facebook Reality Labs, and I didn’t get to have a go.


Uncertainty And Innovation At Speed

As uncertainty continues to rise and the unexpected becomes more common, organizations may not always have the luxury to conduct extensive analysis before acting. Indeed, high uncertainty and rapid change tend to reduce the relevance of the data that companies may have traditionally used for planning. They may need to place bets on multiple possible futures. Above all, they will need capacity for rapid innovation—every day, not just in a crisis. Companies executed the rapid innovations described above by repurposing existing knowledge, resources and technology. A recent article suggests that organizations in all industries may be able to use repurposing to achieve ultrafast innovation to develop new solutions to our current and future challenges. Some innovation thinkers take inspiration from venture capital. Venture capital firms tend to tie funding to the achievement of milestones that reduce investment risk, such as proving technical feasibility or product-market fit. This approach instills a sense of urgency in startup companies: Their very survival may depend on achieving a funding milestone. A crisis such as the COVID-19 pandemic can produce a sense of urgency in even large organizations. But banking on effective innovation in response to a crisis is not a robust strategy.


What is data governance? A best practices framework for managing data assets

When establishing a strategy, each of the above facets of data collection, management, archiving, and use should be considered. The Business Application Research Center (BARC) warns it is not a “big bang initiative.” As a highly complex, ongoing program, data governance runs the risk of participants losing trust and interest over time. To counter that, BARC recommends starting with a manageable or application-specific prototype project and then expanding across the company based on lessons learned. ... Most companies already have some form of governance for individual applications, business units, or functions, even if the processes and responsibilities are informal. As a practice, it is about establishing systematic, formal control over these processes and responsibilities. Doing so can help companies remain responsive, especially as they grow to a size in which it is no longer efficient for individuals to perform cross-functional tasks. ... Governance programs span the enterprise, generally starting with a steering committee comprising senior management, often C-level individuals or vice presidents accountable for lines of business. 


How Google's balloons surprised their creator

In the AI community, there's one example of AI creativity that seems to get cited more than any other. The moment that really got people excited about what AI can do, says Mark Riedl at the Georgia Institute of Technology, is when DeepMind showed how a machine learning system had mastered the ancient game Go – and then beat one of the world's best human players at it. "It ended up demonstrating that there were new strategies or tactics for countering a player that no one had really ever used before – or at least a lot of people did not know about," explains Riedl. And yet even this, an innocent game of Go, provokes different feelings among people. On the one hand, DeepMind has proudly described the ways in which its system, AlphaGo, was able to "innovate" and reveal new approaches to a game that humans have been playing for millennia. On the other hand, some questioned whether such an inventive AI could one day pose a serious risk to humans. "It's farcical to think that we will be able to predict or manage the worst-case behaviour of AIs when we can't actually imagine their probable behaviour," wrote Jonathan Tapson at Western Sydney University after AlphaGo's historic victory.


AI Can Help Companies Tap New Sources of Data for Analytics

Just as Google applications can tell you, on the basis of your home address, calendar entries, and map information, that it’s time to leave for the airport if you want to catch your flight, companies can increasingly take advantage of contextual information in their enterprise systems. Automation in analytics — often called “smart data discovery” or “augmented analytics” — is reducing the reliance on human expertise and judgment by automatically pointing out relationships and patterns in data. In some cases the systems even recommend what the user should do to address the situation identified in the automated analysis. Together these capabilities can transform how we analyze and consume data. Historically, data and analytics have been separate resources that needed to be combined to achieve value. If you wanted to analyze financial or HR or supply chain data, for example, you had to find the data — in a data warehouse, mart, or lake — and point your analytics tool to it. This required extensive knowledge of what data was appropriate for your analysis and where it could be found, and many analysts lacked knowledge of the broader context. However, analytics and even AI applications can increasingly provide context. 
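A minimal sketch of the "smart data discovery" idea described above: automatically scan a table for strongly correlated column pairs and surface them to the analyst. The column names and figures below are invented for illustration; production tools do this over warehouses at scale.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def flag_relationships(table: dict, threshold: float = 0.9):
    """Return column pairs whose |correlation| clears the threshold."""
    cols = list(table)
    return [(a, b, round(pearson(table[a], table[b]), 2))
            for i, a in enumerate(cols) for b in cols[i + 1:]
            if abs(pearson(table[a], table[b])) >= threshold]

data = {
    "ad_spend":  [10, 20, 30, 40, 50],
    "revenue":   [12, 24, 31, 45, 52],
    "headcount": [ 7,  3,  9,  4,  6],
}
print(flag_relationships(data))   # only ad_spend vs revenue clears the bar
```

The point of the automation is the last line: the analyst is told *which* relationship to look at, instead of having to know in advance where to point the tool.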


5 Reasons to Make Machine Learning Work for Your Business

The original promise of machine learning was efficiency. Even as its uses have expanded beyond mere automation, this remains a core function and one of the most commercially viable use cases. Using machine learning to automate routine tasks, save time and manage resources more effectively has a very attractive pair of side effects for enterprises that do it well: reducing expenses and boosting net income. The list of tasks that machine learning can automate is long. As with data processing, how you use machine learning for process automation will depend on which functions exert the greatest drag on your time and resources. ... Machine learning has also proven its worth in detecting trends in large data sets. These trends are often too subtle for humans to tease out, or perhaps the data sets are simply too large for “dumb” programs to process effectively. Whatever the reason for machine learning’s success in this space, the potential benefits are clear as day. For example, many small and midsize enterprises use machine learning to predict and reduce customer churn, looking for signs that customers are considering competitors and triggering retention processes with higher probabilities of success.
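A churn-prediction sketch along the lines just described. The warning signals and weights are invented for illustration; a real system would learn them from historical churn data with a trained classifier rather than hand-coding them.

```python
# Hypothetical churn warning signals and weights (invented, not learned).
CHURN_SIGNALS = {
    "logins_dropped":         0.35,  # usage falling week over week
    "support_complaints":     0.25,  # recent negative tickets
    "competitor_page_visits": 0.25,  # browsing rival pricing pages
    "contract_ending_soon":   0.15,  # renewal window open
}

def churn_risk(customer: dict) -> float:
    """Return a 0..1 risk score from boolean warning signals."""
    return sum(w for sig, w in CHURN_SIGNALS.items() if customer.get(sig))

def needs_retention_outreach(customer: dict, threshold: float = 0.5) -> bool:
    """Trigger the retention process once risk clears a threshold."""
    return churn_risk(customer) >= threshold

at_risk = {"logins_dropped": True, "competitor_page_visits": True}
print(round(churn_risk(at_risk), 2))
print(needs_retention_outreach(at_risk))   # True: trigger outreach
```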


Accelerating data and analytics transformations in the public sector

Too often, the lure of exciting new technologies influences use-case selection—an approach that risks putting scarce resources against low-priority problems or projects losing momentum and funding when the initial buzz wears off, the people who backed the choice move on, and newer technologies emerge. Organizations can find themselves in a hype cycle, always chasing something new but never achieving impact. To avoid this trap, use cases should be anchored to the organization’s (now clear) strategic aspiration, prioritized, then sequenced in a road map that allows for deployment while building capabilities. There are four steps to this approach. First, identify the relevant activities and processes for delivering the organization’s mission—be that testing, contracting, and vendor management for procurement, or submission management, data analysis, and facilities inspection for a regulator—then identify the relevant data domains that support them. ... Use cases should be framed as questions to be addressed, not tools to be built. Hence, a government agency aspiring to improve the uptime of a key piece of machinery by 20 percent while reducing costs by 5 percent might first ask, “How can we mitigate the risk of parts failure?” and not set out to build an AI model for predictive maintenance.
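Framed as the question "how can we mitigate the risk of parts failure?", a first-cut answer might flag a machine for inspection when its sensor readings drift from their recent baseline. The readings, window and tolerance below are invented; a production system would use a trained anomaly-detection or survival model rather than a fixed band.

```python
def drift_alerts(readings, window=3, tolerance=0.2):
    """Flag indices whose reading deviates more than `tolerance`
    (as a fraction) from the trailing mean of the last `window` readings."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if abs(readings[i] - baseline) / baseline > tolerance:
            alerts.append(i)
    return alerts

# Hypothetical vibration readings (mm/s) from a monitored pump.
vibration_mm_s = [2.0, 2.1, 1.9, 2.0, 2.1, 3.1, 2.0]
print(drift_alerts(vibration_mm_s))   # flags the spike, prompting inspection
```

Note how the code answers the *question* (which machines deserve attention?) rather than committing upfront to a particular modeling tool.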


Quantum computing breaking into real-world biz, but not yet into cryptography

A Deloitte Consulting report echoed Baratz's views, stating that quantum computers would not be breaking cryptography, or running at computational speeds sufficient to do so, anytime soon. However, it said quantum systems could pose a real threat in the long term, and it was critical that preparations be made now to plan for such a future. On its impact on Bitcoin and blockchain, for instance, the consulting firm estimated that 25% of Bitcoins in circulation were vulnerable to a quantum attack, pointing in particular to coins currently stored in P2PK (Pay to Public Key) and reused P2PKH (Pay to Public Key Hash) addresses. These are potentially at risk because their public keys either can be obtained directly from the address or were made public when the Bitcoins were spent. Deloitte suggested one way to plug such gaps was post-quantum cryptography, though these algorithms could pose other challenges to the usability of blockchains. Noting that this new form of cryptography was currently being assessed by experts, it said: "We anticipate that future research into post-quantum cryptography will eventually bring the necessary change to build robust and future-proof blockchain applications."
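The P2PK/P2PKH distinction Deloitte points to can be illustrated in a few lines. One simplification to note: real Bitcoin addresses hash the public key with SHA-256 followed by RIPEMD-160 and then encode the result; plain SHA-256 is used here to keep the sketch dependency-free, and the key bytes are a placeholder.

```python
import hashlib

# Placeholder for a 33-byte compressed public key (not a real key).
public_key = b"\x02" + bytes(32)

# P2PK: the locking script contains the public key itself, so it is
# on-chain and visible from day one.
p2pk_script_reveals = public_key

# P2PKH: only a hash of the key is on-chain; the key itself is revealed
# only in the spending transaction. (Real Bitcoin: RIPEMD-160(SHA-256(key)).)
p2pkh_commitment = hashlib.sha256(public_key).digest()

# A quantum attacker needs the public key to derive the private key:
print(p2pk_script_reveals == public_key)   # True: P2PK exposes it up front
print(p2pkh_commitment == public_key)      # False: P2PKH hides it until spend
```

This is why *reused* P2PKH addresses are also flagged: once coins at an address have been spent, the key is public, and any remaining funds at that address lose the hash's protection.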


What Is Open RAN (Radio Access Network)?

Open radio access network (RAN) is a term for industry-wide standards for RAN interfaces that support interoperation between vendors’ equipment. The main goal of open RAN is an interoperability standard for RAN elements such as non-proprietary white-box hardware and software from different vendors. Network operators that opt for RAN elements with standard interfaces can avoid being stuck with one vendor’s proprietary hardware and software. Open RAN is not inherently open source. The open RAN standards instead aim to undo the siloed nature of the RAN market, where a handful of RAN vendors offer only equipment and software that is totally proprietary. The open RAN standards being developed use virtual RAN (vRAN) principles and technologies, because vRAN brings features such as network malleability, improved security, and reduced capex and opex. ... An open RAN ecosystem gives network operators more choice in RAN elements. With a multi-vendor catalog of technologies, network operators have the flexibility to tailor the functionality of their RANs to their needs. Total vendor lock-in is no longer an issue when organizations are able to go beyond one RAN vendor’s equipment and software stack.



Quote for the day:

"Challenges in life always seek leaders and leaders seek challenges." -- Wayde Goodall

Daily Tech Digest - March 19, 2021

Are A Conscious Artificial Intelligence & Smart Robots Possible?

It would be like teaching a kid by showing a picture of a horse and then a rhino, and then telling him a unicorn is something between these two, so he could mostly identify it without ever having seen a picture of one. The machine would be programmed so that it does not erase the earlier data (a failure known as "catastrophic forgetting") but, like the brain, has the capability of "continual learning" through selective activation of cells and overlapping networks, using existing information to analyse the next dataset ("transfer learning"). Moreover, efforts are underway to teach the machine with just one or two examples, rather than the millions of correct examples needed earlier, which made the computation enormous and actually limited the capability of the machine. Human beings can multi-task effortlessly, switching efficiently between frying an egg, working in an office, playing badminton and writing music, without compromising any of these activities individually. The UChicago researchers have developed "context-dependent gating" and "synaptic stabilization", entailing activation of a random 20 percent of a neural network for each new task; a single node may be involved in dozens of operations, allowing the network to learn as many as 500 tasks with only a small decrease in accuracy.
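A toy version of context-dependent gating, following the 20-percent figure above: each task gets its own fixed random subset of active units, so different tasks write to mostly non-overlapping parts of the network and earlier tasks are not overwritten. Everything beyond that figure (sizes, seeding scheme) is invented for illustration.

```python
import random

def task_mask(n_units: int, task_id: int, frac: float = 0.2) -> set:
    """Deterministic per-task gate: the same task always activates
    the same random subset of hidden units."""
    rng = random.Random(task_id)   # seed by task id -> reproducible mask
    return set(rng.sample(range(n_units), int(n_units * frac)))

n_units = 100
mask_a = task_mask(n_units, task_id=0)
mask_b = task_mask(n_units, task_id=1)
print(len(mask_a))            # 20 active units per task
print(mask_a != mask_b)       # different tasks gate different subsets
print(len(mask_a & mask_b))   # small chance overlap, where units are shared
```

Because any given unit sits in only a few task masks, learning task B perturbs few of the weights that task A relies on, which is the mechanism behind the reported ~500 tasks with little accuracy loss.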


New phishing campaign targets taxpayer credentials

The scam could result in steep financial losses for taxpayers. Last year alone, the IRS identified more than $2.3 billion in tax fraud schemes. The new infection process is designed to evade antivirus tools and tricks targets into installing the malware via a tax-themed Word document containing a malicious macro that downloads an OpenVPN client onto the targeted machine. The malware dropper establishes a connection to the legitimate cloud service imgur and downloads the NetWire or Remcos payloads by way of a technique called steganography, where the malicious code is hidden within an innocuous-looking JPEG image file. ... The malware includes a variety of functions, including remote execution of shell commands on the infected machine, browser credential and history theft, downloading and execution of additional malware payloads, screen captures and keylogging, as well as file and system management capabilities. Both NetWire and Remcos are commercial RATs available online for as little as $10 per month, and both follow the Malware-as-a-Service (MaaS) model, offering their customers subscription-based services with a choice of licensing plans, 24/7 customer support and periodic software updates.
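Steganography of this kind can be illustrated harmlessly: hide payload bytes in the least-significant bits of pixel values, where they barely change the image. This toy uses a plain list of ints as the "image" and a short string as the payload; the real campaign hid executable code inside JPEG data served from imgur.

```python
def hide(pixels: list, payload: bytes) -> list:
    """Write each payload bit (LSB-first per byte) into pixel LSBs."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    return [(p & ~1) | bit for p, bit in zip(pixels, bits)] + pixels[len(bits):]

def extract(pixels: list, n_bytes: int) -> bytes:
    """Reassemble n_bytes from the LSBs, matching hide()'s bit order."""
    bits = [p & 1 for p in pixels[:n_bytes * 8]]
    return bytes(sum(bit << i for i, bit in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))

cover = list(range(64))                  # stand-in for pixel intensities
stego = hide(cover, b"hi")
print(extract(stego, 2))                                   # b'hi'
print(max(abs(a - b) for a, b in zip(cover, stego)) <= 1)  # True: change <= 1
```

Each pixel changes by at most 1, which is why antivirus tools scanning the file see an ordinary-looking image.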


Digital transformation: 4 strategy questions to ask

Internal buy-in is the most important aspect of any digital transformation and adoption strategy, and the easiest way to help promote that is to identify internal champions. Clearly defining the team responsible for the implementation of a new tool or process will help give an incentive for that team to ensure adoption is prioritized throughout the organization. It will also help clarify where employees can direct questions. ... Training teams on new processes and tools is easier said than done. It’s important to find a better way to train, not just to ensure that digital transformation is successful and to make sure training really sinks in, but also to make sure your employees feel supported. Building effective training programs is a great way to show your employees that you’re invested in their success and their careers more broadly – helping to increase retention. ... Finally, be sure to set measurable, attainable goals around your digital transformation strategy. These may look vastly different from tool to tool or organization to organization, but adoption will increase if every user understands how transformation efforts will be evaluated.


5 Ways Machine Learning Is Revolutionizing the Healthcare Industry

Machine learning has established new methods in drug discovery, such as precision medicine and next-generation sequencing, which can help ensure a drug has the intended effect on patients. With machine learning techniques, medical experts can develop algorithms to track disease progression and design specific treatments for each patient, such as those with Type 2 diabetes. ... Machine learning aids medical experts in determining the risk for each patient, based on their symptoms, past medical records and family history. ML streamlines the process of finding treatments for evolving illnesses, helping researchers track possible pandemics and better understand why some diseases are more prevalent in certain cultures and demographics. ... Modern hospitals are high-tech environments run by advanced machines and trained staff. Hospitals are increasingly shifting towards automation, towards a future where diagnoses can be made accurately. Machine learning can accelerate disease diagnostics and reduce the risk of misdiagnosis.


Apps that help parents protect kids from cybercrime may be unsafe too

Parental control apps need many permissions to access particular systems and functions on devices: 80% of parental control apps request access to location, contacts and storage. While these permissions help the apps carry out detailed monitoring, some of them may not be necessary for the app to function as described. For instance, several apps designed to monitor children’s online activity ask for permissions such as “read calendar”, “read contacts” and “record audio”, none of which are justified in the app description or the privacy policy. Many are considered “dangerous permissions”, which means they are used to access information that could affect the user’s privacy and make their device more vulnerable to attack. For example, Boomerang requests more than 91 permissions, 16 of which are considered “dangerous”. The permission “access fine location”, for instance, allows the app to access the precise geographic location of the user. “Read phone state” allows the app to know your phone number, network information and the status of outgoing calls. It’s not just the apps that get that information. Many of these apps embed data-hungry third-party software development kits (SDKs). SDKs are sets of software tools and programs used by developers to save them from tedious coding.


Company uses cognitive neuroscience to help train police officers

The brain has two systems: what you're consciously aware of, and then the non-conscious part, where almost everything happens. That's where your drives and biases and urges and impulses all come up from, in what we call the backchannel of the brain, the non-conscious part of the brain. Most of our actions and behaviors are initiated there, often without our awareness. When we are thinking through something, thinking through answers or trying to problem-solve, we can direct our conscious brain to override some of those impulses and urges and really take control of what we're doing, what we're thinking and how we're behaving. But when we're under stress, our brain is built for the non-conscious brain to take over, to help us with survival, or to help us get out of a scrape, something like that. With police officers, who have been trained how to respond in certain ways to help people out and to de-escalate events, when stress starts to rise and get higher, their non-conscious brain really starts to take over the processing. And that's why even really good officers can do things they later regret: the non-conscious urges and impulses to say something or do something take over in the moment.


How 4 cities are modernizing their IT infrastructure through the cloud

The city's cybersecurity team leads threat management and operates a 24-hour security operations center. The team works with more than 100 city agencies and offices to ensure systems are built and operated securely, so that public assistance and healthcare are not compromised. NYC Cyber Command also manages the NYC Secure app, which alerts users to unsecured Wi-Fi networks, unsafe Android apps and system tampering. The team uses cloud infrastructure to find and mitigate threats. Cyber Command uses a variety of Google Cloud services, including Cloud Storage, Compute Engine, Kubernetes Engine and Workspace, and uses BigQuery to analyze batch and streaming data. When the pandemic started, DC Water already had 90% of the organization's systems in the cloud, according to a blog post from Microsoft. The final step was moving in-person operations and services. The organization worked with ESRI to move applications, operational processes and customer requests to Azure. Goals for this work included improving data security and replacing paper processes with digital ones. Durmus Cesur, the manager of work and asset management for DC Water, told Microsoft in the blog post that Azure was the best solution to provide continuous availability and scalability.


Ransom Payments Have Nearly Tripled

A new report from Palo Alto Networks — which uses data from ransomware investigations, data-leak sites, and the Dark Web — found 337 victims in 56 industries, with manufacturing, healthcare, and construction companies suffering 39% of ransomware attacks in 2020. In addition, ransom demands skyrocketed during the year, doubling both the highest ransom demand, to $30 million, and the highest-known paid ransom, to $10 million. The average victim paid more than $312,000, almost a third of the average demand. ... The Palo Alto report combines two sources of threat intelligence: 252 incidents investigated by the company's data-breach response service over the past two years, and a survey of public leak sites and the Dark Web. Almost two-thirds of the incident response cases investigated by the company came from one of four industries in 2020: healthcare, manufacturing, information technology, or construction. ... "As organizations shifted to remote workforces due to the COVID-19 pandemic, ransomware operators adapted their tactics accordingly, including the use of malicious emails containing pandemic-based subjects and even malicious mobile apps claiming to offer information about the virus," the company stated.


Importance of Teaching Data Science in CS Programs

Besides being a lucrative career, data science is among the careers of tomorrow. New innovations in the industrial sectors are highly reliant on data. Technology is becoming more dynamic, and more data is generated as more people access the internet. With huge amounts of data, industries rely on data scientists to make smart business decisions. In the current digital world, data literacy is very important: people should learn how to generate meaningful insights from raw data. Data is a largely untapped resource that can be used to develop various sectors. Fortunately, with the advent of machine learning technologies, organizations can predict and classify information accurately and intelligently. Data science, machine learning, and similar technologies are subsets of artificial intelligence, which is the driving force behind future products such as self-driving cars and autonomous robots. Such developments are not fiction anymore. The emergence of reinforcement learning and natural language processing has also contributed to these advancements. ... The importance and urgency of data science in the 21st century cannot be ignored. From providing insights and statistics to aiding decision-making and hiring, data science is enormously valuable.


Five Steps To Thinking Like A Software Company

Leading companies feature software stacks that are modular, facilitating rapid innovation. Their developers frequently build in-house software products or platforms by leveraging free but valuable open-source software, as well as licensed components for routine functionality. This allows them to create applications faster. One executive stressed the importance of designing components with change in mind, because reconfiguring is always better than rewriting code. Another executive told me that every line of code within this decentralized architecture has a clear owner, so there is specific responsibility for each and every software component. To be clear, commercial solutions have an important role to play and should be part of the software stack. But it’s the own-account software (the code a company writes for itself) that matters most. ... In contrast, firms that lead with code typically begin by aiming to solve a focused business problem. They build and iterate on new features and products. Executives at these firms told me that until you try something out and see how your customers, suppliers, or employees react, and whether your business improves as a result, you can’t be sure of what to build.



Quote for the day:

"If you don't start somewhere, you're gonna go nowhere." -- Bob Marley

Daily Tech Digest - March 18, 2021

Confidential computing – the next frontier in security

“Confidential computing is a technique for securing data while in use by creating secure spaces that users, rather than administrators, control,” says Martin O’Reilly, director of research engineering at the Alan Turing Institute. “The idea is to create a trusted execution environment (TEE) or secure enclave, where the data is only accessible by a specific application or user, and only as the data is being processed.” ... Confidential computing’s ‘360-degree protection’ enables data to be processed within a limited part of the computing environment, giving organisations the ability to reduce exposure to sensitive data while also providing greater control and transparency, even allowing businesses to share data securely for joint processing. This represents a significant change, says O’Reilly, pointing out that the ability to create secure spaces where the user controls who has access to the data effectively replicates the trust companies might have in their own IT departments. He notes, however, that the advantages should be weighed against the complexities involved in setting up and managing these technologies.


Ethical AI – Can India be the leader?

The warning signs that have emerged should make the leadership teams within all financial institutions sit up and take serious notice. With enough success achieved in customer acquisition, it's now time to focus energies on creating systems that serve the financial institutions' best interests and use AI as an effective tool for customer engagement rather than coercive intrusion. AI can indeed be moulded into a great tool for enhanced customer success, although the possibilities have not been fully explored. Ethical AI is the practice of evaluating the ethical quality of the impact of a model's predictions on human life. Debt collection is a good place to develop this practice, because human debt collectors with performance targets typically find it hard to navigate the moral quagmire involved in dealing with financial stress. India, the land of spirituality, is well-positioned to become a leader in the practice of Ethical AI. The day is not far when the practice of Ethical AI will be a key differentiator for AI platforms. On the policy side, countries across the globe are also working on data protection laws modelled on the EU's GDPR, which would give customers a legal Right to Explanation.


The Progressive Web Apps Guide You’ve Been Looking For

PWAs, or Progressive Web Apps, are one of the hottest topics in web development these days. They promise users a native-app-like experience and features, but are still built with common HTML, CSS, and JavaScript (or you can go even further by using any JavaScript framework), how cool is that! ... In simpler words, progressive web apps are typical web apps, built using the same old HTML, CSS, and JavaScript but with additional capabilities such as offline mode, A2HS (add to home screen), background syncing, push notifications, etc. They look and feel like native apps, but they ain’t! ... There are a few things to note about service workers. A service worker is a JavaScript worker, so it can’t access the DOM directly; instead, it communicates with the pages it controls by responding to messages sent via the postMessage interface, and those pages can manipulate the DOM if needed. Service workers also make heavy use of promises, since they run in the background. What a service worker mainly does is listen for events that happen in the browser and provide a callback for each event we’re interested in.


The path to .NET 5 and Blazor WebAssembly with some fun sprinkled in

One extremely interesting announcement was the new release of Blazor WebAssembly. Blazor lets you build interactive web UI with C# instead of JavaScript. Blazor WebAssembly allows users to build a completely isolated application entirely in C# that can run in nearly every web environment, including environments that only support static sites (think only HTML, CSS and JavaScript). Blazor WebAssembly does this by compiling all the C# code needed to run your application (your code and the .NET libraries) so that it executes in the browser via WebAssembly, not on a server somewhere. This is valuable in scenarios where your app needs to run offline, or completely decoupled from a server, and only needs a server when it requests data from outside the application (similar to how many JavaScript-based applications work). Because the application runs in the browser, render times of the UI are near instantaneous, allowing for a great end-user experience. To see these benefits for ourselves, we decided to port a heavily used application to .NET 5 and Blazor WebAssembly, and not only reap these benefits but document the process one would take moving an existing .NET Core application using Blazor Server to .NET 5 using Blazor WebAssembly.


Are Data Engineering Jobs Getting More Popular Than Data Science Jobs?

While some of their activities might overlap, data engineers are primarily about moving and transforming data into pipelines for the data science team. Put simply, data engineers have three critical tasks: design, build and arrange data pipelines. In contrast, data scientists analyse, test, aggregate and optimise data. ... Data engineers essentially collect, generate, store, enrich and process data in real time or in batches. Data engineering involves building data infrastructure and data architecture. Data engineers require experience in software engineering, a firm grip on core technical skills, and an understanding of ETL, SQL, and programming languages such as Java, Scala, C++, and Python. ... The data science strategy in an organisation deals with data infrastructure, data warehousing, data mining, data modelling, data crunching, and metadata management, most of which are carried out by data engineers. Studies suggest most data science projects fall through as data engineers and data scientists find themselves at cross purposes. While most companies are starting to realise the importance of hiring data engineers, many still fail to recognise it, and the talent shortage is all too real.
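The design/build/arrange split can be made concrete with a toy pipeline. This is a minimal sketch, not a production pattern: the extract step is a hard-coded stand-in for a real source, and the load target is an in-memory SQLite table.

```python
# Toy extract-transform-load pipeline. Real pipelines read from production
# sources and load into a warehouse; here the source is a list of dicts
# and the sink is SQLite.
import sqlite3

def extract():
    # Stand-in for pulling raw records from an API, log stream, etc.
    return [
        {"user": "alice", "amount": "42.50", "currency": "usd"},
        {"user": "bob", "amount": "n/a", "currency": "usd"},   # bad record
        {"user": "carol", "amount": "7.00", "currency": "eur"},
    ]

def transform(rows):
    # Clean and normalize: drop unparseable amounts, upper-case currency.
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter queue
        out.append((r["user"], amount, r["currency"].upper()))
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user TEXT, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0])  # 2 clean rows
```

The same three functions scale conceptually to the real thing: extraction from production systems, transformation with a framework like Spark, and loading into a warehouse.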


Using Machine Learning in Testing and Maintenance

With machine learning, we can reduce maintenance effort and improve product quality. It can be used in various stages of the software testing life cycle, including bug management, an important part of the chain. Machine learning algorithms let us analyze large amounts of data to classify, triage, and prioritize bugs more efficiently. Mesut Durukal, a test automation engineer at Rapyuta Robotics, spoke at Aginext 2021 about using machine learning in testing. Durukal uses machine learning to classify and cluster bugs. Bugs can be classified by severity level or by the responsible team or person. Severity assignment is called triage and is important for prioritization, while assigning bugs to the correct team or person prevents wasted time. Clustering bugs helps to see whether they heap up around specific features. Exploring the available data on bugs with machine learning algorithms gave him more insight into the health of their products and the effectiveness of the processes that were used.
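As a rough illustration of classifying and clustering bugs, here is a deliberately simple stand-in: keyword-based severity triage plus grouping by feature. Durukal's actual approach uses ML models trained on historical bug data; the keyword lists and sample bugs below are hypothetical.

```python
# The triage/clustering idea in miniature: assign a severity to each bug,
# then group bugs by feature to see where they heap up.
from collections import defaultdict

SEVERITY_KEYWORDS = {
    "critical": ["crash", "data loss", "security"],
    "major": ["timeout", "wrong result"],
}

def triage(description):
    """Assign a severity based on keywords in the bug description."""
    text = description.lower()
    for severity, keywords in SEVERITY_KEYWORDS.items():
        if any(k in text for k in keywords):
            return severity
    return "minor"

def cluster_by_feature(bugs):
    """Group bug IDs by the feature they were filed against."""
    groups = defaultdict(list)
    for bug in bugs:
        groups[bug["feature"]].append(bug["id"])
    return dict(groups)

bugs = [
    {"id": 1, "feature": "login", "desc": "App crash on empty password"},
    {"id": 2, "feature": "login", "desc": "Typo on sign-in button"},
    {"id": 3, "feature": "sync", "desc": "Timeout syncing large files"},
]
print([triage(b["desc"]) for b in bugs])  # ['critical', 'minor', 'major']
print(cluster_by_feature(bugs))           # {'login': [1, 2], 'sync': [3]}
```

A trained text classifier would replace `triage`, and an unsupervised algorithm such as k-means over text embeddings would replace `cluster_by_feature`, but the pipeline shape stays the same.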


Digital Transformation Requires Redefining Role of Data Governance

Today’s data governance and data management practices must be redefined to support the organization’s business needs and ultimately underpin the organization’s data monetization strategy. A Data Monetization practice must: Evangelize a compelling vision regarding the economic potential of data and analytic assets to power an organization’s digital transformation; Educate senior executives, business stakeholders and strategic customers on how to “Think Like a Data Scientist” in identifying where and how data and analytics can deliver material business value; Apply Design Thinking and Value Engineering concepts in collaborating with business stakeholders to identify, validate, value and prioritize the organization’s high-value use cases that will drive the organization’s data and analytics development roadmap; Champion a Data Science team to “engineer” reusable, continuously-learning and adapting analytic assets that support the organization’s high priority use cases; Develop an analytics culture that synergizes the AI / ML model-to-human collaboration that empowers teams at the point of customer engagement and operational execution.


AI and IoT: Transforming Business

The execution of modern technologies such as artificial intelligence and IoT has been changing the entire business world. A recent survey of almost 500 IT professionals found that AI and IoT are seen as the most significant emerging technologies, remodeling business operations and compelling companies to invest more to gain a competitive advantage. And the reasons are simple. The amalgamation of IoT and AI can create smart strategies that read human preferences and help management make informed decisions with minimal error. Not convinced? Let’s check it out with a real-life example. One of the world’s renowned car manufacturers, BMW, has started using AI and IoT in its manufacturing process. It uses sensor-equipped robots on its premises to help workers produce innovative cars, and it is also leveraging AI for the driverless cars of the future. In fact, AI and IoT are affecting the entire transportation industry. Interactive maps and smart route optimization are making it easy for drivers to reach their destination early, saving fuel costs and reducing journey time. This is why entrepreneurs embrace AI solutions in their taxi app clone development: routes can be planned around peak hours and road construction.


10 Ways Enterprises Can Use the Edge

Edge use cases are expanding across industries as companies move compute and analytics capabilities to the edge. Some companies want to reduce latency. Others want to gain greater insight into what's happening in the field, whether with people, crops, or oil rigs. "Edge computing enables companies and other types of organizations to analyze large amounts of data on site or on devices in real time," said Shamik Mishra, CTO for Connectivity in the Engineering and R&D Business at global consulting firm Capgemini. "This can enable several new opportunities in terms of new sources of revenue, improved productivity, and decreased costs." In fact, there's an entire world of Internet of Things (IoT) innovation happening that makes edge use cases even more compelling, including smart homes, wearables, AR video games and increasingly intelligent vehicles. Gartner expects the IoT platform market to grow to $7.6 billion by 2024, covering both on-premises and cloud deployments; the company considers PaaS a key enabler of digital scenarios. Allied Market Research sees the broader opportunity as worth $16.5 billion by 2025, driven by the desire to avoid network latency problems and restrictions on bandwidth usage for storing data in the cloud.


How to Maintain a Healthy Codebase While Shipping Fast

To determine what proportion of your sprint to allocate to tech debt, simply find the overlap between the parts of your codebase you'll modify with your feature work and the parts of your codebase where your worst tech debt lives. You can then scope out the tech debt work and allocate resources accordingly. Some teams even increase the scope of their feature work to include the relevant tech debt clean-up. More on this in the article 'How to stop wasting time on tech debt.' For this to work, individual contributors need to track medium-sized debt whenever they come across it. It is then the Team Lead's responsibility to prioritize this list of tech debt and to discuss it with the Product Manager prior to sprint planning so that engineering resources can be allocated effectively. Every once in a while, your team will realize that some of the medium-sized debt they came across is actually due to a much larger piece of debt. For example, they may realize that the reason the front-end code is underperforming is that they should be using a different framework for the job. Left unattended, these large pieces of debt can cause huge problems and, like all tech debt, get much worse as time goes by.
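The overlap heuristic in the first sentence can be sketched directly. The file paths, debt scores, and `budget` parameter are all hypothetical:

```python
# Prioritize debt that lives in files the upcoming feature work will
# touch anyway: intersect the sprint's file list with the tracked-debt
# list, then pick the worst offenders that fit the sprint's debt budget.
def debt_to_schedule(feature_files, debt_scores, budget=2):
    """Pick the highest-scoring debt items among files the sprint touches."""
    overlap = set(feature_files) & set(debt_scores)
    ranked = sorted(overlap, key=lambda f: debt_scores[f], reverse=True)
    return ranked[:budget]

feature_files = {"billing/invoice.py", "billing/tax.py", "ui/menu.py"}
debt_scores = {  # from tracked medium-sized debt; higher = worse
    "billing/invoice.py": 8,
    "legacy/report.py": 9,   # bad, but untouched this sprint, so deferred
    "ui/menu.py": 3,
}
print(debt_to_schedule(feature_files, debt_scores))
# ['billing/invoice.py', 'ui/menu.py']
```

Note that the worst debt overall (`legacy/report.py`) is deliberately skipped: it falls outside the sprint's feature work, which is exactly the trade-off the paragraph describes.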



Quote for the day:

"No man is good enough to govern another man without that other's consent." -- Abraham Lincoln

Daily Tech Digest - March 17, 2021

Using Artificial Intelligence to Generate 3D Holograms in Real-Time on a Smartphone

By learning from each image pair, the tensor network tweaked the parameters of its own calculations, successively enhancing its ability to create holograms. The fully optimized network operated orders of magnitude faster than physics-based calculations. That efficiency surprised even the team. “We are amazed at how well it performs,” says Matusik. In mere milliseconds, tensor holography can craft holograms from images with depth information — which is provided by typical computer-generated images and can be calculated from a multicamera setup or LiDAR sensor (both standard on some new smartphones). This advance paves the way for real-time 3D holography. What’s more, the compact tensor network requires less than 1 MB of memory. “It’s negligible, considering the tens and hundreds of gigabytes available on the latest cell phone,” he says. The research “shows that true 3D holographic displays are practical with only moderate computational requirements,” says Joel Kollin, a principal optical architect at Microsoft who was not involved with the research. He adds that “this paper shows marked improvement in image quality over previous work,” which will “add realism and comfort for the viewer.”


Why 'Data Scientist' Will Continue To Be 'the Sexiest Job Of the 21st Century'

While demand for data science talent is through the roof, there are not enough skilled professionals available to take on those roles. One primary reason for this is the lack of clarity on the skills required for different roles within the field of data science. Most companies look for individuals possessing certain specialized skill sets rather than ‘jacks-of-all-trades’. In order to prepare for the best opportunities and avoid being tagged as a ‘generalist’, one needs to first appreciate the nuances that make these roles unique. For instance, how is a data scientist different from a data engineer or a data analyst? Contrary to popular perception, these roles are not interchangeable. A data scientist, for instance, employs advanced techniques such as clustering, neural networks, and decision trees to help derive business insights. Apart from the requisite coding skills, data scientists typically need to be adept at programming languages such as Java, Python, SQL, R, and SAS. In addition, they require working knowledge of big data frameworks such as Hadoop, Spark, and Pig. Data scientists also need to be familiar with technologies such as deep learning and machine learning.


Containers need standard operating environments too

Even in the world of cloud native and containers, a standard operating environment matters. The set of criteria that should be used to evaluate container base images is quite similar to what we’ve always used for Linux distributions. Evaluate things like security, performance, how long the life cycle is (you need a longer life cycle than you think), how large the ecosystem is, and what organization backs the Linux distribution used. (See also: A Comparison of Linux Container Images.) Start with a consistent base image across your environment. It will make your life easier. Standardizing early in the journey lowers the cost of containerizing applications across an organization. Also, don’t forget about the container host. Choose a host and standardize on it. Preferably, choose the host that matches the standard container image. It will be binary compatible, designed and compiled identically. This will lower cognitive load, complexity of configuration management, and interactions between the application administrators and operations teams responsible for managing the fleet of servers underlying your containers.
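In practice, standardizing often means every service starts from the same pinned, organization-approved base image. A hypothetical Dockerfile convention might look like the following (the registry path and tag are invented for illustration):

```dockerfile
# Hypothetical team convention: every service builds FROM the same
# organization-approved base image, pinned to a specific tag so builds
# are reproducible across the fleet.
FROM registry.example.com/base/ubi8-python:1.4.2

# App-specific layers only; the base image owns OS packages, CA certs,
# and security patching on its own life cycle.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY src/ /app/src/
CMD ["python", "/app/src/main.py"]
```

Keeping the `FROM` line uniform across services is what makes the container host choice, vulnerability scanning, and configuration management tractable at fleet scale.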


Open Source Blockchain Microservices To Help You Build Your Own Blockchain

Like any microservice, the block store and p2p are simple (and easy to understand) programs. What makes these microservices special is that they are the first blockchain-specific microservices ever to be open-sourced. As you can see, we love open-source software development and decentralization, but we’ve been saddened to see how many projects that claim to share these values pursue subtle (and not-so-subtle) ways of ensuring they maintain control while developing most of their software behind closed doors. As one of the most experienced dApp and blockchain development teams in the world, we understand better than most how difficult it can be to develop in the open, especially when you don’t have the right tools. So we’ve been delighted to find that by launching and designing Koinos the right way, we’ve found ourselves in a position where it makes perfect sense to continue developing it in the right way too: out in the open. The modularity of Koinos means that the more chefs are in the kitchen, the better. We want developers to begin digging into the code as soon as possible and helping to make it into the protocol they truly need, instead of the protocol we believe they need.


Legacy tech integration issues impede telcos’ digitisation ambitions

While only a third of operators worldwide said finding the right talent was hindering their plans for digital development, the majority of such businesses were located in emerging markets such as Latin America, Africa and the Middle East. This, according to the survey, highlights a pronounced skills gap between developed and emerging markets, with the latter still struggling to find the skills needed to facilitate digitisation. Just under half (46%) stated that cost was the biggest obstacle to realising their transformation ambitions, suggesting that the path to digitisation is seen as desirable and the investment is ready. A surprisingly low number of telcos viewed return on investment (ROI) as a barrier, with only two-fifths of respondents expressing concern that a return might not be easily established. Given that the telecommunications industry is traditionally very ROI-focused, the report authors said this suggests a great deal of confidence in the path towards digitisation if the aforementioned barriers can be overcome. In conclusion, the report said the findings imply that a phased approach towards digitisation is in the best interests of telcos worldwide, to ensure interoperability between technology and services and maintain what was described as a “seamless” customer experience.


The dangers of misusing instant messaging and business collaboration tools

The research shows this challenge is compounded by the amount of time employees spend using messaging and collaboration apps: time spent on tools such as Zoom and Teams has increased by 13% in the US since the start of the pandemic. This means employees are spending, on average, two and a half hours every day on these applications, with 27% of US employees spending more than half the working week on these tools. A significant amount of business is now routinely conducted on these channels, and employees are treating agreements made there as binding. For example, as a result of receiving information over messaging and collaboration tools, almost 24% of US employees have accepted and processed an order, 25% have accepted a reference for a job candidate, and 20% have accepted a signed version of a contract. Sensitive data is being shared on these tools even though 39% of US employees have been reprimanded by bosses for doing so. These admonishments may have been in vain, however, as 75% of all US workers say they would continue to share this type of information in the future.


The Rise of the Chief Data Scientist

Right now, organizations are investing heavily in the chief data scientist role. This individual manages a range of data-driven functions, including overseeing data management, creating data strategy, and improving data quality. They also help their organizations extract the most valuable and relevant insights from their data, leveraging data analytics and business intelligence (BI). In this capacity, the chief data scientist has a far deeper understanding of how AI and machine learning (ML) can improve data management than the CTO, who has a broader knowledge base but less depth of expertise. This is critical, as ML has emerged as a key driver in improving data quality and access, and navigating the journey from big data ideas to real-world machine learning implementation is a challenging endeavor. In this scenario, the chief data scientist serves as a trusted navigator, understanding that data is the fuel for key initiatives and knowing the non-deterministic risks of developing those capabilities. Moreover, this individual can manage the expectations of C-suite executives, helping them better understand the reality of what ML can accomplish while mitigating the risks associated with data-driven initiatives.


Enterprises Wrestle With Executive Social Media Risk Management

Companies know that their executives are targets. In our digital risk survey, we found that 25% of enterprises cite executives' personal social media as a major risk factor to the company's overall security. And they know that the consequences of an executive cyberattack would be severe. In our poll, 70% of respondents said their company would suffer brand or reputational damage. Half of the respondents predicted potential risk to shareholder value. One in three enterprises is most fearful of impersonation or fake accounts. One in four is most worried about the possibility of an account takeover. However, despite awareness of the threats, the sophistication of executive social media risk management is lagging. ... The new generation of cloud channels is very different. Tools like Twitter and LinkedIn live across multiple devices. They cross between professional and personal spheres. They generate interactions at unprecedented volume and velocity — and out of the box, security teams have no visibility. Today, all executives leverage social media, and they are bombarded by social media cybersecurity threats. Security teams know that banning these tools isn't an option. Why? Because people will use them anyway.


Ways to Break Gender Gridlock in Cybersecurity Careers

In many ways, cybersecurity roles should be fair game for hiring and promotion, because what matters is the quality of the code, not the gender of the person writing it. But that is not always the case in practice. “Behind the screen, in theory, everyone is equal,” says Guerrieri. “Clearly that is not what is happening.” Guerrieri would like to see more networking among women in cybersecurity to facilitate the creation of support systems that encourage them to remain and thrive in this career path. Some women have seen opportunities in cybersecurity emerge in response to the pandemic, says Sabrina Castiglione, CFO at Tessian, an enterprise email security software provider. Her company recently conducted a survey that included responses from 200 female cybersecurity professionals, 100 in Britain and 100 in the United States. Castiglione says some of the responses showed an increased sense of job security among women in cybersecurity as the world coped with the COVID-19 pandemic. “In cybersecurity, women are saying they feel more secure, or that with the impact of the pandemic, their job security has actually increased,” she says. Of the women who responded to the survey, 49% felt more secure in their jobs, Castiglione says.


Cloud-Native Is about Culture, Not Containers

“What are we actually trying to achieve?” is an incredibly important question. When we're thinking about technology choices and technology styles, we want to be stepping back just from “I'm doing Cloud-native because that's what everybody else is doing” to thinking “what problem am I actually trying to solve?” To be fair to the CNCF, they had this “why” right on the front of their definition of Cloud-native. They said, "Cloud-native is about using microservices to build great products faster." We're not just using microservices because we want to; we're using microservices because they help us build great products faster. ... Cost savings, elasticity, and delivery speed are great, but we get all of that just by being on the Cloud. Why do we need Cloud-native? The reason we need Cloud-native is that a lot of companies found they tried to go to the Cloud and they got electrocuted. It turns out things need to be written differently and managed differently on the cloud. Articulating these differences led to the 12 factors. The 12 factors were a set of mandates for how you should write your Cloud application so that you didn't get electrocuted.



Quote for the day:

"Leadership involves finding a parade and getting in front of it." -- John Naisbitt

Daily Tech Digest - March 16, 2021

Lockdown one year on: what did we learn about remote working?

Securing millions of newly remote workers almost overnight was a huge undertaking. Against the need to keep businesses and essential services running (including public sector bodies like councils), security may not have been the primary consideration. Most organisations have now spent time going back to “plug the gaps”, but there’s no doubt that a proliferation of devices and the increased use of cloud services has left companies more vulnerable. McAfee found a 630% increase in attacks on cloud infrastructure since the start of the pandemic, and in just one month between March and April 2020, IBM recorded a 6,000% increase in phishing attempts. As well as ensuring remote/flexible working policies are up to date, there are a host of tactics companies can employ to address security. These include mobile device management and endpoint security, strict patch management, and complete backup of the Microsoft 365 environment; many assume Microsoft does this automatically, but it does not, and that gap can result in catastrophic data loss. Another security approach is to focus on identity and access management (IAM) to enable single sign-on and smart identity management.


How Financial Institutions Can Deal with Unstructured Data Overload

Emerging big data analytics solutions that leverage machine learning (ML) can parse through data to identify important information. These tools allow financial institutions, particularly investment management firms, to uncover the crucial business insights that lie within unstructured data, giving them an immediate competitive advantage over peers that are not leveraging AI in this way. These analytics tools can surface new market insights, giving teams at investment management firms a deeper understanding of businesses and industries so they can make better investment and trading decisions. For example, even after an investment management firm has narrowed down the number of news articles to review, there may still be thousands of texts to read through over the course of a month. Adding an ML solution here would help the portfolio manager identify which stories are most relevant based on the language and nuanced phrasing within the text. It would give each article a relevance score, saving the PM the countless hours they would otherwise have spent reading through the articles.
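As a rough illustration of relevance scoring (a generic sketch, not the specific products described above), a few lines of Python can rank articles against a topic query using term-frequency cosine similarity; the article texts and topic here are invented:

```python
import math
from collections import Counter

def tf_vector(text):
    """Term-frequency vector over lowercase word tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_articles(articles, topic):
    """Return (score, title) pairs, most relevant first."""
    query = tf_vector(topic)
    scored = [(cosine(tf_vector(body), query), title)
              for title, body in articles.items()]
    return sorted(scored, reverse=True)

articles = {
    "Rates": "central bank raises interest rates amid inflation pressure",
    "Sports": "local team wins the championship final on penalties",
}
ranking = rank_articles(articles, "interest rates and inflation outlook")
print(ranking[0][1])  # Rates
```

Production systems would use TF-IDF weighting, embeddings, or a trained classifier rather than raw term counts, but the shape of the task (score each document against what the reader cares about, sort, read the top of the list) is the same.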


Proving who you are online is still a mess. And it's not getting better

For the past two decades, the UK government has looked at ways to enable people to easily and reliably identify themselves, with little success. Unlike in other countries, a national ID card to carry around in your pocket now seems to be firmly off the table; but instead, the concept of creating a "digital identity" is gathering pace. Rather than digging through piles of archived paper-based documents, a digital identity would let people instantly prove certified information about themselves, flashing their credentials, for instance, through an app on their phone. Although the concept is not new, the idea is gaining renewed attention. The Department for Digital, Culture, Media and Sports (DCMS), in fact, recently unveiled plans to create what it called a digital identity "trust framework". The idea? To lay down the ground rules surrounding the development of new technologies that will allow people to prove something about themselves digitally. This could take the form of a digital "wallet", which individuals could keep on their devices and fill with any pieces of information, or attributes, about themselves that they deem useful. The wallet could include basic information like name, address or age, but also data from other sources, at the user's own convenience.


UK Set to Boost Cybersecurity Operations

Johnson has said in Parliament that the creation of the NCF is designed to strengthen Britain's cybersecurity posture and give the country new defensive and offensive capabilities. "Our enemies are also operating in increasingly sophisticated ways, including in cyberspace," Johnson says. "Rather than being confined to some distant battlefield, those that seek to do harm to our people can reach them through the mobile phones in their pockets or the computers in their homes. To protect our citizens, U.K. defense therefore needs to operate at all times with leading, cutting-edge technology." Currently, the NCF carries out operations such as interfering with a mobile phone to prevent a terrorist from communicating with their contacts; helping to prevent cyberspace from being used as a global platform for serious crimes, including the sexual abuse of children; and keeping U.K. military aircraft safe from targeting by weapons systems. In addition to the NCF, last year the Ministry of Defense created the 13th Signals Regiment, the U.K.'s first dedicated cyber regiment, and expanded the Defence Cyber School. While he acknowledged the benefits of a more cyber-capable military, Cracknell pointed out that, "We don’t have a solid security foundation, and until all businesses and CNI entities are at that level, we are wasting resources by going on the offensive."


DDoS's Evolution Doesn't Require a Security Evolution

The idea of monetizing DDoS attacks dates back to the 1990s. But the rise of DDoS-for-hire services and cryptocurrencies has radically changed things. "It's never been easier for non-specialists to become DDoS extortionists," Dobbins explains. This has led to a sharp uptick in well-organized, prolific, and high-profile DDoS extortion campaigns. Today, cybercrime groups deliver ransom demands in emails that threaten targets with DDoS attacks. Most of these are large attacks above 500 gigabits per second, and a few top out at 2 terabits per second. Ransom demands may hit 20 Bitcoin (approximately $1 million). Attacks that revolve around ideological conflicts, geopolitical disputes, personal revenge, and other factors haven't disappeared. But the focus on monetization has led attackers to increasingly target Internet service providers, software-as-a-service firms, and hosting/virtual private server/infrastructure providers, including wireless and broadband companies. "We've seen the DDoS attacker base both broaden and shift toward an even younger demographic," Dobbins says. According to Neustar's Morales, reflection and amplification attacks continue to be the most prominent because of their inherent anonymity and ability to reach very high bandwidth without requiring a lot of attacking hosts.


Securing a hybrid workforce with log management

When companies shifted to a remote workforce in response to the COVID-19 pandemic, cybercriminals continued to launch attacks. Rather than targeting centrally managed corporate networks, however, they looked to exploit organizations whose workforce members did their jobs on home networks and devices. Because home networks often lack the robust security controls of the enterprise, they become attractive gateways for malicious actors. During the COVID-19 lockdowns, cybercriminals increasingly leveraged the Windows Remote Desktop Protocol (RDP) as an attack vector. RDP allows users to connect remotely to servers and workstations via port 3389, but misconfigured remote access often creates a security risk, and RDP attack attempts rose massively in 2020. Windows computers with unpatched RDP can be used by malicious actors to move within the network and deposit malicious code (e.g., ransomware). Malware infections are a common occurrence when users work outside the corporate network. Since IT departments cannot push software updates through to the devices, security teams need to monitor for potential malware infections. Used correctly, event logs can detect potentially malicious activity.
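As a minimal sketch of what using event logs to catch this kind of activity might look like: count failed-logon events (Windows security event ID 4625) per source IP and flag sources that exceed a brute-force threshold. The two-field log format here is an assumption for illustration, not any real product's export format:

```python
from collections import Counter

FAILED_LOGON = "4625"  # Windows security event ID for a failed logon

def flag_brute_force(log_lines, threshold=5):
    """Count failed-logon events per source IP; flag noisy sources.

    Each log line is assumed to be "<event_id> <source_ip>".
    """
    failures = Counter()
    for line in log_lines:
        event_id, src_ip = line.split()[:2]
        if event_id == FAILED_LOGON:
            failures[src_ip] += 1
    return {ip: n for ip, n in failures.items() if n >= threshold}

# Six failures from one IP, one success (4624) from another.
logs = ["4625 203.0.113.9"] * 6 + ["4624 198.51.100.4"]
print(flag_brute_force(logs))  # {'203.0.113.9': 6}
```

A real deployment would consume exported event logs or a SIEM feed, correlate across hosts, and add time windows, but the core idea (aggregate, threshold, alert) is the same.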


Cryptophone Service Crackdown: Feds Indict Sky Global CEO

Sky Global's CEO has disputed those allegations and said he has received no direct notice of any charges being filed against him or any extradition request. "Sky Global’s technology works for the good of all. It was not created to prevent the police from monitoring criminal organizations; it exists to prevent anyone from monitoring and spying on the global community," Eap says in a statement released Sunday and posted to the company's website. ... "The unfounded allegations of involvement in criminal activity by me and our company are entirely false. I do not condone illegal activity in any way, shape or form, and nor does our company." Eap has also disputed claims by police that they cracked Sky Global's encryption. Previously, Sky Global had offered a $5 million reward to anyone able to demonstrate that they had cracked the encryption. Following a two-year investigation into Sky Global and its customers, last week, police in Belgium, France and the Netherlands launched numerous house searches, leading to hundreds of arrests of alleged users - including three attorneys in Antwerp, Belgium - as well as the seizure of thousands of kilograms of cocaine and methamphetamine, hundreds of firearms, millions of euros in cash as well as diamonds, jewelry, luxury vehicles and police uniforms, officials say.


Optimize your CloudOps: 8 tricks CSPs don't want you to know

Leveraging security managers that span all your traditional systems and public clouds is three times more effective than following a cloud-native approach. Similar to tip No. 1 above, cloud-native security systems operate best on their native cloud. Eventually you'll have silos of security systems, each solving tactical security problems for their native clouds. What you need is an overarching security ops platform that can manage security from cloud to cloud as well as for traditional systems, and perhaps with emerging technologies such as edge computing. Again, this is about finding something "cross-cloud" that exists today, and to do that you'll have to look for third-party providers. If you don't choose cross-cloud security now, the move from cloud-native to cross-cloud security will happen when your security silos become too complex to maintain and the first breach occurs. At that point, the transformation from cloud-native to cross-cloud security is difficult and costly. While this trick causes some debate from time to time, most experts agree: Abstracting public clouds for performance monitoring is a much better approach than just monitoring a single cloud using its cloud-native system.


AI One Year Later: How the Pandemic Impacted the Future of Technology

Those changing consumer behaviors created an abrupt reality for data science teams: predictive AI and machine learning (ML) models and the data they are derived from were almost instantly outdated, and in many cases reduced to irrelevance. In the past, these models were based on historical data from several years of behavioral patterns. But in a world of tightened spending, limited purchasing options, changing demand patterns, and restricted engagement with customers, that historical data no longer applied. To combat this problem -- at a time when companies could not afford inaccurate predictions or lost revenue -- AI teams turned to such solutions as real-time, ever-changing forecasting. By constantly updating and tuning their predictive models to include incoming data from the new pandemic-driven patterns, organizations were able to reduce data drift and more effectively chart their paths through the crisis and recovery period. With their hand forced, companies needed to make difficult choices during the spring of 2020. Do they put their projects and initiatives on pause and wait for the pandemic to subside, or push forward in applying AI as a competitive differentiator during these challenging times?
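The idea of constantly refitting on incoming data can be illustrated with a deliberately naive Python sketch: a forecaster that trains only on a short rolling window, so stale pre-pandemic history stops dominating predictions. The window size and the mean-based "model" are illustrative stand-ins for real retraining pipelines:

```python
from collections import deque

class RollingForecaster:
    """Forecast the next value from only the most recent observations."""

    def __init__(self, window=3):
        # deque with maxlen silently discards the oldest observation,
        # which is exactly the "forget stale history" behavior we want.
        self.history = deque(maxlen=window)

    def update(self, observation):
        self.history.append(observation)

    def predict(self):
        """Naive forecast: the mean of the current window."""
        return sum(self.history) / len(self.history)

model = RollingForecaster(window=3)
for demand in [100, 100, 100, 20, 10, 15]:  # demand collapses mid-stream
    model.update(demand)
print(model.predict())  # 15.0 -- tracks the new regime, not old levels
```

A model fit on the full series would still predict near-100 demand; the rolling version adapts within a few observations, which is the trade-off (responsiveness versus stability) that drift-aware retraining schedules try to balance.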


What is Agile leadership? How this flexible management style is changing how teams work

As Agile development took hold in IT departments, so tech chiefs started thinking about how the approach could be used not just to create software products, but to lead teams and projects more generally. As this happened, CIOs started talking about the importance of Agile leadership. Over the past decade, the use of Agile as a technique for leading and completing projects has moved beyond the IT department and across all lines of business. The increased level of collaboration between tech organisations and other functions, particularly marketing and digital, has helped to feed the spread of Agile management. ... Although Agile leadership leans heavily on the principles and techniques of Agile software development, such as iteration, standups and retrospectives, it's probably fair to say that it's a management style that involves a general stance rather than a hard-and-fast set of rules. Mark Evans, managing director of marketing and digital at Direct Line, says the key to effective Agile management is what's known as servant leadership, a leadership philosophy in which the main goal of the leader is to serve.



Quote for the day:

"Integrity is the soul of leadership! Trust is the engine of leadership!" -- Amine A. Ayad