Daily Tech Digest - January 25, 2021

DDoS Attackers Revive Old Campaigns to Extort Ransom

Radware's researchers say the tactics recently observed in the attacks launched by this particular group indicate a fundamental change in how it operates. Previously, the operators would target a company or industry for a few weeks and then move on. "The 2020-2021 global ransom DDoS campaign represents a strategic shift from these tactics. DDoS extortion has now become an integral part of the threat landscape for organizations across nearly every industry since the middle of 2020," the report states. The other major change spotted is that this threat group is no longer shy about returning to targets that initially ignored its attack or threat, with Radware saying companies that were targeted last year could expect another letter and attack in the coming months. "We asked for 10 bitcoin to be paid at (bitcoin address) to avoid getting your whole network DDoSed. It's a long time overdue and we did not receive payment. Why? What is wrong? Do you think you can mitigate our attacks? Do you think that it was a prank or that we will just give up? In any case, you are wrong," the second letter says, according to Radware. "The perseverance, size and duration of the attack makes us believe that this group has either been successful in receiving payments or they have extensive financial resources to continue their attacks," the report states.


Five Reasons You Shouldn't Reproduce Issues in Remote Environments

When attempting to reproduce an issue across multiple environments, one area that teams must have solid processes around is test data management. Test data can be critical in the reproduction of bugs: if you don’t have the right test data in your environment, the bug may not be reproducible. Due to the sheer size of production data sets, teams must often work with subsets of that data across test environments. The holy grail of test data management is to allow teams to easily and quickly subset production data based on the data needed to reproduce an issue. In practice, things don’t always work out so easily. It’s hard to know which attributes of your test data may be influencing a specific bug. In addition, data security when dealing with PII can be a major challenge when subsets of data are used across environments. Teams need to ensure that they comply with corporate data privacy standards by masking the data or generating new, relevant data sets. It often takes a lot of logging and hands-on investigation to uncover how data discrepancies cause those hard-to-find bugs. If teams cannot easily manage and set up test data on demand, they will suffer the consequences when trying to reproduce bugs in remote environments.
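
For illustration, here is a minimal sketch of deterministic masking in Python. The field names and the salt are hypothetical; real pipelines typically use dedicated masking and subsetting tools.

    import hashlib
    import random

    SALT = "rotate-me-per-environment"  # hypothetical; manage real secrets properly

    def mask_email(email: str) -> str:
        # Deterministic hashing hides the real address but keeps joins repeatable
        digest = hashlib.sha256((SALT + email).encode()).hexdigest()[:12]
        return "user_" + digest + "@example.com"

    def mask_row(row: dict) -> dict:
        masked = dict(row)
        masked["email"] = mask_email(row["email"])
        masked["ssn"] = "***-**-%04d" % random.randint(0, 9999)  # non-reversible filler
        return masked

    print(mask_row({"id": 42, "email": "jane.doe@corp.com", "ssn": "123-45-6789"}))

Deterministic masking matters because referential integrity across tables must survive the subset, or the very data attributes that trigger the bug may disappear along with the real values.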


AI ethics: Learn the basics in this free online course

If you are interested, an excellent place to start might be the free online course The Ethics of AI, offered by the University of Helsinki in partnership with "public sector authorities" in Finland, the Netherlands, and the UK. Anna-Mari Rusanen, a university lecturer in cognitive science at the University of Helsinki and course coordinator, explains why the group developed the course: "In recent years, algorithms have profoundly impacted societies, businesses, and us as individuals. This raises ethical and legal concerns. Although there is a consensus on the importance of ethical evaluation, it is often the case that people do not know what the ethical aspects are, or what questions to ask." Rusanen continues, "These questions include how our data is used, who is responsible for decisions made by computers, and whether, say, facial recognition systems are used in a way that acknowledges human rights. In a broader sense, it's also about how we wish to utilize advancing technical solutions." The course, according to Rusanen, provides basic concepts and cognitive tools for people interested in learning more about the societal and ethical aspects of AI. "Given the interdisciplinary background of the team, we were able to handle many of the topics in a multidisciplinary way," explains Rusanen.


Zero trust: A solution to many cybersecurity problems

CISOs of organizations that have been hit by the attackers are now mulling over how to make sure they’ve eradicated the attackers’ presence from their networks, and those with very little risk tolerance may decide to “burn down” their network and rebuild it. Whichever decision they end up making, Touhill believes that implementing a zero trust security model across the enterprise is essential to better protect their data, their reputation, and their mission against all types of attackers. And, though a good start, this should be followed by the implementation of the best modern security technologies, such as software-defined perimeter (SDP), single packet authorization (SPA), microsegmentation, DMARC (for email), identity and access management (IDAM), and others. SDP, for example, is an effective and efficient technology for secure remote access, which became one of the top challenges organizations faced due to the COVID-19 pandemic and the massive pivot from the traditional office environment to a work-from-anywhere environment. Virtual private network (VPN) technology, which was the initial go-to tech for secure remote access for many organizations, is over twenty years old and, from a security standpoint, very brittle, he says.
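
As a rough illustration of the SPA idea only: the client sends one cryptographically signed datagram, and the gateway acknowledges nothing until the packet verifies. The packet format and pre-shared key below are invented; real SDP implementations such as fwknop use hardened, standardized formats.

    import hashlib
    import hmac
    import json
    import time

    SHARED_KEY = b"hypothetical-preshared-key"  # illustrative only

    def build_spa_packet(user: str) -> bytes:
        # Client side: one signed payload; the gateway opens nothing until it verifies
        payload = json.dumps({"user": user, "ts": int(time.time())}).encode()
        tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest().encode()
        return payload + b"|" + tag

    def verify_spa_packet(packet: bytes, max_skew: int = 30) -> bool:
        payload, _, tag = packet.rpartition(b"|")
        expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest().encode()
        if not hmac.compare_digest(expected, tag):
            return False
        # Reject stale packets to limit replay; real systems also track nonces
        return abs(time.time() - json.loads(payload)["ts"]) <= max_skew

    packet = build_spa_packet("alice")   # sent as a single UDP datagram in practice
    print(verify_spa_packet(packet))     # True within the replay window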


Comparing Different AI Approaches to Email Security

Supervised machine learning involves harnessing an extremely large data set with thousands or millions of emails. Once these emails have come through, an AI is trained to look for common patterns in malicious emails. The system then updates its models, rule sets, and blacklists based on that data. This method certainly represents an improvement over traditional rules and signatures, but it does not escape the fact that it is still reactive and unable to stop new attack infrastructure or new types of email attacks. It simply automates that flawed, traditional approach – only now, instead of a human updating the rules and signatures, a machine does it. Relying on this approach alone has one basic but critical flaw: it does not enable you to stop new types of attacks the system has never seen before. It accepts that there has to be a "patient zero" – or first victim – before the attack can be stopped. The industry is beginning to acknowledge the challenges with this approach, and huge amounts of resources – both automated systems and security researchers – are being thrown into minimizing its limitations. This includes leveraging a technique called "data augmentation," which involves taking a malicious email that slipped through and generating many "training samples" using open source text augmentation libraries to create "similar" emails.
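
A toy sketch of that augmentation step is below. The synonym table is invented for illustration; production systems use the open source text augmentation libraries the article mentions.

    import random

    SYNONYMS = {  # invented, minimal substitution table
        "urgent": ["immediate", "critical"],
        "invoice": ["bill", "statement"],
        "verify": ["confirm", "validate"],
    }

    def augment(text, n_samples=3):
        # Generate "similar" emails by randomly swapping known words for synonyms
        samples = []
        for _ in range(n_samples):
            words = [random.choice(SYNONYMS.get(w.lower(), [w])) for w in text.split()]
            samples.append(" ".join(words))
        return samples

    for sample in augment("urgent please verify the attached invoice"):
        print(sample)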


Why Is Agile Methods Literacy Key To AI Competency Enablement?

First, quality AI is a highly iterative experimentation, design, build, and review process. Organizations that aspire to build strong AI and data science competency centers will flounder if their core cultures are not building agile skills into all operating functions, from top to bottom. Given the incredible speed and uncertainty of everything becoming more digital and smarter, the need for all talent to continually adapt, reflect, and make decisions based on new information is a business imperative. Leaders do not have the luxury of procrastinating long before acting on new insights and making decisions with confidence. Sometimes, cultures can build a capacity for inaction rather than action-oriented behavior. Agile leadership demands rapid precision and the involvement of diverse stakeholders, which in turn yields more positive change dynamics (momentum); more importantly, innovation capacity grows as a result of this energy. In a recent Harvard article, the authors pointed out that, “If people lack the right mindset to change and the current organizational practices are flawed, digital transformation will simply magnify those flaws.” Truly agile organizations are able to capitalize on new information and make the next move because they have what we call the capacity to act.


10 ways to prep for (and ace) a security job interview

Hiring managers typically look for strong technical skills and specific cybersecurity experience in the candidates they want to interview, particularly for candidates filling entry- and mid-level positions within enterprise security. But managers use interviews to determine how well candidates can apply those skills and, more specifically, whether candidates can apply those skills to support the broader objectives of the organization, says Sounil Yu, CISO-in-residence at YL Ventures. As such, Yu says he and others look for “T-shaped individuals”—those with deep expertise in one area but with general knowledge across the broader areas of business. The candidates who get job offers are those who have, and demonstrate, both. “Security is a multidisciplinary problem, so that depth is an important asset,” Yu adds. Candidates love to say they’re passionate about security, but many can’t figure out how to showcase it. Those who can, however, stand out. Yu once interviewed a candidate via video and could see a server rack in the background of this person’s home office. “He clearly liked tinkering outside of work. You could see that he had tech skills and a passion for them and a drive to learn about new technologies,” Yu says.


The changing role of IT & security – navigating remote work cybersecurity threats

The move to remote working and the complication of multiple devices and locations is also raising important questions about software licensing. Are you licensed for the apps that people are using at home, or are you licensed on their computer in the office and on their computer at home? Several businesses are now having to buy thousands of additional software licenses so that employees can work on more than one computer, at a time when cost optimisation is extremely important. A related threat to businesses is running afoul of regulatory data privacy protections such as GDPR and CCPA. Given the current state of things, it is unlikely that a regulator would currently be hunting for companies that might be improperly managing employee and customer data. It appears regulators are largely being more lenient at this stage while companies are busy just trying to survive. Whilst it is reasonable to expect that this leniency will continue for a time, there will come a point when enforcement returns, and in the meantime there is no guarantee that regulators will not review issues that come up as a result of a data breach or loss. It’s always important to reinforce best security practices with your workforce, but it is especially important when your employees are out of their normal routines.


Weighing Doubts of Transformation in the Face of the Future

You don’t have to [change], but you will be left behind. Seventy-four percent of CEOs believe that their talent force and organization need to be a digitally transformed organization, yet they feel like only 17% of their talent is capable and ready to do that. That gap is glaring. That’s coming from the tops of organizations and businesses. The first mover advantage has kind of passed already. Now we’re getting into the phase of cloud migration and the concept of everything-as-a-service. Digital transformation is easier to attain. You don’t have to be the first mover or early adopter. The companies that help you live, work, and play inside your home were pretty resilient during the COVID-19 pandemic. Tech, media, and fitness companies like NordicTrack and Peloton that helped you stay inside your house, they were the ones that needed to transform digitally immediately to deal with the significant increase in demand along with significant supply chain challenges. Now we are seeing other industries that saw a bit of a pause during COVID -- consumer, travel, entertainment, energy -- those businesses are seeing or expecting this uptick in the summer travel period, the pent-up demand of Americans. Interest rates are very low, and they haven’t been able to spend [as much] money for the last 12 to 18 months by the time the summer comes around.


Good News: Cryptocurrency-Enabled Crime Took a Dive in 2020

While the total cryptocurrency funds received by illicit entities declined in 2020, Chainalysis reports, criminals continue to love cryptocurrency - with bitcoin still dominating - because using pseudonymizing digital currencies gives them a way to easily receive funds from victims. Cryptocurrency also supports darknet market transactions, with many markets offering escrow services to help protect buyers and sellers against fraud. Using cryptocurrency, criminals can access a variety of products and services, such as copies of malware or hacking tools; complete sets of credit card details, known as "fullz"; and tumbling or mixing services, in which a third-party service or technology launders bitcoins by routing them among numerous addresses. Criminals have also been using a legitimate concept called "coinjoin," which is sometimes built into cryptocurrency wallets as a feature. It allows users to mix virtual coins together while paying for separate transactions, which can complicate attempts to trace any individual transaction. Intelligence and law enforcement agencies have some closely held ability to correlate the cashing out of cryptocurrency with deposits made into individuals' bank accounts.



Quote for the day:

"To have long term success as a coach or in any position of leadership, you have to be obsessed in some way." -- Pat Riley

Daily Tech Digest - January 24, 2021

The work-from-home employee’s bill of rights

Keeping business and personal data separate is straightforward for most cloud services, so legitimate security concerns can be addressed in such hybrid environments. Only in areas where IT cannot reasonably ensure security may businesses disallow specific optional technologies or hybrid usage. (Employees should be made aware that in such mixed-usage cases, should there ever be a legal proceeding, their personal devices used for work could be subject to discovery and thus be taken during the course of an investigation.) IT also must allow the use of personal services in such mixed-usage environments, such as allowing users to use personal Slack, Zoom, or Skype accounts for personal communications rather than blocking such software to force the use of a corporate standard. Instead, the use of corporate-standard technology for business purposes would be enforced by managers, not by IT through technology barriers. The basic principle should be that employees can bring their own technology into the mix unless it creates a clear security issue — and not a theoretical one, since IT too often cites security as an easy reason to say no to employee requests without any real evidence of risk.


Artificial Intelligence Collaboration in Asia’s Security Landscape

Though the field of AI – a catchall term for a set of technologies that enable machines to perform tasks that require human-like capabilities – has been around for decades, interest in it has surged over the past few years, including across the Asia-Pacific, with individual countries beginning to develop their own national approaches and multilateral groupings such as the OECD formulating guidance, including principles on AI. In the security realm more specifically, AI is emerging as a key topic for defense policymakers and communities alike in a range of areas, from assessments of its impact on geopolitical competition to areas of potential collaboration between Indo-Pacific partners and their expert communities. It has also been a topic of discussion among scholars and policymakers at annual Asian security fora such as the Shangri-La Dialogue and the Xiangshan Forum. Seen from this perspective, Mohamad’s highlighting of AI as an area of focus for Asian defense establishments was very much in keeping with these trends. As he noted in his keynote address, AI represents an emerging domain where armed forces and defense establishments can play a key role in efforts to “strengthen the international order and enhance practical cooperation” by promoting responsible state behavior, building confidence, and fostering international stability.


Is neuroscience the key to protecting AI from adversarial attacks?

For the new research, Cox and DiCarlo joined Joel Dapello and Tiago Marques, the lead authors of the paper, to see if neural networks became more robust to adversarial attacks when their activations were similar to brain activity. The AI researchers tested several popular CNN architectures trained on the ImageNet dataset, including AlexNet, VGG, and different variations of ResNet. They also included some deep learning models that had undergone “adversarial training,” a process in which a neural network is trained on adversarial examples to avoid misclassifying them. The scientists evaluated the AI models using the BrainScore metric, which compares activations in deep neural networks and neural responses in the brain. They then measured the robustness of each model by testing it against white-box adversarial attacks, where an attacker has full knowledge of the structure and parameters of the target neural networks. “To our surprise, the more brainlike a model was, the more robust the system was against adversarial attacks,” Cox says. “Inspired by this, we asked if it was possible to improve robustness (including adversarial robustness) by adding a more faithful simulation of the early visual cortex — based on neuroscience experiments — to the input stage of the network.”
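
For context, the fast gradient sign method (FGSM) is one of the simplest white-box attacks; a minimal PyTorch sketch follows. This is a generic illustration, not the exact attack suite used in the paper.

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, y, epsilon=0.03):
        # White-box attacker: full access to gradients of the target network
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        x_adv = x + epsilon * x.grad.sign()  # one step in the loss-increasing direction
        return x_adv.clamp(0, 1).detach()    # keep pixels in the valid range

    # Usage with any ImageNet classifier (e.g., a torchvision ResNet):
    #   x_adv = fgsm_attack(model, images, labels)
    #   robust_acc = (model(x_adv).argmax(1) == labels).float().mean()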


Speed Limits in Software Development

For software development there aren’t road signs telling us a safe speed to deploy at, but perhaps we can extend the driving metaphor a bit to help us think this through. One thing that relates to safe speed is responsiveness. A slick road makes it harder for your car to respond to changes in direction, and slow deployment makes it hard to respond to problems with your application. How easy is it to respond to issues in your application? Don’t forget that an F1 race car with new tires and perfect tuning can respond a lot better than the little commuter car you might have. We can tune our code and deployments and get better at responsiveness over time. If the road is foggy and you can’t see where you are going when you drive, I hope you slow down. If you can’t see what is going on in your application and understand how it is being used, I hope you slow down. ... So how fast can we go in software development? Well, in the ideal case, if we know everything and have a smooth path ahead of us, pretty fast. I don’t think we can set a land speed record, since software development doesn’t often involve going in a straight line, but with a bit of work on the code and deployment process, and with investment in observability and operations, I think we can go pretty fast, pretty safely. Just be careful.


Why the brain will always win in the battle against AI

What we call ‘intelligence’ is an activity of the brain. The outcome of that activity forms our ‘mind’ about things. Even when we sleep, our intelligence is awake and our mind is being formed. In this context, we must pay attention to the concept of duality as the first level of multivariate analysis. A hallmark of intelligence is the willingness to change one's mind. Humans can think in terms of ranges, options and spectral possibilities. Machines are only about specificity and exactness. Computing doesn’t entertain opinion. Yet, calculation is merely one aspect of our mental ability. It has been exaggerated in our education system. This kind of logic-based intelligence is quite self-conscious. We are assessed for deductive ability. We are tutored to think and know but not trained to ‘think about thinking’ or ‘know about not knowing’. We are barely taught any self-awareness. Emotional Intelligence is neglected. We are coached in analytical hindsight and acquire a punter’s foresight based on the computation of odds. No one educates us on esteem, gratification, empathy, or seduction. We learn these things by ourselves. The irony is that machines have beaten us on all those aspects that we acquire via structured learning and tutoring. It is in the emotional, subjective and artistic areas that mankind holds the advantage.


Data bill: The security vs privacy debate

Encryption is widely acknowledged as the strongest feature of data protection. Digital banking and financial transactions have increased manifold with the Reserve Bank of India prescribing the encryption standards. The telecom sector, however, is limping along on 40-bit key encryption, which is considered to be low. Both cellular voice and messaging are vulnerable to off-air interceptions, with experts pointing at the weakness of SMS being used as second factor authentication in banking, payments and Aadhaar identification. The Telecom Regulatory Authority of India has rightly recommended an update of regulation policy and is of the view that encryption is a reliable tool which should not be interfered with. The end-to-end encryption on chat platforms is the most secure method of keeping data safe from hackers and break-ins. The General Data Protection Regulation of the European Union strongly favours use of encryption for protecting individual data. However, security agencies around the world want decrypted data and favour legislation in this regard. The United States, United Kingdom and Australia support a legislation for decryption, while France and Germany are pro-encryption.


New motto for CIOs: Move even faster and make sure nothing breaks

The stakes are high for IT professionals running digital transformation projects with consequences ranging from missed bonuses to going out of business, according to a new survey. The current motto for survival is "Move even faster and make sure nothing breaks," IT leaders told Kong in the company's 2021 Digital Innovation Benchmark report. Sixty-two percent of tech leaders said they are at risk of being replaced by competitors who innovate more quickly, according to the survey. Also, 51% of respondents said they will survive only three years before being acquired or simply going out of business if they can't evolve fast enough. That number goes up to 84% when the make-or-break timeline extends to six years. This number is up from 71% in last year's survey. ... The survey reinforces what many companies realized at the end of 2020: The pandemic accelerated digital transformation in general and cloud migrations in particular. Almost 40% of tech leaders in the US and Europe said that their companies also implemented microservices sooner than expected due to the pandemic.  A majority of respondents (87%) said that microservice-based applications, distributed applications, and open source software are the future of IT architecture. 


Reflect brings automated no-code web testing to the cloud

Every company is now a software company, or so we’re told, meaning they have to employ designers and developers capable of building websites and apps. In tandem, the much-reported software developer shortage means companies across the spectrum are in a constant battle for top talent. This is opening the doors to more automated tools that democratize some of the processes involved in shipping software, while freeing developers to work on other mission-critical tasks. It’s against this backdrop that Reflect has come to market, serving as an automated, end-to-end testing platform that allows businesses to test web apps from an end user’s perspective, identifying glitches before they go live. Founded out of Philadelphia in 2019, the Y Combinator (YC) alum today announced a $1.8 million seed round of funding led by Battery Ventures and Craft Ventures, as it looks to take on incumbents with a slightly different proposition. Similar to others in the space, Reflect hooks into the various elements of a browser so it can capture actions the user is taking, including scrolls, taps, clicks, hovers, field entry, and so on. This can be replicated later as part of an automated test to monitor the new user signup flow for a SaaS app, for example.


Immigration exemption in data protection law faces further legal challenge

Speaking to Computer Weekly about the appeal, Scotland director at ORG Matthew Rice said the exemption, which is the first derogation of its kind in 20 years of UK data protection law, has been justified by the UK government on the grounds it needs to “stop people from learning that they’re about to be removed from the country” and consequently absconding. “There was no evidence to suggest that under previous data protection law…people were making subject access requests [SARs], getting back that they were due to get a visit from the immigration services, and then running away,” he said. “The other thing to bear in mind is that the exemption is blunt because immigration control isn’t defined in the act or in any part of UK law, and it’s not just about the Home Office or borders. Any data controller can apply this exemption – it’s available to your doctor, your landlord, your school, your local authority, any number of persons that might hold personal data about you.” ... The non-disclosure of personal data under the immigration exemption therefore not only interferes with the individual’s access rights, but a host of other digital rights granted by the GDPR as well, including the rights to rectification, erasure and restriction of processing.


How security pros can prepare for a tsunami of new financial industry regs in 2021

Biometrics can add an extra layer of security when unlocking a smartphone using a person’s face or fingerprint. But other technologies have raised privacy concerns among consumers, such as law enforcement leveraging facial recognition to identify wanted criminals via security cameras in a public space. This has led to outright bans of facial recognition technology in several cities, including Boston, San Francisco, Oakland, Portland, Oregon, and Portland, Maine, to name a few. As these technologies become mainstream, we’ll need regulations to retain (or in some cases, regain) the trust of consumers and policymakers. As a step forward, we see international organizations push for global standards around the use of biometrics, for example, the FIDO Alliance and the Financial Action Task Force (FATF), which recently issued guidance on how to apply a risk-based approach to using digital identity systems for customer identification and verification. However, the U.S. lags behind other regions that have been more progressive in their adoption of regulations, such as the General Data Protection Regulation (GDPR) in Europe. In lieu of federal standards, states such as California have implemented their own regulations, such as the California Consumer Privacy Act (CCPA) and its upgrade, the California Privacy Rights Act (CPRA).



Quote for the day:

"The first step of any project is to grossly underestimate its complexity and difficulty." -- Nicoll Hunt

Daily Tech Digest - January 22, 2021

Why it's vital that AI is able to explain the decisions it makes

The effort to open up the black box is called explainable AI. My research group at the AI Institute at the University of South Carolina is interested in developing explainable AI. To accomplish this, we work heavily with the Rubik’s Cube. The Rubik’s Cube is basically a pathfinding problem: find a path from point A – a scrambled Rubik’s Cube – to point B – a solved Rubik’s Cube. Other pathfinding problems include navigation, theorem proving and chemical synthesis. My lab has set up a website where anyone can see how our AI algorithm solves the Rubik’s Cube; however, a person would be hard-pressed to learn how to solve the cube from this website. This is because the computer cannot tell you the logic behind its solutions. Solutions to the Rubik’s Cube can be broken down into a few generalized steps – the first step, for example, could be to form a cross while the second step could be to put the corner pieces in place. While the Rubik’s Cube itself has more than 10^19 possible combinations, a generalized step-by-step guide is very easy to remember and is applicable in many different scenarios. Approaching a problem by breaking it down into steps is often the default manner in which people explain things to one another.
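
To make the pathfinding framing concrete, here is a toy breadth-first search sketch. Plain BFS cannot search the cube's enormous state space; the lab's actual solver relies on learned heuristics, but the point-A-to-point-B framing is the same.

    from collections import deque

    def find_path(start, goal, neighbors):
        # Breadth-first search from a "scrambled" state to a "solved" state
        frontier = deque([start])
        parent = {start: None}
        while frontier:
            state = frontier.popleft()
            if state == goal:
                path = []
                while state is not None:
                    path.append(state)
                    state = parent[state]
                return path[::-1]   # point A -> point B
            for nxt in neighbors(state):
                if nxt not in parent:
                    parent[nxt] = state
                    frontier.append(nxt)
        return None

    # Tiny demo on a number-line "puzzle": the only moves are +1 and -1
    print(find_path(0, 4, lambda s: [s - 1, s + 1]))  # [0, 1, 2, 3, 4]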


Why KubeEdge is my favorite open source project of 2020

The KubeEdge architecture allows autonomy on an edge computing layer, which solves network latency and velocity problems. This enables you to manage and orchestrate containers in a core data center as well as manage millions of mobile devices through an autonomous edge computing layer. This is possible because of how KubeEdge uses a combination of the message bus (in the Cloud and Edge components) and the Edge component's data store to allow the edge node to be independent. Through caching, data is synchronized with the local datastore every time a handshake happens. Similar principles are applied to edge devices that require persistence. KubeEdge handles machine-to-machine (M2M) communication differently from other edge platform solutions. KubeEdge uses Eclipse Mosquitto, a popular open source MQTT broker from the Eclipse Foundation. Mosquitto enables WebSocket communication between the edge and the master nodes. Most importantly, Mosquitto allows developers to author custom logic and enable resource-constrained device communication at the edge.
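
For a flavor of M2M messaging through a Mosquitto broker, here is a minimal sketch with the paho-mqtt client. The broker address and topic name are hypothetical.

    import paho.mqtt.client as mqtt

    BROKER = "edge-broker.local"        # hypothetical Mosquitto broker at the edge
    TOPIC = "sensors/temperature"       # hypothetical topic

    def on_message(client, userdata, msg):
        print(msg.topic + ": " + msg.payload.decode())

    client = mqtt.Client()  # paho-mqtt 1.x style; 2.x needs mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
    client.on_message = on_message
    client.connect(BROKER, 1883)
    client.subscribe(TOPIC)
    client.publish(TOPIC, "21.5")   # a device reporting a reading
    client.loop_forever()           # process incoming messages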


DevOps, DevApps and the Death of Infrastructure

The godfather of the DevOps movement, Patrick Debois, often speaks about how we are moving to a more service-oriented or serviceful intranet. I have been calling this riff on DevOps deployment methodology, DevApps. This is an emerging design pattern where cloud native applications are a combination of bespoke services (like Twilio, Salesforce, and many others) alongside custom software deployed as functions on scale-to-zero web services like Amazon Lambda. Services are being managed with Terraform, just as the services of the past had been managed by Chef or Puppet. Once organizations tackle the well-accepted practice to automate deployment, the next frontier is to create applications that are composable via automated means. What we’re talking about here is layering integration-as-code on top of infrastructure-as-code. With a wide variety of cloud services at their disposal, application developers need not worry about the latter — just the former. At TriggerMesh, we are seeing more and more organizations looking to create applications that are configured with automated workflows on the fly.


5 Qualities Of Highly Engaged Teams

Trust is not just the cornerstone of leadership. It is also a fundamental building block of high-performance teams. When teams trust each other, it gives them more confidence in their abilities. They know they will get support when needed. Also, they will be willing to provide support to teams in need. This collaboration and cooperation help the sharing of best practices, which raises the level of the whole team, or teams, higher. Trust is one of those reflexive qualities; the more the leader shows trust, the more they will be trusted. The more we trust our teams, the more they will trust themselves and each other. Leaders need to be the role model when it comes to this, but also need to go that extra step of providing support as well as asking for it. Leaders who can show this vulnerability make it OK for their teams to ask for help when needed, as well as to give it. Teams that consistently deliver are teams that feel empowered, teams that understand what needs to be done and have the tools to achieve it. This empowerment boosts self-confidence and the belief that the teams will reach their goals. Being engaged is great, but being engaged without being empowered can lead to frustration and disengagement.


Four key real world intelligent automation trends for 2021

In 2021, there will be an overdue re-think of how organisations choose RPA and intelligent automation technologies. We’ll see greater selection rigour fuelling more informed assessments of these technologies’ abilities to successfully operate and scale in large, demanding, front-to-back-office enterprise environments, where performance, security, flexibility, resilience, usability, and governance are required. ... For an RPA or intelligent automation programme to really deliver, a strategy and purpose are needed. This could be improving data quality, operational efficiency, process quality and employee empowerment, or enhancing stakeholder experiences by providing quicker, more accurate responses. By examining the experiences and proven outcomes of those organisations with mature automation programmes, we’ll see more meaningful methods of measuring the impact of RPA and intelligent automation. ... This year, there will also be a greater understanding of which vendor software robots really possess the ability to be ‘the’ catalyst for digital transformation. These robots are typically pre-built, smart, highly productive and self-organising processing resources that perform joined-up, data-driven work across multiple operating environments of complex, disjointed, difficult-to-modify legacy systems and manual workflows.


Why North Korea Excels in Cybercrime

The cybercrime market's size and the scarcity of effective protection continue to be a mouth-watering lure for North Korean cyber groups. The country's cyber operations carry little risk, don't cost much, and can produce lucrative results. Nam Jae-joon, the former director of South Korea's National Intelligence Service, reports that Kim Jong Un himself said that cyber capabilities are just as important as nuclear power and that "cyber warfare, along with nuclear weapons and missiles, is an 'all-purpose sword' that guarantees our [North Korea's] military's capability to strike relentlessly." Other reports note that in May 2020, the North Koreans recruited at least 100 top-notch science and technology university graduates into its military forces to oversee tactical planning systems. Mirim College, dubbed the University of Automation, churns out approximately 100 hackers annually. Defectors have testified that its students learn to dismantle Microsoft Windows operating systems, build malicious computer viruses, and write code in a variety of programming languages. The focus on Windows may explain the infamous North Korean-led 2017 WannaCry ransomware cyberattack, which wrought havoc in more than 300,000 computers across 150 countries by exploiting vulnerabilities in the popular operating system.


To see the future more clearly, find your blind spots

There are multiple causes for the blind spots. One is a persistent state of denial, described in four parts by an emergency management professional after Hurricane Katrina: “One is, it won’t happen. Two is, if it does happen, it won’t happen to me. Three: If it does happen to me, it won’t be that bad. And four: If it happens to me and it’s bad, there’s nothing I can do to stop it anyway.” To this, I’m sure we can now add a fifth rationalization: “It won’t happen again.” Denial, however, has never been a successful strategy. An additional cause of blind spots is an overreliance on available data. Executives have benefited greatly from increased insights derived through analytics and other sophisticated methods of pattern recognition. The limitation of these tools, however, is that they can’t detect the “dog that didn’t bark,” a reference to a Sherlock Holmes case in which the crucial clue is not what happened but what did not. Leading is, in part, about bringing an organization into the future, and so executives should sharpen their thinking to include not only what they can see clearly but also what they can’t. A third cause is conditions that can tightly bind thinking.


Being Future Ready Is The Only Way To Survive In Data Science Field

There are three key skills for any data scientist: first, a strong hold on mathematics and statistics; second, a programming language base for tasks such as data processing and storage; and lastly, domain knowledge. When you are working in a company, you must think about what value you are adding. Having acquired these skills, next comes constant upgrading and upskilling. There is a sea of resources available online. For example, Coursera and edX are good sources for theoretical introductions to a variety of topics. For a more practical approach, aspirants may check DataCamp and Udemy. I would also suggest using Kaggle, participating in hackathons, and undertaking internships to gain an edge. It is also important to think from the perspective of being ready for future challenges, given this field’s dynamic nature. It does get difficult to catch up with every new model or concept. I find it difficult too. What I tend to do is look at the bigger picture, and once a tech starts picking up pace, I spend time understanding it. The secret lies in following the broad macro trend, not just in DS but in the complete tech space.


How to implement a DevOps toolchain

A good DevOps toolchain is a progression of different DevOps tools used to address a specific business challenge. Connected in a chain, they guarantee a productive cycle between front-end and back-end developers, quality analysts, and customers. The goal is to automate development and deployment processes to ensure the rapid, reliable, and budget-friendly delivery of innovative solutions. We found that building a successful DevOps toolchain is not a simple undertaking. It takes experimentation and nonstop refinement to guarantee that essential processes are fully automated. A DevOps toolchain automates all of the technical elements in your workflow. It also gets different teams on the same page so that you can focus on a business strategy to drive your organization into the future. We have identified five compelling benefits of implementing a DevOps toolchain. ... A fully enabled and properly implemented DevOps toolchain propels your innovation initiatives from start to finish and ensures prompt deployment. Your toolchain will look different than this, depending on your requirements, but I hope seeing our workflow gives you a sense of how to approach automation as a solution.


3 Essential Steps to Exploit the Full Power of AI

A key to generating a good ROI is in executing data, automation, analytics, and AI initiatives. Close to 23% of respondents have already set up or are in the process of setting up an AI Center of Excellence that shares and coordinates resources across different areas of the company. This number has risen from 18% just a year ago. Also, nearly 19% of companies have a company-wide AI leader who oversees AI strategy and governance. Such an integrated delivery model makes sense because of the convergence of the cloud infrastructure that provides the storage and compute, the data that is the raw material for the analysis, the automation that operates on the technology infrastructure, the analytics that operate on the data to generate better insights, and the AI that enhances both the automation and the analytics; together, these have decreased costs and improved revenues. In large companies (greater than $1 billion in revenue), the existing data and analytics groups have expanded their remit to include AI. Companies that currently have separate centers of excellence (COEs) for analytics and/or automation and/or AI must integrate, or at the very least coordinate, their initiatives. Doing so would provide more seamless integration and yield better ROI. Companies that are just starting their journey in analytics and AI can start with an analytics or automation COE that expands to include AI capabilities.



Quote for the day:

"Our expectation in ourselves must be higher than our expectation in others." -- Victor Manuel Rivera

Daily Tech Digest - January 21, 2021

15 SLA mistakes IT leaders still make

SLAs have often been a point of contention — not only between providers and customers, but within organizations themselves. “It often boils down to IT leaders hating to read legal agreements while procurement and legal teams can be focused on business and financial risk rather than IT dependencies or the impact of system outages to delivering services,” says Joel Martin, cloud strategies research vice president at HFS Research. And as companies move more solutions to the cloud, understanding the service levels agreed to is important to developing trusted and dependable relationships. Moreover, SLA development and management has evolved significantly in recent years, with an eye toward driving business value. “Service recipients have become far more sophisticated in how they manage SLAs,” says Marc Tanowitz, managing director with West Monroe, adding that they “are looking for end-to-end outcomes that drive business success and recognize that the true value of SLAs is to drive business insights and performance — rather than to reduce the cost of service by capturing performance credits.” Nonetheless, there remain some common — and potentially costly — SLA mistakes IT leaders can make. Following are some of the most detrimental to the IT organization and the business at large.


Ransomware provides the perfect cover

Attackers are constantly creating new variants that evade detection by traditional signature-based approaches. To counteract these attacks, firms need defence in depth. This starts with preventing threat actors from infiltrating the network by defending against tactics such as phishing and malware campaigns through staff training, the use of strong passwords, 2FA, and patch management. If a threat actor makes it onto the system, their potential for lateral movement is limited when organizations have deployed a least-privilege approach, where access to files and folders is limited based on job role or seniority. Behavioral anomalies are a prime indicator that a threat actor could be on the network. These include encrypting or downloading large amounts of data, or user accounts trying to access restricted data. Successfully spotting such behaviour requires correlating data from many sources, including endpoint and network detection and response solutions. Finally, to ensure they can recover quickly in the event of a ransomware attack, organizations must also have robust backups they can rely on if their network does go down.


Cisco tags critical security holes in SD-WAN software

The first critical problem, with a Common Vulnerability Scoring System (CVSS) rating of 9.9 out of 10, is a vulnerability in the web-based management interface of Cisco SD-WAN vManage Software. “This vulnerability is due to improper input validation of user-supplied input to the device template configuration,” Cisco stated. “An attacker could exploit this vulnerability by submitting crafted input to the device template configuration. A successful exploit could allow the attacker to gain root-level access to the affected system.” This vulnerability affects only the Cisco SD-WAN vManage product, the company stated. The second critical Cisco SD-WAN Software issue, with a CVSS rating of 9.8, could let an unauthenticated, remote attacker cause a buffer overflow. “The vulnerability is due to incorrect handling of IP traffic,” Cisco stated. “An attacker could exploit this vulnerability by sending crafted IP traffic through an affected device, which may cause a buffer overflow when the traffic is processed. A successful exploit could allow the attacker to execute arbitrary code on the underlying operating system with root privileges.”


Microsoft Releases New Info on SolarWinds Attack Chain

According to Microsoft, the attackers achieved this by using a known MITRE ATT&CK technique called event triggered execution, where malicious code is executed on a host system when a specific process is launched. In this case, the threat actors used the SolarWinds process to create an Image File Execution Options (IFEO) registry value for running the malicious VBScript file when the dllhost.exe process is executed on the infected system. The dllhost.exe process is a legitimate Windows process for launching other applications and systems. When triggered, the VBScript then runs another executable that activates the Cobalt Strike DLL in a process that is completely disconnected and separate from the SolarWinds process. The VBScript also deletes the IFEO registry value and other traces of the sequence of events, according to Microsoft. The full motives behind the operation and its victims remain unclear — or at least publicly undisclosed — though some believe it may have been for corporate espionage or spying. FireEye, Microsoft, the US Cybersecurity and Infrastructure Security Agency (CISA), and numerous others have described the operation as being the work of a highly sophisticated state-backed actor.
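
For defenders, one way to audit a Windows host for IFEO abuse is Python's standard winreg module; a minimal sketch follows. It flags the classic "Debugger" trigger value, which is a common IFEO persistence technique but not necessarily the exact value used in this campaign.

    import winreg

    IFEO_PATH = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Image File Execution Options"

    # Enumerate IFEO entries and flag any that set a "Debugger" value
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, IFEO_PATH) as ifeo:
        i = 0
        while True:
            try:
                exe_name = winreg.EnumKey(ifeo, i)
            except OSError:          # no more subkeys
                break
            with winreg.OpenKey(ifeo, exe_name) as sub:
                try:
                    debugger, _ = winreg.QueryValueEx(sub, "Debugger")
                    print("IFEO hook: " + exe_name + " -> " + str(debugger))
                except FileNotFoundError:
                    pass             # no Debugger value set for this executable
            i += 1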


Accessible 5G: Making it a reality

To make 5G truly accessible to businesses, customers and consumers, we need to improve connectivity for all by eventually converging cellular and satellite networks to provide coverage both on land and via geo-satellite. While 3G and 4G were primarily created to improve mobile services for mobile device users, 5G is expected to support a much wider scope of IoT applications. With more intelligence being packed into smart, connected devices – we’ll need seamless connectivity and coverage. The hybrid network will enable all types of industries, from education and healthcare to construction and manufacturing, to not only use IoT technology to improve services and efficiencies but remove operational complexities, such as in-building coverage for more remote locations and black spots in connectivity when laying foundations – think basement renovations and housing developments in remote landscapes. As 5G-enabled smart devices and IoT applications increase, so too will the volume of data transactions between devices in the home: Smartphones, tablets, TVs, voice-assistance, and white goods like refrigerators and smart ovens. The sheer volume of applications transferring data to communicate with each other, for example, using voice assistance to dim the lights and select a film to watch for a night in, will require robust and seamless connectivity for the perfect experience.


Fueled by Record Profits, Ransomware Persists in New Year

In 2020, exfiltrating data from victims before crypto-locking their systems and naming and shaming victims via leak sites became common. The tactic was pioneered by the now-defunct Maze group in late 2019, and many other groups followed suit, including Clop, DoppelPaymer, Nefilim, Sekhmet and, more recently, Avaddon. DoppelPaymer was also tied to an attack against a hospital in Germany, which led to a seriously ill patient having to be rerouted to another hospital. "This individual later died, though German authorities ultimately did not hold the ransomware actors responsible because the German authorities felt the individual's health was poor and the patient likely would have died even if they had not been re-routed," the FBI notes in a private industry alert issued last month. For exfiltrating data, "size doesn't matter" to attackers, Sophos says. "They don't seem to care about the amount of data targeted for exfiltration. Directory structures are unique to each business, and some file types can be compressed better than others. We have seen as little as 5GB, and as much as 400GB, of compressed data being stolen from a victim prior to deployment of the ransomware."


The state of the dark web: Insights from the underground

According to Raveed Laeb, product manager at KELA, the dark web of today represents a wide variety of goods and services. Although traditionally concentrated in forums, dark web communications and transactions have moved to different mediums including IM platforms, automated shops, and closed communities. Threat actors are sharing covert intelligence on compromised networks, stolen data, leaked databases and other monetizable cybercrime products through these mediums. “The market shifts are focused on automation and servitization [subscription models], aimed at aiding the cybercrime business to grow at scale,” says Laeb. “As can be witnessed by the exponential rise of ransomware attacks leveraging the underground financial ecosystem, the cybercriminal-to-cybercriminal markets allow actors to seamlessly create a supply chain that supports decentralized and effective cybercrime intrusions—giving attackers an inherent edge.” ... “Defenders can exploit these robust and dynamic ecosystems by gaining visibility into the inner workings of the underground ecosystem—allowing them to trace the same vulnerabilities, exposures, and compromises that would be leveraged by threat actors and remediate them before they get exploited,” says Laeb.


New MIT Social Intelligence Algorithm Helps Build Machines That Better Understand Human Goals

While there’s been considerable work on inferring the goals and desires of agents, much of this work has assumed that agents act optimally to achieve their goals. However, the team was particularly inspired by a common way of human planning that’s largely sub-optimal: not to plan everything out in advance, but rather to form only partial plans, execute them, and then plan again from there. While this can lead to mistakes from not thinking enough “ahead of time,” it also reduces the cognitive load.  For example, imagine you’re watching your friend prepare food, and you would like to help by figuring out what they’re cooking. You guess the next few steps your friend might take: maybe preheating the oven, then making dough for an apple pie. You then “keep” only the partial plans that remain consistent with what your friend actually does, and then you repeat the process by planning ahead just a few steps from there.  Once you’ve seen your friend make the dough, you can restrict the possibilities only to baked goods, and guess that they might slice apples next, or get some pecans for a pie mix. Eventually, you’ll have eliminated all the plans for dishes that your friend couldn’t possibly be making, keeping only the possible plans (i.e., pie recipes). Once you’re sure enough which dish it is, you can offer to help.
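
A toy sketch of that "keep only the partial plans consistent with observations" idea is below. The recipes and actions are invented for illustration; the actual system performs Bayesian inference over probabilistic programs rather than exact matching.

    # Goal inference by filtering candidate plans against observed actions
    RECIPES = {
        "apple pie":   ["preheat oven", "make dough", "slice apples", "bake"],
        "pecan pie":   ["preheat oven", "make dough", "mix pecans", "bake"],
        "pasta salad": ["boil water", "cook pasta", "chop vegetables", "mix"],
    }

    def consistent_goals(observed_actions):
        # Keep only goals whose recipe begins with everything observed so far
        return [goal for goal, steps in RECIPES.items()
                if steps[:len(observed_actions)] == observed_actions]

    for t in range(1, 4):
        obs = ["preheat oven", "make dough", "slice apples"][:t]
        print(obs, "->", consistent_goals(obs))
    # After "slice apples" only "apple pie" remains, so you can offer to help.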


5G: Opportunities and Challenges for Electric Distribution Companies

While the primary focus for this new technology from a common carrier’s perspective seems to center around broadband services, the most likely areas that will be important to electric utilities will be the increased capacity to support field area network needs for connected grid devices. The "Grid of Things" will greatly benefit from the connectedness afforded by the larger IoT. "We plan to leverage our AMI network for connectivity needs, but that may change as we deploy more 'grid-edge' devices," said an executive of a mid-sized mid-Atlantic utility. Low-latency services potentially offer the opportunity to leverage this technology to support mission critical applications, such as protective relay management, SCADA, and substation communications. "Use of 5G can potentially provide SCADA and other system data over a cellular network versus a hard-wired solution through fiber or copper," said a general manager of a Connecticut public utility. The high data rate mmWave wireless broadband services may be applied to augmented/virtual reality (AR/VR), an area where some utilities like Duke Energy and EPRI are actively leveraging, and unmanned aerial vehicles (UAVs) that will improve asset management and visualization.


Financial institutions can strengthen cybersecurity with SWIFT’s CSCF v2021

SWIFT created the CSP to support financial institutions in protecting their own environments against cybercrime. The CSP established a common set of security controls, the Customer Security Controls Framework (CSCF), designed to help users secure their systems with a list of mandatory controls, community-wide information sharing initiatives, and security features on their payment infrastructure. The CSCF is designed to evolve based on threats observed across the transaction landscape. The CSCF's controls are centered around three overarching objectives: secure your environment; know and limit access; and detect and respond. The updated CSCF v2021 includes changes to existing controls and additional guidance and clarification on implementation guidelines. The newest version includes 31 security controls: 22 mandatory and 9 advisory. Mandatory controls must be implemented by all users on the user's local SWIFT infrastructure. Advisory controls are based on best practices recommended by SWIFT.



Quote for the day:

"Education is what survives when what has been learned has been forgotten." -- B. F. Skinner

Daily Tech Digest - January 20, 2021

New Intel CPU-level threat detection capabilities target ransomware

Detecting ransomware programs has never been easy, and attackers have always found ways to evade security products. The sophisticated groups that use manual hacking and perform months-long reconnaissance and lateral movement inside corporate networks know very well what malware detection software their victims are using and can test in advance to make sure their payload will not be detected. This is part of the reason why ransomware campaigns are so effective and devastating to organizations. Aside from signature-based detection, security products attempt to detect ransomware-like behavior by monitoring for unusual patterns in file activity. For example, the reading and writing of a large number of files in certain directories or with certain file types in rapid succession can indicate suspicious activity. Significant differences between the contents of overwritten files and the originals are another signal, since an encrypted file will look totally different from the original file. Attempts to delete Volume Shadow Copy Service (VSS) backups can also be indicative of ransomware. All these signals together can be used to detect ransomware, but attackers can still try to hide, for example, by slowing down file encryption and executing it in batches.
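
As an illustration of the content-difference signal, here is a minimal entropy check: encrypted data looks nearly random, so a file rewritten with much higher Shannon entropy is suspicious. The 7.5 bits/byte threshold and the single-signal design are illustrative; real products correlate many signals before flagging ransomware.

    import math
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        # Bits of information per byte; uniform random data approaches 8.0
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
        return len(data) > 0 and shannon_entropy(data) > threshold

    plain = b"quarterly report: revenue up 4%\n" * 100
    random_like = bytes(range(256)) * 16          # stand-in for ciphertext
    print(looks_encrypted(plain))        # False, roughly 4 bits/byte
    print(looks_encrypted(random_like))  # True, 8 bits/byte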


Streaming Data From Files Into Multi-Broker Kafka Clusters

Kafka Connect is a tool for streaming data between Apache Kafka and external systems. The FileSource connector streams data from files into Kafka, and the FileSink connector sinks data from a topic into another file. Numerous other connector types are available, such as the Kafka Connect JDBC Source connector, which imports data from any relational database with a JDBC driver into an Apache Kafka topic. Confluent.io has developed numerous connectors for importing data into and exporting data out of Kafka, covering systems such as HDFS, Amazon S3, and Google Cloud Storage. Connectors are available under commercial as well as Confluent Community licenses; please click here to know more about the Confluent Kafka connectors. In this setup, the FileSource connector streams or exports data into a Kafka topic, and the FileSink connector imports or sinks data from that topic into another file; the file that continuously receives data from the topic can be considered a consumer. These two connectors are part of the open source Apache Kafka ecosystem and do not require any separate installation.
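
As a minimal example, the stock FileStreamSource connector can be registered through the Kafka Connect REST API (default port 8083); the host, file path, and topic name below are placeholders to adapt to your environment.

    import json
    import urllib.request

    connector = {
        "name": "file-source-demo",
        "config": {
            "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
            "tasks.max": "1",
            "file": "/var/log/app/events.log",   # file to tail (placeholder)
            "topic": "file-events",              # destination topic (placeholder)
        },
    }

    # POST the connector definition to the Kafka Connect worker
    req = urllib.request.Request(
        "http://localhost:8083/connectors",
        data=json.dumps(connector).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())

Once registered, the connector tails the file and produces each new line to the topic; the matching FileStreamSink connector is configured the same way, with a "file" to write to instead.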


Legacy security architectures threaten to disrupt remote working

Connecting users often came at the expense of other factors, such as security, performance and management. As most respondents (81%) expect to continue working from home (WFH), 2021 will see enterprises address those other areas, evolving their remote access architectures to protect the remote workforce without compromising the user experience. Yet securing the remote workforce has proved challenging for IT professionals. Enforcing corporate security policies on remote users was the second most common security challenge (cited by 58% of respondents), while 57% indicated they lacked the time and resources to implement recognised security best practices. Boosting remote access performance was the most popular use case for 2021, cited by 47% of respondents. SASE was also an increasing focus for enterprises in post-pandemic 2021: half of respondents (52%) said SASE would be very or extremely important to their businesses post-Covid-19, and 91% expected SASE to simplify management and security. Providing evidence of how SASE is benefiting organisations, Cato found that of those firms that had already adopted SASE, 86% experienced increased security, 70% indicated time savings in management and maintenance...


Companies turning to MSPs as attack vectors get more sophisticated

Security is not the only top driver. Finance leaders chose reduced costs (57%) as their top reason, noting that an MSP is less expensive than hiring talent internally. For e-commerce retailers, increased security (46%) and reduced costs (46%) tied for the top spot. “It’s never been more critical to have an encrypted backup and disaster recovery solution to ensure your business is always up and running. The increased threats to companies and MSPs have never been this severe, and it’s going to continue to get worse,” said Infrascale CEO Russell P. Reeder. “In this ever more challenging landscape, data protection and data recovery are top priorities for MSPs serving clients, especially as attack surfaces expand and attack vectors get more sophisticated,” he continued. The survey further revealed which MSP services are most prominent in each industry. Finance (53%), education (51%), and healthcare (53%) executives all cited data protection as the service they leverage most from their MSPs, while manufacturing executives specified a subset of that category, cybersecurity services (58%), focused on computer network environments, as their top MSP service.


Why CIOs Must Set the Rules for No-Code, Low-Code, Full-Code

A no-code application uses point-and-click visual tools that users drag and drop in order to create an application. No knowledge of coding is needed. This is strictly point-and-click development on a visual user interface that gives access to data, basic logic and data display choices. Best fit: No-code development works when the data and queries the user needs are basic and the tool can integrate with data sources that have predefined APIs. No-code tools are ideal for rapid-turnaround applications that use and report basic information -- for example, what are the sales numbers for our air conditioning products this month? The tools are used with transactional data, not with unstructured big data. Low-code development tools have point-and-click, graphical user interfaces similar to those found in no-code tools, except that low code also allows developers to add pieces of custom code to implement functions not handled by the low-code platform. Best fit: For applications that must integrate with other systems and databases while still delivering rapid time to market, low-code tools make excellent platforms. Low code also enables non-programming users to collaborate on app development with more technical IT programmers.


Tips for a Bulletproof War Room Strategy

In today's environment, especially in larger companies, employee skill sets are getting more technically diverse, with stand-alone teams spanning cloud, network, development, automation, and more. As much as these teams may want to work in their own lane, there is no denying that their work directly affects other groups in the organization. When they send updates or find an exploit that threatens their system, it's not just their system that is impacted; it can produce massive consequences across all areas of the business. ... In combat, one of the biggest mistakes that could cause you to lose your position is indecision. In security, when a breach occurs, teams can't afford to disagree. War rooms are built to enable quick decision-making by empowering need-to-know decision-makers with the authority to respond rapidly. An effective war room brings together the right people and the right information so that the right decisions can be made quickly. ... In another scenario, you can elevate the war room into an actual live incident response, or bring together a group of senior managers to plan the organization's risk posture for the foreseeable future, whether that's the next quarter, the next year, or a large upcoming event where they want to plan for attack possibilities.


Microsoft Taking Additional Steps to Address Zerologon Flaw

Some security experts say Microsoft is taking the right step to ensure that customers' networks remain safe even if they haven't applied the patch. "Microsoft seems to expect that patching all devices out there will take a substantial amount of time, so it takes this backup approach to mitigate the risk for its customers," says Dirk Schrader, global vice president at security firm New Net Technologies. "The difficulty for those customers, given the pandemic situation of working from home, is to find and patch all vulnerable devices. It is time to scan and check all devices, monitor them for unwanted changes, to find and patch as quickly as possible." Jigar Shah, vice president of security firm Valtix, notes that Active Directory remains important to companies that rely on cloud platforms, such as Azure. So, they want to be assured that their infrastructure is secure even if that requires Microsoft to force the issue. "Active Directory domain controllers are still fundamental to enterprise apps in public clouds," Shah says. "And the battle is to continuously and automatically do virtual patching until software vendors roll out patches that can be deployed, something that often takes weeks and months..."


Study: Cloud transformation necessary for digital transformation

Cloud migration is a necessary step for digital transformation, which is proceeding faster than planned at many enterprises because of the COVID-19 pandemic, according to research from Cloud Industry Forum (CIF), a cloud computing organization based in the United Kingdom. The cloud is an important steppingstone for getting off legacy on-prem technologies and outfitting today's more flexible, remote workforce. Supporting a remote workforce requires a digital transformation, and to do that, companies need the cloud – public, private, or hybrid. CIF found that in many sectors, organizations' ability to remain productive during lockdown depended on their cloud readiness. Migrating to the cloud has delivered results for more than 90% of organizations during the past year, according to the CIF research. In addition, 91% of decision makers said that cloud formed an important part of their digital transformation, with 40% saying the role of the cloud was crucial. COVID-19 has been a significant driver: a majority of organizations (69%) have sped up their digital transformation plans in some way as a result of the pandemic, according to the research. "On the whole, organizations did a commendable job of adapting in the face of an unprecedented situation; it is safe to say that many have been pleasantly surprised at how successful the shift to remote working has been."


Digital Transformation: How Leaders Can Stand Out

Enterprise CIOs are contending with the impact of COVID-19 on their IT priorities and tech spending. To prioritize what is indispensable, there should be a strong focus on embracing technology that puts the bottom line first. There’s a huge opportunity to streamline repetitive, time-consuming tasks across departments, from marketing to sales and customer service, freeing up time and shortening feedback loops. Traditional digital transformation initiatives often overlook the edges of the business, where employees are stuck relying on manual processes, spreadsheet solutions and outdated legacy systems for business workflows. Organizations have to be able to respond to changes quickly, whenever they come up, from anywhere in the business. Having digital tools in place that allow for automation and enhanced processes is crucial not only for saving time and money, but also for providing real-time insights and opportunities to adapt quickly to customer demands, employee needs and overall disruption. The shortage of software developer talent is well-documented, and IT departments are overwhelmed without the support they desperately need.


2021 Trends in Blockchain: Mainstream Adoption at Last

The most prominent blockchain trend of the year is the move toward solving its scalability issues via the cloud. There are plentiful cryptocurrency use cases in which scale (both horizontal and vertical, reflecting mounting numbers of users and data) induces considerable latency, almost derailing the technology's value. The latency stems from blockchain's decentralized consensus approach to transaction validation, in which "every machine is doing the same work," Wagner revealed. "If one runs out of space, memory, compute, or network capacity, game over." A practical solution is serverless computing architecture: by relying on serverless infrastructure to spin up machines on demand, "that serverless implementation lets us recruit hundreds, thousands, even tens of thousands of machines for every individual node of a blockchain," Wagner explained. This method enables organizations to devote whatever resources they need to validate transactions with these decentralized ledgers, dramatically reducing the latency and downtime otherwise inherent to scaling up.
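
The article describes this fan-out pattern only at a high level; the sketch below is a purely conceptual Python mock-up of the idea, using a thread pool as a stand-in for on-demand serverless workers. The Transaction shape and validate_batch logic are invented for illustration, and a real deployment would invoke cloud functions rather than local threads.

# Conceptual sketch of the serverless fan-out pattern: batches of
# transactions are validated in parallel by many workers instead of
# one machine validating everything serially. All names here are
# hypothetical stand-ins, not a real blockchain implementation.
import hashlib
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Transaction:
    sender: str
    receiver: str
    amount: float

def validate_batch(batch):
    """Pretend validation: hash each transaction and apply a trivial rule."""
    for tx in batch:
        digest = hashlib.sha256(f"{tx.sender}{tx.receiver}{tx.amount}".encode())
        if tx.amount <= 0 or not digest.hexdigest():
            return False
    return True

transactions = [Transaction(f"a{i}", f"b{i}", i + 1.0) for i in range(10_000)]
batches = [transactions[i:i + 500] for i in range(0, len(transactions), 500)]

# Fan out: each batch goes to its own worker, mimicking "recruiting"
# many machines for a single node's validation workload.
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(validate_batch, batches))

print(f"All {len(batches)} batches valid: {all(results)}")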



Quote for the day:

"Make every detail perfect and limit the number of details to perfect." -- Jack Dorsey