
Daily Tech Digest - September 14, 2024

Three Critical Factors for a Successful Digital Transformation Strategy

Just as important as the front-end experience are the back-end operations that keep and build the customer relationship. Value-added digital services that deliver back-end operational excellence can improve the customer experience through better customer service, improved security and more. Emerging tech like artificial intelligence can give companies a substantially clearer view into their operations and customer base. Take data flow and management, for example. Many executives report they are swimming in information, yet around half admit they struggle to analyze it, according to research by PayNearMe. While data is important, the insights derived from that data are what drive the conclusions executives must draw. Maintaining a digital record of customer information, transaction history, spend behaviors and other metrics, and applying AI to analyze and inform decisions, can help companies provide better service and protect their end users. They can streamline customer service, for instance, by immediately sourcing relevant information and delivering a resolution in near-real time, or by automating the analysis of spend behavior and location data to shut down potential fraudsters.
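
As a rough illustration of the kind of automated spend-behavior analysis described above, the sketch below uses scikit-learn's IsolationForest to flag unusual transactions. The feature set (amount, hour of day, distance from the customer's usual location) and the numbers are illustrative assumptions, not details from the article or any specific vendor's approach.

```python
# Hypothetical sketch: flag unusual spend behavior with an anomaly detector.
# Feature choices and numbers are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [transaction amount, hour of day, km from the customer's usual location]
history = np.array([
    [42.0, 12, 1.5],
    [18.5, 9, 0.8],
    [67.0, 19, 2.2],
    [25.0, 13, 1.0],
    # ... many more historical transactions for this customer
])

model = IsolationForest(contamination=0.01, random_state=0).fit(history)

new_txn = np.array([[980.0, 3, 4200.0]])  # large amount, 3 a.m., far from home
if model.predict(new_txn)[0] == -1:       # -1 means the model scored it anomalous
    print("Flag transaction for review or a temporary hold")
```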


AI reshaping the management of remote workforce

In a remote work setting, one of the biggest challenges for organizations remains streamlining operations. For a scattered team, AI emerges as a revolutionary tool for automating shift scheduling and rostering using historical pattern analytics. Historical data on staff availability, productivity, and work patterns enables organizations to optimise schedules and strike a balance between operational needs and employee preferences. Subsequently, this reduces conflicts and enhances overall work efficiency. Beyond this, AI analyses staff work duration and shifts, further enabling organizations to predict staffing needs and optimise resource allocation. This enhances capacity modelling to ensure the right team member is available to handle tasks during peak times, preventing overstaffing or understaffing issues. ... With expanding use cases, AI-powered facial recognition technology has become a critical part of identity verification and security in remote work settings. Organisations need to ensure security and confidentiality at all stages of their work. In tandem, AI-powered facial recognition ensures that only authorized personnel have access to the company’s sensitive systems and data.
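
To make the idea of predicting staffing needs from historical patterns concrete, here is a deliberately simple sketch in Python. The demand figures, service rate, and the plain averaging rule are assumptions for illustration; real rostering systems would use far richer models.

```python
# Illustrative sketch: estimate how many agents a shift needs from historical
# demand, then compare against the published roster. All numbers are invented.
from math import ceil
from statistics import mean

# Tickets handled in the 14:00 hour on the same weekday over the last four weeks
historical_demand = [118, 125, 131, 127]
tickets_per_agent_per_hour = 6            # assumed service rate

expected_demand = mean(historical_demand)
agents_needed = ceil(expected_demand / tickets_per_agent_per_hour)

scheduled_agents = 18
if scheduled_agents < agents_needed:
    print(f"Understaffed: need {agents_needed}, have {scheduled_agents}")
elif scheduled_agents > agents_needed + 2:
    print(f"Likely overstaffed: need {agents_needed}, have {scheduled_agents}")
else:
    print("Roster looks balanced for this slot")
```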


The DPDP act: Navigating digital compliance under India’s new regulatory landscape

Adapting to the DPDPA will require tailored approaches, as different sectors face unique challenges based on their data handling practices, customer bases, and geographical scope. However, some fundamental strategies can help businesses effectively navigate this new regulatory landscape. First, conducting a comprehensive data audit is essential. Businesses need to understand what data they collect, where it is stored, and who has access to it. Mapping out data flows allows organizations to identify risks and address them proactively, laying the groundwork for robust compliance. Appointing a Data Protection Officer (DPO) is another critical step. The DPO will be responsible for overseeing compliance efforts, serving as the primary point of contact for regulatory bodies, and handling data subject requests. While it is not yet established whether appointing a DPO will be mandatory, it is safe to say that the role is vital for embedding a culture of data privacy within the organisation. Technology can also play a significant role in ensuring compliance. Tools such as Unified Endpoint Management (UEM) solutions, encryption technologies, and data loss prevention (DLP) systems can help businesses monitor data flows, detect anomalies, and prevent unauthorized access.


10 Things To Avoid in Domain-Driven Design (DDD)

To prevent potential issues, it is your responsibility to maintain a domain model that is uncomplicated and accurately reflects the domain. This diligence keeps the focus on modeling the components of the domain that offer strategic importance and on streamlining or excluding less critical elements. Remember, Domain-Driven Design (DDD) is primarily concerned with strategic design, not with burdening the domain model with unnecessary intricacies. ... It's crucial to leverage Domain-Driven Design (DDD) to deeply analyze and concentrate on the domain's most vital and influential parts. Identify the aspects that deliver the highest value to the business and ensure that your modeling efforts are closely aligned with the business's overarching priorities and strategic objectives. Actively collaborating with key business stakeholders is essential to gain a comprehensive understanding of what holds the greatest value to them and subsequently prioritize these areas in your modeling endeavors. This approach will optimally reflect the business's critical needs and contribute to the successful realization of strategic goals.


How to Build a Data Governance Program in 90 Days

With a new data-friendly CIO at the helm, Hidalgo was able to assemble the right team for the job and, at the same time, create an environment of maximum engagement with data culture. She assembled discussion teams and even a data book club that read and reviewed the latest data governance literature. In turn, that team assembled its own data governance website as a platform not just for sharing ideas but also for spreading the momentum. “We kept the juices flowing, kept the excitement,” Hidalgo recalled. “And then with our data governance office and steering committee, we engaged with all departments; we have people from HR, compliance, legal, product, everywhere – to make sure that everyone is represented.” ... After choosing a technology platform in May, Hidalgo began the most arduous part of the process: preparation for a “jumpstart” campaign that would kick off in July. Hidalgo and her team began to catalog existing data one subset at a time – 20 KPIs or so – and complete its business glossary terms. Most importantly, Hidalgo had all along been building bridges between Shaw’s IT team, data governance crew, and business leadership, to the degree that when the jumpstart was completed – on time – the entire business saw the immense value-add of the data governance program that had been built.


Varied Cognitive Training Boosts Learning and Memory

The researchers observed that varied practice, not repetition, primed older adults to learn a new working memory task. Their findings, which appear in the journal Intelligence, suggest diverse cognitive training is a promising whetstone for maintaining mental sharpness as we age. “People often think that the best way to get better at something is to simply practice it over and over again, but robust skill learning is actually supported by variation in practice,” said lead investigator Elizabeth A. L. Stine-Morrow ... The researchers narrowed their focus to working memory, or the cognitive ability to hold one thing in mind while doing something else. “We chose working memory because it is a core ability needed to engage with reality and construct knowledge,” Stine-Morrow said. “It underpins language comprehension, reasoning, problem-solving and many sorts of everyday cognition.” Because working memory often declines with aging, Stine-Morrow and her colleagues recruited 90 Champaign-Urbana locals aged 60-87. At the beginning and end of the study, researchers assessed the participants’ working memory by measuring each person’s reading span: their capacity to remember information while reading something unrelated.


Why Cloud Migrations Fail

One stumbling block on the cloud journey is misunderstanding or confusion around the shared responsibility model. This framework delineates the security obligations of cloud service providers, or CSPs, and customers. The model necessitates a clear understanding of end-user obligations and highlights the need for collaboration and diligence. Broad assumptions about the level of security oversight provided by the CSP can lead to security/data breaches that the U.S. National Security Agency (NSA) notes “likely occur more frequently than reported.” It’s also worth noting that 82% of breaches in 2023 involved cloud data. The confusion is often magnified in cases of a cloud “lift-and-shift,” a method where business-as-usual operations, architectures and practices are simply pushed into the cloud without adaptation to their new environment. In these cases, organizations may be slow to implement proper procedures, monitoring and personnel to match the security limitations of their new cloud environment. While the level of embedded security can differ depending on the selected cloud model, the customer must often enact strict security and identity and access management (IAM) controls to secure their environment.


AI - peril or promise?

The interplay between AI data centers and resource usage necessitates innovative approaches to mitigate environmental impacts. Advances in cooling technology, such as liquid immersion cooling, offer potential solutions, and utilizing recycled or non-potable water for cooling can alleviate the pressure on freshwater resources. Moreover, AI itself can be leveraged to enhance the efficiency of data centers. AI algorithms can optimize energy use by predicting cooling needs, managing workloads more efficiently, and reducing idle times for servers. Predictive maintenance powered by AI can also prevent equipment failures, thereby reducing the need for excessive cooling. This is good news as the sector continues to use AI to achieve greater efficiencies, cost savings, and service improvements, with the impact of AI on the operational side of data centres expected to be very positive. Over 65 percent of our survey respondents reported that their organizations are regularly using generative AI, nearly double the percentage from the 2023 survey, and around 90 percent of respondents expect their data centers to be more efficient as a direct result of AI applications.


HP Chief Architect Recalibrates Expectations Of Practical Quantum Computing’s Arrival From Generations To Within A Decade

Hewlett Packard Labs is now adopting a holistic co-design approach, partnering with other organizations developing various qubits and quantum software. The aim is to simulate quantum systems to solve real-world problems in solid-state physics, exotic condensed matter physics, quantum chemistry, and industrial applications. “What is it like to actually deliver the optimization we’ve been promised with quantum for quite some time, and achieve that on an industrial scale?” Bresniker posed. “That’s really what we’ve been devoting ourselves to—beginning to answer those questions of where and when quantum can make a real impact.” One of the initial challenges the team tackled was modeling benzyne, an exotic chemical derived from the benzene ring. “When we initially tackled this problem with our co-design partners, the solution required 100 million qubits for 5,000 years—that’s a lot of time and qubits,” Bresniker told Frontier Enterprise. Considering current quantum capabilities are in the tens or hundreds of qubits, this was an impractical solution. By employing error correction codes and simulation methodologies, the team significantly reduced the computational requirements.


New AI reporting regulations

At its core, the new proposal requires developers and cloud service providers to fulfill reporting requirements aimed at ensuring the safety and cybersecurity resilience of AI technologies. This necessitates the disclosure of detailed information about AI models and the platforms on which they operate. One of the proposal’s key components is cybersecurity. Enterprises must now demonstrate robust security protocols and engage in what’s known as “red-teaming”—simulated attacks designed to identify and address vulnerabilities. This requirement is rooted in longstanding cybersecurity practice, but it does introduce new layers of complexity and cost for cloud users. Given the burden red-teaming places on enterprises, I suspect the requirement may be challenged in the courts. The regulation does increase focus on security testing and compliance. The objective is to ensure that AI systems can withstand cyberthreats and protect data. However, this is not cheap. Achieving this result requires investments in advanced security tools and expertise, typically stretching budgets and resources. My “back of the napkin” calculations put the added cost at about 10% of the system’s total cost.



Quote for the day:

"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall

Daily Tech Digest - November 23, 2022

What's coming for cloud computing in 2023

Enterprises often move to multicloud on purpose, but way more often multicloud just happens as enterprises strive to find and leverage best-of-breed cloud services with no plan for what to do with those services after deployment. This leads to too much cost and not enough return of value to the business. Old story. This cloud complexity problem can be solved through the strategic use of technology and better approaches to manage the complexity. Most important is reducing redundancy by using a common layer of technology above the public cloud providers as well as above any legacy or edge-based systems. This layer includes common services, such as a single security system, a single data management system, finops, a single cloud operations system, etc. We’re not attempting to solve every problem within the “walled garden” of each public cloud provider; this technology should exist within a common layer, aka supercloud or metacloud. This strategic cloud trend not only solves the complexity problems by leveraging common services and a common control plane, it also helps get cloud costs under control through a common finops layer that handles cost monitoring, cost governance, and cloud cost optimization.


Best practices for implementing a company-wide risk analysis program

The first step is determining what is critical to protect. Unlike accounting assets (servers, laptops, and so on), in cybersecurity terms this would include things that are typically of broader business value. Often the quickest path is to talk with the leads for different departments. You need to understand what data is critical to the functioning of each group, what information they hold that would be valuable to competitors and what information disclosures would hurt customer relationships. Also assess whether each department handles trade secrets, or holds patents, trademarks, and copyrights. Finally, assess who handles personally identifiable information (PII) and whether the group and its data are subject to regulatory requirements such as GDPR, PCI DSS, CCPA, Sarbanes-Oxley, etc. When making these assessments, keep three factors in mind: what needs to be safe and can’t be stolen, what must remain accessible for continued function of a given department or the organization, and what data/information must be reliable (i.e., that which can’t be altered without your knowledge) for people to do their jobs.


What Is Data Virtualization?

The process of data virtualization is quite simple. Data is accessed in its original form and source. Unlike typical “extract, transform, and load” (ETL) processes, virtualization doesn’t require data to be moved to a data warehouse or data lake first. Instead, data is presented through a single logical layer, known as a virtual data layer. Using this layer, enterprises can develop simple, holistic, and customizable views (also known as dashboards) for accessing and making sense of data. Using these tools, users can also pull real-time reports, manipulate data, and perform advanced data processes such as predictive maintenance. Data is easily accessible via dashboards from anywhere. ... While data is critical to the decision-making process, not just any data will do. The data used must be accurate, up-to-date, and logical. It must also be displayed in a way that all stakeholders can understand, whether a user is a data scientist or a C-level executive. Data virtualization enables stakeholders to access the specific data they need when they need it. Because the data isn’t a replica frozen at a point in time, it is accurate to the minute.


LockBit 3.0 Says It's Holding a Canadian City for Ransom

LockBit operators posted screenshots showing files of different departments and other data as proof of their claim, but Information Security Media Group was unable to immediately contact the municipality and confirm the authenticity of the documents. The attack comes on the heels of a new National Cyber Threat Assessment 2023-2024 by the Canadian Centre for Cyber Security. The report, which says ransomware is "the most disruptive form of cybercrime facing Canadians," adds that ransomware benefits significantly from the specialized cybercrime economy and the growing availability of stolen information. "So long as ransomware remains profitable, we will almost certainly continue to see cybercriminals deploying it," the report says. The city of Westmount's official website was not affected by the attack, and the municipality says any updates on the recovery will be communicated on the site. The mayor assured residents that data security is its "top priority" and so "is the protection of our residents' and employees' information."


A brief history of industrial IoT

Most early networking technologies were wired: Connection required cables that physically linked your device to the network. Network bandwidth — the amount of data that can be conveyed in a period of time — for 10BASE-T Ethernet connections, one of the most widely used standards established in the late 1980s and early 1990s, allowed for as much as 10 Megabits of data per second. In contemporary times, wired networks support connections of 1,000 Megabits of data per second (1000BASE-T or 1 Gigabit) or even 10 Gigabits of data per second (10GBASE-T) for modern Ethernet connections. Wireless and cellular networking, which eliminated the need for a cable to each device, was a significant shift for IIoT. Standardized in 1999, 802.11b was one of the first standards supported in products from many manufacturers and was a predecessor to the Wi-Fi 6E standard established in 2020. Modern Wi-Fi devices not only offer speeds anywhere from 50 to 800 times as fast as earlier equipment, but the devices may also perform reliably in much more dense radio environments than their predecessors.


How to Avoid Risks Before Implementing Industrial IoT Solutions

Industrial IoT solutions are often implemented at enterprises with a high proportion of machine manufacturing. For a well-funded company, it is often easier to implement the IoT ecosystem using modern equipment. But for some, it would be too expensive to replace legacy manufacturing systems. Therefore, companies often choose to adapt existing equipment and enhance it with sensors, smart devices, and gateways. However, when choosing to implement IoT technology in an enterprise equipped with old machines, the company has to ensure the protocols are intelligible to all the devices, connect disparate data stores, and solve all the compatibility issues. According to McKinsey, a company moving to EIoT has to solve compatibility issues for about 50% of all devices. If compatibility issues are not solved appropriately, the solution may not function as intended, or even at all. The wrong algorithm or incorrect integration can lead to hardware malfunctions and equipment damage, overheating, explosion, or system failure.


How remote working impacts security incident reporting

The risks of an impeded reporting process due to remote working are significant. When incidents go unreported, reports are delayed or miscommunicated, or follow-up actions are hindered, vulnerabilities can be left exposed and attackers can gain time in the system to infiltrate more of the network before the security team detects and contains the threat, Chavoya warns. This can not only exacerbate the severity of incidents and attacks but can also damage both the reputation of a business and its ability to meet data protection regulations that stipulate strict rules around disclosure. Such failures could lead to loss of customer confidence and large monetary penalties. It is therefore paramount for security teams to update their reporting policies and processes to account for the security implications of remote working. “The home and hybrid working trend is here to stay, so it is incredibly dangerous for security teams to rely on policies and processes designed for a bygone era when most, if not all, employees were based in a controlled office environment,” says Holyome.


IT leadership: 5 ways to create a culture of gratitude

Expressing gratitude is an integral part of a healthy culture. I think it starts with a leader maintaining healthy personal humility and respect and empathy for their staff, so that gratitude is coming from a genuine place. Thank-yous should be prompt, specific, and connect the accomplishment to its impact on our mission of educating students. Thanking a team for finishing a project, as in: “Your team successfully implemented this project, which I really appreciate” is more powerful when it adds, “The new UI will help our students better determine what classes they still need to take in order to graduate.” It’s helpful to give customer feedback as well, such as “I talked with an adviser who says this will really help her more accurately advise students.” IT teams always see a steady stream of problem tickets, so hearing how their work is impacting students and faculty, and/or hearing verbatim feedback from delighted users, can be very encouraging. In addition to thanking employees individually, department emails and all-staff meetings and parties should all include recognition and gratitude for recent accomplishments, and a little free food and swag never hurts, either.


5 pitfalls to avoid when partnering with startups

For Bedi, it came as a rude shock when he found out a startup he was working with on a project didn’t have an internal development team and instead relied on a third party for its deliverables. “We had partnered with a startup on a customer onboarding project. A delay of 15 to 20 days is acceptable but alarm bells ring when there is a significant overrun of timelines. In our case, there was a delay of more than two months,” says Bedi. “Not only a lack of bandwidth but also the brief that the startup receives from the enterprise and passes to the third party gets lost in translation. It doesn’t help that the startup didn’t read the detailed business requirements document.” Unfortunately, it’s tough to cut this risk altogether, Bedi says. “There are few IT leaders who verify the credentials of a startup to the extent of asking the CVs of their team members. Even if some do so, some startups resort to ‘body shopping,’” he says, referring to the practice of recruiting workers to contract their services out on a tactical short- to mid-term basis. So, what’s the way out? The best approach is to open a clear line of communication with the startup and ensure transparency.


Implications of Emerging Technology on Cyber-Security

Proper understanding of the new technologies is very important; this includes risk assessment and evaluation of the new technology, followed by proper planning for implementation and risk mitigation. Risks are changing much faster than organisations can mitigate them. Unfortunately, there is no silver bullet for cyber-security, but there are three areas that must be carefully planned. Organizations must ensure they understand the risks of any new technology they install, as this will be key to properly securing it. As a result, training and education on the new technology is a cornerstone to build on, and this is not just for technology people but for everyone who works with critical data and new technologies. Although ultimate accountability will still rest with the organization’s senior management, the information security team has the responsibility to study the new technology well and evaluate the associated risks. The primary goal is to foster an organisational culture that encourages risk-based decision making as well as innovation and new technology adoption.



Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson

Daily Tech Digest - January 13, 2022

Crafting an Agile Enterprise Architecture

The blueprint for a truly agile architecture requires fundamental shifts in the business dynamic. There are three essentials that stand out as requirements for attaining agility across the entire enterprise. ... The bedrock of successful enterprise-wide agility is collaboration. Innovation will flourish when it is decentralized, and isolated silos give way to cross-functional, agile and self-organizing teams. An isolated IT team leads to delayed projects, overrun budgets, productivity that is hard to measure and disconnects between business and operations. Every department must be involved in supporting and achieving key business goals. Teams containing a mix of business line and IT professionals accelerate development and delivery, greatly reducing time to market. Based upon a shared customer-centric goal and vision, there is shared ownership of outcomes and a deeper level of engagement throughout the enterprise. Daily communication and collaborative feedback nurture creativity and problem-solving, and drive continuous integration and continuous development.


The Next Evolution of the Database Sharding Architecture

Considering the new challenges databases are facing, is there an efficient and cost-effective way to leverage these types of databases and enhance them through some new practical ideas? Transparent database sharding is one of the best answers to this question. The core technique is to split the data into separate subsets of rows or columns. The small tables that result from splitting a large database table are known as shards. The original table is divided into either vertical shards or horizontal shards. The labels for these tables can vary, for example ‘VS1’ for the first vertical shard and ‘HS1’ for the first horizontal shard, where the number identifies the first table or schema, then 2, 3, and so on. Each shard holds a subset of the data from the table's original schema. So what is the difference between sharding and partitioning? Both sharding and partitioning involve breaking large data sets into smaller ones. But a key difference is that sharding implies that the breakdown of data is spread across multiple computers, either as horizontal or vertical partitioning. Partitioning, on the other hand, is when the database is broken down into different subsets that are held within a single database, sometimes referred to as a single database instance.
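
A minimal sketch of the horizontal case may help: route each row to a shard by hashing its key, so the shards (labelled HS1, HS2, and so on, as above) can live on separate servers. The hashing scheme is an illustrative assumption; production systems typically use consistent hashing or range-based routing.

```python
# Minimal sketch of horizontal sharding: pick a shard for each row by hashing
# its key. Each shard would be backed by a separate database server.
import hashlib

SHARDS = ["HS1", "HS2", "HS3", "HS4"]

def shard_for(customer_id: str) -> str:
    digest = hashlib.md5(customer_id.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# The same routing function is used for reads and writes, so a given
# customer's rows always land on the same shard.
print(shard_for("customer-1042"))  # e.g. "HS3"
```

Partitioning would use the same kind of split, but all the resulting subsets would stay inside one database instance rather than being spread across machines.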


How to make your home office a more pleasant place to work

Eliminating your commute may actually have some negative impacts on your body, especially if your commute involved some amount of walking or biking. These days you could conceivably not leave your home for days on end, and being that sedentary really isn’t good for you. Get up and move, and get your heart pumping. You don’t need a fancy home gym. Get a yoga mat and watch some YouTube workouts that require only your body weight. Force yourself to go for walks, even when you don’t wanna. Stretch! ... Suddenly you have unfettered access to your fridge and snack cabinets, and it can be very tempting to just graze all day. So, what do you do? Here’s the strategy that has worked better for me than anything else: Fill your kitchen with healthy foods, and only healthy foods. Yep, really. If I wander into my kitchen, wanting a snack, and there are chips there, I’m going to eat those chips. But if I go there and the only snackable foods are carrots and sugar snaps, then that’s what I’m going to eat. Basically, I have to use my tendency toward slothfulness against my tendency for gluttony, and it really works!


‘Fully Undetected’ SysJoker Backdoor Malware Targets Windows, Linux & macOS

Once it finds a target, SysJoker masquerades as a system update, researchers said, to avoid suspicion. Meanwhile, it generates its C2 by decoding a string retrieved from a text file hosted on Google Drive. “During our analysis the C2 has changed three times, indicating the attacker is active and monitoring infected machines,” researchers noted in the report. “Based on victimology and malware’s behavior, we assess that SysJoker is after specific targets.” SysJoker’s behavior is similar for all three operating systems, according to Intezer, with the exception that the Windows version makes use of a first-stage dropper. After execution, SysJoker sleeps for a random amount of time, between a minute and a half and two minutes. Then, it will create the C:\ProgramData\SystemData\ directory and copy itself there using the file name “igfxCUIService.exe” – in other words, it masquerades as the Intel Graphics Common User Interface Service. After gathering system information (MAC address, user name, physical media serial number and IP address), it collects the data into a temporary text file.


Who is going to Support your Next Mobile App Project? Hint: Not React Native or Flutter

React Native and Flutter are quality projects built by very capable and talented teams. The problem is that they are both incredibly complex and the massive surface area of each project has led to a huge volume of bug reports and other issues, and neither project offers dedicated support. For users of these projects, this complexity and large issue volume, combined with a lack of official support options, ultimately leads to a situation where there are very few options for getting help and support when there’s an issue. Google and Facebook are notorious for lacking a strong customer support culture, even for their paid products. Support is just not their most important priority. This tradeoff enables them to build services that reach mind-boggling levels of scale, but is at odds with what traditional teams and enterprises expect when it comes to vendors supporting their products. Culturally, Google and Facebook just don’t do support well and certainly not when it comes to open source or developer-focused products.


Meeting Patching-Related Compliance Requirements with TuxCare

TuxCare identified an urgent need to remove the business disruption element of patching. Our live kernel patching solution, first rolled out under the brand KernelCare, enables companies such as yours to patch even the most critical workloads without disruption. Instead of the patch, reboot, and hope-that-everything-works routine, organizations that use the KernelCare service can rest assured that patching happens automatically and almost as soon as a patch is released. KernelCare addresses both compliance concerns and threat windows by providing live patching for the Linux kernel within hours of a fix being available, thus reducing the exposure window and meeting or exceeding requirements in compliance standards. Timeframes around patching have consistently been shrinking in the past couple of decades, from many months to just 30 days to combat fast-moving threats – KernelCare narrows the timeframe to about as minimal a window as you could get.


How to achieve data interoperability in healthcare: tips from ITRex

Fast Healthcare Interoperability Resources (FHIR) was released in 2014 by HL7 as an alternative to HL7 v2. It relies on RESTful web services and open web technologies for communication, which can enhance interactions among legacy healthcare systems. Additionally, the RESTful API provides a one-to-many interface, accelerating the onboarding of new data partners. FHIR’s interoperability merits are not limited to EHR and similar systems but extend to mobile devices and wearables. ... Digital Imaging and Communications in Medicine (DICOM) is a standard for communicating and managing medical images and related data. The National Electrical Manufacturers Association developed this standard. DICOM can integrate medical imaging devices produced by different manufacturers by providing a standardized image format. It allows healthcare practitioners to access and share DICOM-compliant images even if they are using different devices for image capturing. At ITRex, we had a large project involving the DICOM standard and medical imaging interoperability.
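
Because FHIR exposes resources over plain RESTful HTTP, the same few calls work against any conformant server. The sketch below assumes a placeholder base URL and resource id; it is not an endpoint from the article.

```python
# Hedged sketch of FHIR's RESTful interface. The base URL and ids are placeholders.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/R4"   # placeholder FHIR server

# Read a single Patient resource by its logical id
resp = requests.get(
    f"{FHIR_BASE}/Patient/12345",
    headers={"Accept": "application/fhir+json"},
    timeout=10,
)
resp.raise_for_status()
patient = resp.json()
print(patient["resourceType"], patient.get("birthDate"))

# Searching is just another GET, e.g. vital-sign observations for that patient
observations = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": "12345", "category": "vital-signs"},
    headers={"Accept": "application/fhir+json"},
    timeout=10,
).json()
```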


Enterprise Data: Prepare for More Change in This Hot Area of Tech

Enterprises using IoT can use embedded databases at the edge to copy aggregated sensor data to a back-end database when online. This brings the value of data directly to operations. At the same time, data from all the devices is being managed in the back-end database to develop analytics that advance the business. Artificial intelligence chips are taking center stage in these environments. AI chips are a new generation of microprocessors specifically designed to process artificial intelligence tasks faster while using less power. They are particularly good at dealing with artificial neural networks and are designed to do machine learning model training and inference at the edge. We’ll also see the need for higher performance from edge computing hardware, since better sensors and larger AI models now enable a host of new applications. There is a growing requirement to run inference on more data and then make decisions without sending data to the cloud. Also, distributed sites can be linked together with an enterprise computing environment to create a unified computing environment.


How businesses overcome data saving and storing difficulties

First, companies that track data over time are able to understand trends and compare data points. With this information at hand, companies can start the analytical process of asking questions based on that knowledge. Then businesses can create new value from this data. Secondly, when companies begin collecting data, they boost their transparency and transferability. This improves processes in any business. For example, a no-code dashboard can make data more objective and minimizes subjectivity. With the proper data tool, internal discussions will be centered more on the business objectives and goals. With the right data at hand, it is easier to ask the right questions, such as how to improve sales after a product launch. Essentially, data eliminates the guessing. Both of these reasons why storing data is important apply whether a company is digital or physical. Data is simply a way to understand your business better, no matter whether it is big or small: analyze information, learn, improve and repeat the successes.


Securing your business in the hybrid workplace

Leaders have the chance now to reflect on what was learnt in the past two years and build on their company’s new digital foundations to create a secure, hybrid workplace fit for the post-pandemic economy. You should underpin your hybrid work goals with the rapid advances in technology now available to you, and to set up the foundations to accelerate growth – but simplicity will be key. For instance, with Microsoft 365 Business Premium, you can centrally configure, manage, and protect company-issued and employee’s personal devices accessing business information and services across Windows, Mac, Android or iOS. Simple features such as multi-factor authentication (MFA) can prevent 99 per cent of identity attacks by asking for additional evidence beyond the user’s password to grant access. Adding MFA for remote employees requires them to enter a security code received by a text, phone call or authentication app on their phone when they log into Microsoft 365. So, if a hacker gets hold of someone’s password through a phishing attack, they can’t use it to access sensitive company information.



Quote for the day:

"A belief is not merely an idea the mind possesses, it is an idea that possesses the mind." -- Robert Oxton Bolt

Daily Tech Digest - October 11, 2021

How businesses can combat data security and GDPR issues when working remotely

Whether using a business or personal device, having robust Secure Device Management and effective Mobile Device Management (MDM) is key to implementing security measures to keep data on mobile devices secure from threats. Adopting data encryption across software and devices being used remotely also allows data to be kept safe and secure from unauthorised use, even in the event of a security breach. In addition, implementing a corporate Virtual Private Network (VPN) enables an encrypted connection from a device to a network that allows the safe transmission of data from the office to remote working environments. Employees should have access only to the data they require to complete their work to mitigate against unnecessary risk of unauthorised access, with measures that restrict data on a ‘need-to-know’ basis implemented where possible. Crucially, companies should provide all employees working from home with a clear and documented remote working policy that outlines precisely how personal and company data should be handled to keep it secure.


Digital transformation: 4 excuses to leave behind

Outdated, manual, and siloed processes not only slow your business, but they boost costs because it is more expensive to maintain broken, outdated processes. As we emerge from the pandemic, most businesses are realizing that their existing business processes are not sustainable in the new normal. With remote and hybrid work becoming standard, organizations have had to think on their feet to maintain business as usual, and digital transformation makes this possible. COVID lockdowns made it urgent for enterprises to enable secure remote operations, which in turn made them realize the importance of migrating their operations to the cloud. There has been an exponential increase in the adoption of cloud technology post-pandemic. It has enabled businesses to operate in a remote environment without impacting the speed and quality of services. If you haven’t already done so, start by identifying the “low-hanging fruit” – i.e., processes that are best for your initial automation roadmaps. Then start scaling up. Transitioning to the cloud gives you countless possibilities, from reducing IT infrastructure costs to achieving scalability per business needs.


4 questions that get the answers you need from IT vendors

Enterprises don’t plan on how to adopt abstract technology concepts; they plan for product adoption and deployment. Network vendors who offer the products are the usual source of information, which can be delivered through news stories, vendor websites, or sales engagement. Enterprises expect the seller to explain why their product or service is the right idea, and sellers largely agree. It’s just a question of what specific sales process is supposed to provide that critical information. Technology salespeople, like all salespeople, make their money largely on commissions. They call on prospects, pitch their products/services, and hopefully get the order. Their goal is a fast conversion from prospect to customer, and nearly all salespeople will tell you that they dread above all the “educational sell”. That happens when a prospect knows so little about the product/service being sold that they can’t make a decision at all and have to be taught the basics. The salesperson who’s teaching isn’t making commissions, and their company isn’t hitting their revenue goals.


3 Things to Consider Before Investing in New Technology for Your Small Business

When you are searching for tech to suit your business's unique needs, it’s important to keep the happiness of your employees at the forefront. That’s what authentically attracts new talent to your company and entices people to stay. In many cases, happiness is derived from productivity. If workers know what they need to do but just don’t have the tools to do it quickly, they will get discouraged and customers will complain because they didn’t have a great experience. So, stop and assess why they’re experiencing each challenge as they move through tasks. Consider what you genuinely wish could be better or easier for you, your employees and everyone else involved. Then think about how technology may be able to solve each problem. If you equip a first-day employee with a mobile device that helps them get through a full inventory count comfortably and without making a single mistake, they are going to leave work feeling empowered. They’ll share their positive experience with friends, family and (if you’re lucky) social media. Word will spread about how great it is to work for your company.


Cloud Cost Optimization: A Pivotal Part of Cloud Strategy

To maintain an optimal state, you need to ensure that sound policies around budgeting are adhered to. In terms of governance, the framework should oversee resource creation permissions as well. ... Once you gain visibility into spending metrics, you must observe which unused resources can be disposed of and which resources could be optimized. The journey for any cloud cost optimization starts with an initial analysis of the current cloud estate and identification of optimization opportunities across compute, network, storage, and other cloud-native features. Any cloud cost optimization framework needs a repository of cost levers with associated architecture and feature trade-offs. Businesses also need governance — the policies around budget adherence, resource creation permissions, etc. — to maintain an optimal state. A practical cost optimization framework requires all three of the above. Achieving initial savings entails analyzing the estate and identifying optimization opportunities across compute, storage, and networking, focusing on the highest costs first and/or on incremental costs month over month; cloud vendors provide access to the cost and utilization data needed for this.
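
As one concrete example of the "dispose of unused resources" step, the sketch below lists unattached EBS volumes with boto3 on AWS; equivalent SDK calls exist for other cloud providers. It is an illustration of the kind of check a cost optimization framework might automate, not a prescription from the article.

```python
# Illustrative check: find unattached EBS volumes, a common source of idle spend.
import boto3

ec2 = boto3.client("ec2")
paginator = ec2.get_paginator("describe_volumes")

# Volumes with status "available" are not attached to any instance
for page in paginator.paginate(Filters=[{"Name": "status", "Values": ["available"]}]):
    for volume in page["Volumes"]:
        print(f"{volume['VolumeId']}: {volume['Size']} GiB unattached "
              "- candidate for snapshot-and-delete")
```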


Applying Behavioral Psychology to Strengthen Your Incident Response Team

Orlando says it's natural for relationships to form, and for trust to form, in an incident response team and within a larger organization. In his experience, he often encounters what he calls the "rock star problem." "You've got one or a few people [who are] very, very capable, very knowledgeable, and the team sort of coalesces around those individuals," he says. "Which is not necessarily a bad thing, but it can create issues when those individuals inevitably move on, or maybe they [have] less than optimal work habits, or behaviors, or things we want to try to account for." Compounding CSIRTs' collaboration issues is a prominent focus on technical tools and skills, Orlando adds. Incident response teams are "often inundated" with tools to address technical problems in security and incident response; however, there is a "definite lack" of tools to address the social and collaboration challenges CSIRTs face in operating within a multigroup, multiteam system, as they must.


Netherlands Says Armed Forces May Combat Ransomware Attacks

Countries are being held accountable for their actions and inaction via diplomatic responses such as actions against cross-border criminal cyber operations and measures such as sanctions, which are more powerful if they are designed in a broad coalition context, Knapen says. "Within the EU, the Netherlands has therefore been a driving force behind the EU Cyber Diplomacy Toolbox and the adoption of the ninth EU cyber sanctions regime in May 2019, and the Netherlands is committed to further developing these instruments. This provides the EU with good tools to respond faster and more vigorously to cyber incidents. Recent EU statements and sanctions show that these instruments are delivering concrete results," he notes. Knapen is also pushing for diplomatic channels for bilateral cooperation between countries in judicial investigations against ransomware, which he says can be useful if cooperation through international judicial channels is insufficient. "The Netherlands can then emphasize the importance attached to cooperation through diplomatic channels," he says.


Can India Address the Growing Cybersecurity Challenges in the Nuclear Domain?

India has established several key agencies to counter the growing challenges in cybersecurity. However, the effectiveness of its cybersecurity policies in the nuclear domain lies in the ability to effectively incorporate cybersecurity, cyber infrastructure, and the operating agencies into the larger nuclear security framework. Efficient and effective cybersecurity mechanisms require cohesive inter-agency coordination to strengthen said mechanisms. It is also essential for government authorities to acknowledge, interact with, and evolve cybersecurity protocols and procedures regularly to reflect a rapidly changing security environment. An effective cybersecurity policy also requires clear demarcation of roles, responsibilities, and contingency plans for short- and long-term implementation, to be altered based on circumstances and technological advancements. Additionally, and most importantly, a renewed emphasis on understanding cyber risks and acknowledging the importance of cyber-nuclear security is essential in the Indian context.


How technology can drive positive change in insurance post-COVID

From forced closures to operational transformation, the COVID-19 pandemic has impacted businesses both UK and worldwide. The world of insurance is no exception to this rule – but the nature of the industry and its interests have led to a layered set of challenges and opportunities beyond the obvious disruptions to working practices. These challenges have been laid out in a recent report from EY, which lists a number of early pandemic issues for the industry including the tricky transition to remote working, a “strong push toward digitisation”, and the embrace of virtual interactions for clients and distribution partners. While these concerns may feel familiar, EY’s report goes on to draw out the specific difficulties faced by insurers, where COVID-19 has occasioned “mounting consumer, political, and legislative pressure to cover pandemic-related business interruption claims”. Not only has the industry needed to embrace new technologies and practices to adapt to the pandemic, but it has also needed to address some of the COVID-driven burdens faced by clients. 


Safe and secure disposal of end-of-life IT hardware

First, your business needs to develop a plan of action that brings together your IT, information security and office management staff, with oversight from senior executives. To be fully effective, it should establish a decommissioning strategy that covers the compliant disposal of retired hardware and the destruction of data. Next, you need to ensure that all the data on your old hardware has been permanently eradicated and is non-recoverable. Given the importance of this step, it is likely that you’ll need assistance from a third-party disposition expert. Third, you need to know the whereabouts of your assets throughout the disposition process. A secure chain of custody is vital to prove compliance and so, once again, it is advisable to employ the services of an outside expert – a company that offers rigorous security practices, such as asset itemisation, GPS tracking and protected transportation, all backed up with supporting documentation. Having a secure chain of custody is critical because it ensures that the IT assets are tracked during each step of the process from pick-up to final disposition.



Quote for the day:

"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann

Daily Tech Digest - October 07, 2021

Encryption: Why security threats coast under the radar

This application of AI became a valuable source of IT expertise that multiplied staff bandwidth to manage the solution and allowed for full and detailed monitoring of the entire networked environment. With Flowmon ADS in place, the institute has a comprehensive, yet noise-free overview of suspicious behaviours in the partner networks, flawless detection capability, and a platform for the validation of indicators of compromise. Flowmon’s solution works at scale too. GÉANT – which is a pan-European data network for the research and education community – is one of the world’s largest data networks, and transfers over 1,000 terabytes of data per day over the GÉANT IP backbone. For something of that scale there is simply no way to manually monitor the entire network for aberrant data. With a redundant deployment of two Flowmon collectors running in parallel, GÉANT was able to have a pilot security solution managing data flows of this scale live in just a few hours. With a few months of further testing, integration and algorithmic learning, the solution was then ready to protect GÉANT’s entire network from encrypted data threats.


In The Digital Skills Pipeline, A Shift Away From Traditional Hiring Modes

“As digital transformation accelerates and we experience generational shifts, professionals will increasingly desire better work-life balance and freedom from legacy in-office models,” says Saum Mathur, chief product, technology and AI officer with Paro. “Consultancies and others that are reliant on legacy models are struggling to adapt to this new reality, and marketplaces are only furthering these models’ disruption. Three to five years ago, the gig economy pioneers offered customers finite, task-based services that didn’t require extensive experience and enabled flexible scheduling. With continued shifts in the technical and cultural landscape, the gig economy has been extended into professional services, which is powered by highly experienced subject matter experts of all levels.” Corporate culture needs to be receptive to the changes wrought by digital transformation. Forty-one percent of executives in the Alliantgroup survey have encountered employee resistance, while 32% say they have had “the wrong team or department overseeing initiatives.”


Remote-working jobs: Disaster looms as managers refuse to listen

The Future Forum Pulse survey echoed a sentiment that has been voiced repeatedly over the past 18 or so months: employees have embraced remote working, and see it as a pillar of their future working preferences. Yet executives are more likely than lower-level workers to be in favour of a working week based heavily around an office. Of those surveyed, 44% of executives said they wanted to work from the office every day, compared to just 17% of employees. Three-quarters (75%) of executives said they wanted to work from the office 3-5 days a week, versus 34% of employees. This disconnect between employer and employee preferences risks being entrenched into new workplace policies, researchers found. Two-thirds (66%) of executives reported they were designing post-pandemic workforce plans with little to no direct input from employees – and yet 94% said they were "moderately confident" that the policies they had created matched employee expectations. What's more, more than half (56%) of executives reported they had finalized their plans on how employees can work in the future. 


Will the cloud eat your AI?

"CSPs' cloud and digital services have given them access to the enormous amounts of data required to effectively train AI models," the authors concluded. Such economies of scale have been an asset to the cloud providers for years. Years ago, RedMonk analyst Stephen O'Grady highlighted the "relentless economies of scale" that the cloud providers brought to hardware–they could simply build more cheaply than any enterprise could hope to replicate in their own data centers. Now the CSPs enjoy a similar advantage with data. But it's not merely a matter of raw data. The CSPs also have more experience using that data on a large scale. The CSPs have products (e.g., Amazon Alexa to assist with natural language processing, or Google Search to help with recommendation systems). Lots of data feeding ever-smarter applications feeding more data into the applications... it's a self-reinforcing cycle. Oh, and that hardware mentioned earlier? The CSPs also have more experience tuning hardware to process machine learning workloads at scale. 


Operationalizing machine learning in processes

Operationalizing ML is data-centric—the main challenge isn’t identifying a sequence of steps to automate but finding quality data that the underlying algorithms can analyze and learn from. This can often be a question of data management and quality—for example, when companies have multiple legacy systems and data are not rigorously cleaned and maintained across the organization. However, even if a company has high-quality data, it may not be able to use the data to train the ML model, particularly during the early stages of model design. Typically, deployments span three distinct, and sequential, environments: the developer environment, where systems are built and can be easily modified; a test environment (also known as user-acceptance testing, or UAT), where users can test system functionalities but the system can’t be modified; and, finally, the production environment, where the system is live and available at scale to end users.


MLOps essentials: four pillars for Machine Learning Operations on AWS

Managing code in Machine Learning appliances is a complex matter. Let’s see why! Collaboration on model experiments among data scientists is not as easy as sharing traditional code files: Jupyter Notebooks allow for writing and executing code, but keeping them synchronized between users makes git chores more difficult and leads to frequent merge conflicts. Developers must also work on different sub-projects: ETL jobs, model logic, training and validation, inference logic, and Infrastructure-as-Code templates. All of these separate projects must be centrally managed and adequately versioned! For modern software applications, there are many consolidated version control procedures, such as conventional commits, feature branching, squash and rebase, and continuous integration. These techniques, however, are not always applicable to Jupyter Notebooks since, as stated before, they are not simple text files. Data scientists need to try many combinations of datasets, features, modeling techniques, algorithms, and parameter configurations to find the solution which best extracts business value.
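
One common mitigation, offered here as an assumption rather than something the article prescribes, is to pair each notebook with a plain-text script so that diffs, reviews, and merges happen on readable code. A sketch using the jupytext library's Python API:

```python
# Sketch: keep a plain-text twin of a notebook for friendlier version control.
# Paths are hypothetical; jupytext can also be wired in as a git/pre-commit hook.
import jupytext

nb = jupytext.read("experiments/train_model.ipynb")                # load the notebook
jupytext.write(nb, "experiments/train_model.py", fmt="py:percent")  # save text twin

# Commit the .py file alongside (or instead of) the .ipynb; reviewers diff the
# script, and the notebook can be regenerated from it when needed.
```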


Why Unsupervised Machine Learning is the Future of Cybersecurity

There are two types of Unsupervised Learning: discriminative models and generative models. Discriminative models are only capable of telling you that if you give them X, then the consequence is Y. A generative model, by contrast, can tell you the total probability that you’re going to see X and Y at the same time. So the difference is as follows: the discriminative model assigns labels to inputs, and has no predictive capability. If you gave it a different X that it has never seen before, it can’t tell what the Y is going to be because it simply hasn’t learned that. With generative models, once you set one up and find the baseline, you can give it any input and ask it for an answer. Thus, it has predictive ability – for example, it can generate a possible network behavior that has never been seen before. So let’s say some person sends a 30 megabyte file at noon; what is the probability that he would do that? If you asked a discriminative model whether this is normal, it would check to see if the person had ever sent such a file at noon before… but only specifically at noon.
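
A small sketch of the generative idea: learn a baseline joint distribution over (hour of day, file size) for one user and score how probable a new, never-before-seen event is. The data and the model choice (a two-component Gaussian mixture) are illustrative assumptions.

```python
# Minimal generative-model sketch: score the likelihood of a new event against
# a learned baseline. All values are invented for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

# Historical behavior for one user: [hour of day, megabytes transferred]
baseline = np.array([
    [9, 2.1], [10, 0.4], [11, 5.0], [14, 1.2], [16, 3.3],
    [9, 1.8], [13, 0.9], [15, 2.6], [10, 4.1], [17, 0.7],
])

model = GaussianMixture(n_components=2, random_state=0).fit(baseline)

event = np.array([[12, 30.0]])            # a 30 MB file sent at noon
log_likelihood = model.score_samples(event)[0]
print(f"log-likelihood of this event: {log_likelihood:.1f}")
# A score far below those of the baseline events suggests anomalous behavior,
# even though this exact (time, size) combination was never observed before.
```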


Sorry, Blockchains Aren’t Going to Fix the Internet’s Privacy Problem

Recently, a number of blockchain-based companies have sprung up with the vision of helping people take control of their data. They get an enthusiastic reception at conferences and from venture capitalists. As someone who cares deeply about my privacy, I wish I thought they stood a better chance of success, but they face many obstacles on the road ahead. Perhaps the biggest obstacle I see for personal-data monetization businesses is that your personal information just isn’t worth that much on its own. Data aggregation businesses run on a principle that’s sometimes referred to as the “river of pennies.” Each individual user or asset has nearly zero value, but multiply the number of users by millions and suddenly you have something that looks valuable. That doesn’t work in the reverse, however. Companies are far more focused and disciplined in the pursuit of millions of dollars in ad or data revenue than one consumer trying to make $25 a year. But why isn’t your data worth that much? Very simply, the world is awash in your information, and you’re not the only source of that information. The truth is that you leak information constantly in a digital ecosystem.


Iranian APT targets aerospace and telecom firms with stealthy ShellClient Trojan

The Trojan is created with an open-source tool called Costura that enables the creation of self-contained compressed executables with no external dependencies. This might also contribute to the program's stealthiness and explain why it went undiscovered and undocumented for three years of operation. Another possible reason is that the group only used it against a small and carefully selected pool of targets, even if spread across geographies. ShellClient has three deployment modes controlled by execution arguments. One installs it as a system service called nhdService (Network Hosts Detection Service) using the InstallUtil.exe Windows tool. Another execution argument uses the Service Control Manager (SCM) to create a reverse shell that communicates with a configured Dropbox account. A third execution argument simply runs the malware as a regular process. This mode seems to be reserved for cases where attackers only want to gather information about the system first, including which antivirus programs are installed, and establish whether it's worth deploying the malware in persistence mode.


How financial services can invest in the future with predictive analytics

Predictive analytics empowers users to make better decisions that consider both what has happened and what is likely to happen based on the available data. And those decisions can only be made if employees understand what they're working with. They need strong data literacy to understand, challenge, and act on the insights, and to recognise the limitations of predictive analytics and question its output. After all, a forecast's accuracy depends on the data fuelling it, so its performance could be impacted during an abnormal event or by intrinsic bias in the dataset. Employees must have confidence in their understanding of the data to question its output. This is especially true when decisions directly affect customers' lives, as they so often do in the financial sector – from approving an overdraft that helps someone make it to payday, to approving a mortgage application in time.



Quote for the day:

"All leadership takes place through the communication of ideas to the minds of others." -- Charles Cooley

Daily Tech Digest - September 14, 2021

Honing Cybersecurity Strategy When Everyone’s a Target for Ransomware

While not all hackers are out for the money, those who are become particularly crafty at plying their trade. What malicious actors are often looking for are the “keys to the kingdom” — the most lucrative mission-critical information, passwords, contacts or accounts — which is usually found within the C-suite. And not only do C-suite targets have the most valuable organizational data, they are also the ones who decide whether to pay a ransom. This creates two situations that put executives under even greater threat. First, it makes a ransomware attack on a C-suite decision maker incredibly efficient, achieving maximum ROI for threat actors. Second, it makes a C-suite executive's personal communications incredibly valuable and particularly vulnerable. The tighter cybercriminals can twist the screws with embarrassing business and private communications threatened for release, the greater their chances of payment – and often, the more they can demand. The sad reality is that the majority of executives, and particularly their direct reports, are incredibly soft targets.


What Do Engineers Really Think About Technical Debt?

It's no surprise that technical debt causes bugs, outages and quality issues, and slows down the development process. But the impact of tech debt is far greater than that. Employee morale is one of the most difficult things to manage, especially now that companies are switching to long-term remote work solutions. Many engineers mentioned that technical debt is actually a major driver of decreasing morale. They often feel they are forced to prioritize new features over vital maintenance work that could improve their experience and velocity, and this is taking a significant toll. ... More than half of respondents claim that their companies do not deal with technical debt well, highlighting that the divide between engineers and leadership is widening rather than closing. Engineers are clearly convinced that technical debt is the primary reason for productivity losses; however, they seem to be struggling to make it a priority. Yet making the case for tackling technical debt could pay off: as many as 66% of engineers believe their team would ship up to 100% faster if they had a process for dealing with it.


Human-Machine Understanding: how tech helps us to be more human

Human-Machine Understanding, or HMU, is one of the lines of enquiry currently getting me out of bed in the morning, and I’m sure that it will shape a new age of empathic technology. In the not-too-distant future, we’ll be creating machines that comprehend us, humans, at a psychological level. They’ll infer our internal states – emotions, attention, personality, health and so on – to help us make useful decisions. But let’s just press pause on the future for a moment, and track how far we’ve come. Back in 2015, media headlines were screaming about the coming dystopia/utopia of artificial intelligence. On one hand, we were all doomed: humans faced the peril of extinction from robots or were at least at risk of having their jobs snatched away by machine learning bots. On the other hand, many people – me included – were looking forward to a future where machines answered their every need. We grasped the fact that intelligent automation is all about augmenting human endeavour, not replacing it.

Essential Soft Skills for IT leaders in a Remote World

People in positions of authority often aim to project unbreakable confidence, but a better path to building connections is through honesty. Foremost, being open about insecurities, uncertainties, and failures is humanizing—a critical trait in the age of Zoom. Conversely, ultra-strict managers may find their teammates become reticent to speak up about risks they see. Such an environment is anathema to multidisciplinary IT fields, given the need for transparent workflows. Being vulnerable at work is not only about showing something to your teammates; it is also about establishing and growing a safe environment for the colleagues you work with. In my experience, it's hard for people to speak up about sensitive topics such as challenges, difficult conversations, or disagreeing with someone at work. But all of this becomes much easier when the team, including leadership, has built an environment where everyone trusts that they are free to express their opinions and share their feelings about their work.

The past, present and future of IoT in physical security

As ever, the amount of storage that higher-resolution video generates is the limiting factor, and the development of smart storage technologies such as Zipstream has helped tremendously in recent years. We will likely see further improvements in smart storage and video compression that will help make higher-resolution video possible. Cybersecurity will also be a growing concern for both manufacturers and end users. Recently, one of Sweden’s largest retailers was shut down for a week because of a hack, and others will meet the same fate if they continue to use poorly secured devices. Any piece of software can contain a bug, but only developers and manufacturers committed to identifying and fixing these potential vulnerabilities can be considered reliable partners. Governments across the globe will likely pass new regulations mandating cybersecurity improvements, with California’s recent IoT protection law serving as an early indicator of what the industry can expect. Finally, ethical behavior will continue to become more important. A growing number of companies have begun foregrounding their ethics policies, issuing guidelines for how they expect technology like facial recognition to be used — not abused.


Leading under pressure

“There is a well-accepted and common wisdom that success breeds confidence, and that confidence helps you handle pressure better,” explained Jensen. “My read, without having talked to Simone Biles or knowing exactly what is going on in her head, is that there is a countervailing force to that positive cycle, which is that as you accrue status and visibility, the ‘importance’ piece gets greatly magnified. The stakes expand. They begin to encompass your self-worth and the weight of the 330 million people you are carrying along for the ride.” Business leaders are subject to this phenomenon, too. As they reach higher levels of the corporate hierarchy, the importance of their decisions and actions grows, and the stakes rise. And like pressure itself, the element of importance is a double-edged sword. ... How do you manage importance during these peak pressure moments? The secret is to understand that how you perceive the stakes in any given situation can be controlled. “When you get into peak pressure moments, all you can think about is how important [the stakes are], what you might gain, what you might lose,” said Jensen.


IT leaders facing backlash from remote workers over cybersecurity measures: HP study

Ian Pratt, global head of security for personal systems at HP, said the fact that workers are actively circumventing security should be a worry for any CISO. "This is how breaches can be born," Pratt said. "If security is too cumbersome and weighs people down, then people will find a way around it. Instead, security should fit as much as possible into existing working patterns and flows with unobtrusive, secure-by-design and user-intuitive technology. Ultimately, we need to make it as easy to work securely as it is to work insecurely, and we can do this by building security into systems from the ground up." IT leaders have had to take certain measures to deal with recalcitrant remote workers, including updating security policies and restricting access to certain websites and applications. But these practices are causing resentment among workers, 37% of whom say the policies are "often too restrictive." The survey of IT leaders found that 90% have received pushback because of security controls, and 67% said they get weekly complaints about it.


OSI Layer 1: The soft underbelly of cybersecurity

The metadata from a switch can indicate whether a rogue device is present. This can be accomplished without mirroring traffic, to respect privacy within sensitive IT environments. Supply chain exposure is more complex than managing where you order from: it's a two-fold problem involving both software and hardware. It's understood that many applications bundle libraries and controls from third parties that are further outside of your purview. Attackers exploit weaknesses and defects across an array of targets, including unsecured source code, outdated network protocols (downgrade attacks), unsecured third-party servers, and update mechanisms. Safeguarding software is under your control: deploying least-privilege principles, endpoint protection, and due diligence to audit and assess third-party partners are essential and reasonable precautions. Hardware is another story altogether. It's less obvious when a fully functioning Raspberry Pi has been modified or telecommunications equipment has been compromised by a state actor, as it looks and plays the part without any irregularities.
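As a rough illustration of the switch-metadata approach, the sketch below compares a MAC-address-table export against an allowlist of approved devices without mirroring any traffic. The file names and CSV layout are assumptions for the example, not a vendor's actual format.

```python
# Minimal sketch of rogue-device detection from switch metadata alone: compare
# a MAC-address-table export (CSV of port,mac) against an allowlist of known
# devices, with no traffic mirroring. File names and columns are illustrative.
from __future__ import annotations
import csv

def load_allowlist(path: str) -> set[str]:
    """Read approved MAC addresses from a CSV with a 'mac' column."""
    with open(path, newline="") as f:
        return {row["mac"].lower() for row in csv.DictReader(f)}

def find_rogue_devices(mac_table_path: str, allowlist: set[str]) -> list[dict]:
    """Return rows from the switch export whose MAC is not on the allowlist."""
    rogue = []
    with open(mac_table_path, newline="") as f:
        for row in csv.DictReader(f):  # expected columns: port, mac
            if row["mac"].lower() not in allowlist:
                rogue.append(row)
    return rogue

if __name__ == "__main__":
    allowlist = load_allowlist("known_devices.csv")
    for entry in find_rogue_devices("switch_mac_table.csv", allowlist):
        print(f"Unrecognized device {entry['mac']} on port {entry['port']}")
```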


Desensitized To Devastation: Strategies For Reaching CISOs In Today’s Cyber Landscape

Hackers only need to be right once. One set of compromised credentials puts them on their way to snatching your critical assets. Security teams, on the other hand, have to be right all the time. There’s no logging off at the end of the 9-to-5 workday for criminals. They’re active when you’re awake, they’re active when you’re asleep and they’re active when you’re celebrating the holidays with your families. All it takes is one right guess of a password and a company could lose millions of dollars, customer data, its reputation and its stock price — and the CISO could lose their job. Businesses can’t afford to have weak security infrastructures that aren’t monitoring for and shutting down threats 24/7. ... Ransomware was up 93% in 2021 from 2020, according to Check Point, and we’ve recently suffered some major cyberattacks. The country has been hit with attacks that have massive implications for daily life and business, like the Colonial Pipeline and Kaseya attacks. And external threats aren’t all we have to worry about. 


Bad News: Innovative REvil Ransomware Operation Is Back

Unfortunately, with its infrastructure coming back online, REvil appears to be back. Notably, all victims listed on its data leak site have had their countdown timers reset, Bleeping Computer reports. Such timers give victims a specified period of time to begin negotiating a ransom payment, before REvil says it reserves the right to dump their stolen data online. REvil is one of a number of ransomware operations that regularly tell victims they have stolen sensitive data, before forcibly encrypting systems and threatening to leak the data if the victims don't pay. But REvil's representatives have been caught lying before, claiming to have stolen data as they extorted victims into paying, only to admit later that they never stole anything. Why might the infrastructure have come back online, including the payments portal, which accepts bitcoin and monero? Numerous experts have suggested REvil was just lying low in the wake of the Biden administration pledging to get tough. Perhaps the main operators and developers opted to relocate to a country from which it might be safer to run their business. Or maybe they were just taking a vacation.



Quote for the day:

"You have two choices, to control your mind or to let your mind control you." -- Paulo Coelho