
Daily Tech Digest - December 20, 2022

Ransomware: It’s coming for your backup servers

Backup and recovery systems are at risk for two types of ransomware attacks: encryption and exfiltration – and most on-premises backup servers are wide open to both. This makes backup systems themselves the primary target of some ransomware groups, and warrants special attention. Hackers understand that backup servers are often under-protected and administered by junior personnel who are less well versed in information security. And it seems no one wants to do anything about it lest they become the new backup expert responsible for the server. This is an age-old problem that can allow backup systems to pass under the radar of sound processes that protect most servers. It should be just the opposite: backup servers should be the most updated and secure systems in the data center. They should be the hardest to log in to as Administrator or root, and they should require jumping through the most hoops to log in to remotely. An important role backup servers play is providing the means to recover from a ransomware attack without paying the ransom. 


How We Improved Application’s Resiliency by Uncovering Our Hidden Issues Using Chaos Testing

The goal of chaos engineering is to educate and inform the organization about unknown vulnerabilities and previously unanticipated outcomes of a computer system. A primary focus of these complex testing procedures is to identify hidden problems that can arise in production environments before an outage occurs outside of the organization's control. Only then can the disaster recovery team address systematic weaknesses and enhance the system's overall fault tolerance and resiliency. Hence, chaos testing is carried out at various levels. ... Chaos testing is a new concept, but we always had the mindset to perform it, and we did perform it sometimes without knowing that it was chaos testing. It has its own principles, benefits and pitfalls. However, I would advise all teams to weigh the pros and cons of conducting these tests before formulating a plan. You should be very clear about what you want to achieve from these disruptive tests. Get permission from your bosses and convince them of why it is important to carry out these tests. 
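The core mechanic of a chaos test is easy to sketch. The Python snippet below is purely illustrative (none of these names come from the article): it injects random faults into a stand-in dependency and measures whether a retry mechanism, the kind of resiliency feature a chaos test is meant to exercise, restores the success rate.

```python
import random

rng = random.Random(7)  # fixed seed so the injected chaos is reproducible

def chaotic(func, failure_rate=0.3):
    """Wrap func so a fraction of calls raise, simulating a flaky dependency."""
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise RuntimeError("injected fault")
        return func(*args, **kwargs)
    return wrapper

def fetch_price():
    """Stand-in for a real downstream call (database, remote service, ...)."""
    return 100

def with_retries(call, attempts=3):
    """The resiliency mechanism the chaos test is meant to validate."""
    for attempt in range(attempts):
        try:
            return call()
        except RuntimeError:
            if attempt == attempts - 1:
                raise

flaky = chaotic(fetch_price)

def success_rate(call, runs=200):
    ok = 0
    for _ in range(runs):
        try:
            call()
            ok += 1
        except RuntimeError:
            pass
    return ok / runs

plain = success_rate(flaky)
retried = success_rate(lambda: with_retries(flaky))
print(f"plain: {plain:.2f}, with retries: {retried:.2f}")
```

If the retry wrapper did not raise the observed success rate well above the raw rate, the test would have uncovered exactly the kind of hidden weakness the article describes.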


How Our Behavioral Bad Habits Are a Community Trait and Security Problem

Internal naming groups and conventions become exposed to the outside world in a variety of ways. They're buried in website code, detailed in technical documentation or as part of APIs, or simply published in public system information. Admittedly, this is a very large haystack, but finding the needles is exactly what the patent I was involved in (US Patent 10,515,219) endeavors to do. Site-scanning tools collect a range of information, and unsurprisingly, an overload of it. My approach strips out all the technical programming information (such as markup, JavaScript, etc.) and leaves just words. It then compares the results with lists of English words. The algorithm then identifies groupings of words or abbreviations not present in the selected language that may signify an internal naming convention or credentials. As is common with brute-force campaigns, it may not, but as the axiom goes, the attacker only needs to be right once, so the ability to generate context-sensitive word lists may make or break your next campaign. This is when the picture may start to become clearer and the shape of things such as user groups, system names, etc., manifests.
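A minimal sketch of that word-filtering idea is shown below. This is illustrative Python only, not the patented algorithm: the tiny dictionary and the sample page are toy stand-ins (a real scan would load a full English word list and crawl real pages).

```python
import re

# Tiny stand-in dictionary; a real scan would load a full English word list.
ENGLISH = {"welcome", "to", "the", "internal", "portal", "staging",
           "server", "please", "sign", "in", "group", "admin"}

def candidate_terms(page_source: str) -> set[str]:
    """Strip scripts and markup, keep bare words, return tokens that are
    absent from the dictionary -- possible internal names or credentials."""
    text = re.sub(r"<script.*?</script>", " ", page_source, flags=re.S)
    text = re.sub(r"<[^>]+>", " ", text)
    tokens = re.findall(r"[A-Za-z][A-Za-z0-9_-]{2,}", text)
    return {t.lower() for t in tokens} - ENGLISH

html = """<html><body>
<script>var x = 1;</script>
<p>Welcome to the internal portal: acmecorp-dmz01 (group: hr_admins)</p>
</body></html>"""
print(sorted(candidate_terms(html)))
```

Everything that survives the dictionary filter (here the invented host name and group name) becomes a candidate for a context-sensitive word list.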


The Agile Compromise Calls for Courage

We cannot eliminate the risk of building the wrong thing with better design and more focus groups. We can only do it with shorter and more frequent iterations of a working product that is incomplete and maybe not that great yet. That’s the Agile compromise. Shorter cycles decrease the risk of building the wrong thing but increase the risk of degrading the process. Accruing technical debt is one such risk: not a necessary consequence, just a standard price to pay for quicker deliveries. No pundit with stories about the constant commitment to quality will convince me otherwise. If you want greater speed, accept more risks. The Agile compromise towards risk-taking also recognizes that software, as a creative discipline, by nature exposes black swans: the risks we didn’t know we would ever run into. No engineering approach provides full reassurance against them, nor can testing and validation ever give you full peace of mind. It’s a little bit scary, but if you pride yourself on an Agile mindset, you must embrace it. Software is complex rather than complicated. Its many moving parts behave in unpredictable ways when unleashed on the world. Risks are a natural part of that property.


GPT: High-tech parlor trick or the first real AI for everyday use?

In many cases, such as ChatGPT, AI is still a parlor trick that will enthrall us until the next trick comes along. In some cases, it’s a useful technology that can augment both human and machine activities through incredibly fast analysis of huge volumes of data to propose a known reaction. You can see the promise of that in the GPT-fueled Copysmith.AI even as you experience the Potemkin village reality of today. At a basic level, AI is pattern matching and correlation done at incredible speeds that allow for fast reactions — faster than what people can do in some cases, like detecting cyberattacks and improving many enterprise activities. The underlying algorithms and the training models that form the engines of AI try to impose some sense onto the information and derived patterns, as well as the consequent reactions. AI is not simply about knowledge or information, though the more information it can successfully correlate and assess, the better AI can function. AI is also not intelligent like humans, cats, dogs, octopi, and so many other creatures in our world.


How you can stop corporate login credential theft

Organizations should take a layered approach to credential management. The goal is to reduce the number of sites users have to put passwords into. Organizations should endeavor to implement single sign-on (SSO) for all reputable necessary work applications and websites. All SaaS providers should support SSO. If there are logins that require different credentials, a password manager would be helpful in the interim. This also provides a way for employees to know if a login page can be trusted, as the password manager won’t offer credentials up for a site it does not recognize. Organizations should also enable multi-factor authentication (MFA) to secure logins. FIDO2 is also gaining adoption. It will provide a more robust solution than traditional authenticator apps, although those apps are still better than codes sent via text messages. Not all of this is foolproof, and risky login pages could slip through the net. A last resort is needed for flagging risky login pages to employees. This can be done by analyzing, in real time, threat intelligence metrics, webpage similarities, domain age and how users got to a login page. 
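One way to picture that last-resort analysis is as a simple scoring heuristic. The sketch below is purely illustrative Python: the trusted domains, weights and thresholds are invented for the example, and a production system would draw on live threat intelligence rather than a hard-coded list.

```python
from difflib import SequenceMatcher

# Hypothetical allow-list; a real system would use threat-intel feeds.
TRUSTED = {"okta.com", "microsoftonline.com", "google.com"}

def risk_score(domain: str, domain_age_days: int) -> int:
    """Toy heuristic: very young domains and near-miss lookalikes of
    trusted login domains both raise suspicion."""
    score = 0
    if domain_age_days < 30:          # freshly registered domain
        score += 2
    for known in TRUSTED:
        ratio = SequenceMatcher(None, domain, known).ratio()
        if 0.8 <= ratio < 1.0:        # close to a trusted name, not identical
            score += 3
    return score

print(risk_score("okta.com", 4000))   # established, exact match: 0
print(risk_score("0kta.com", 5))      # young lookalike scores high
```

Scores above some threshold would trigger the warning shown to employees; domain age and lookalike similarity are only two of the signals the article lists.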


Security Risks, Serious Vulnerabilities Rampant Among XIoT Devices in the Workplace

The potential intent of assorted Chinese hardware manufacturers (such as Huawei and ZTE) led to a 2018 ban on use of their equipment by federal agencies. These devices remain widely in use in private organizations, however, and sometimes banned devices slip through dragnets via the process of “white labeling”. Organizations also often do not have visibility into the code that XIoT devices run on. When these devices draw on third-party firmware libraries, several possible security risks emerge. One is simply that the vendor will abandon support for the device, no longer issuing security patches to address emerging vulnerabilities. Another is that the code may be maintained by open source developers, who have the capability to insert malicious elements or even abandon or spike the project unexpectedly. A simple problem that has dogged XIoT devices from the very beginning also remains; the manufacturers are often not tech outfits and thus are not familiar with security by design elements, and/or do not have the budget in place to add them and still come in at their desired price points in competitive markets.


Understanding e-signatures: the key differences and requirements

A QES is considered to have more probative value than an AES, which means that courts will give it more weight as evidence. The first key difference is that a QES offers a higher level of security than an AES. This is because qualified signatures are created using a qualified signature creation device (QSCD), which stores the signing key. Examples of physical QSCDs include smart cards, SIM cards or USB tokens. It’s also possible for signatories to create a QES without having a physical device in their hands. In this instance, signatories remotely access a signing key, which is stored in a trusted service provider’s data centre. This is often the preferred choice for organisations since it streamlines device management. A QES must also be based on a ‘qualified certificate for electronic signatures’, which is another key difference between an AES and a QES. Only ‘Qualified trust service providers’ (QTSPs) listed on the European Union’s trusted provider database can issue this certificate. To become a QTSP, organisations must successfully complete a series of evaluations and audits that ensure compliance with eIDAS regulations.
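To illustrate what the signing key on a QSCD actually does, here is a hash-then-sign sketch using textbook RSA with toy parameters. It is purely illustrative: real qualified signatures use certified algorithms, full-size keys and qualified certificates, and the private key never leaves the device.

```python
import hashlib

# Textbook RSA with toy parameters -- for illustration only.
p, q = 10007, 10009          # small known primes; real keys are far larger
n = p * q
e = 65537                    # public verification exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private signing key (lives on the QSCD)

def sign(message: bytes) -> int:
    """Hash the document, then apply the private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone holding the public key can check the signature."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

doc = b"I agree to the terms."
sig = sign(doc)
print(verify(doc, sig))             # signature matches the signed document
print(verify(b"I decline.", sig))   # any tampering breaks verification
```

The qualified certificate's role, not shown here, is to bind the public half of such a key pair to a verified identity, which is what gives a QES its probative value.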


Top cloud strategy mistakes CIOs can’t help making

“Not architecting for the cloud,” says IDC analyst Dave McCarthy, when asked where CIOs commonly go wrong when building their cloud strategy. “While it is possible to ‘lift and shift’ existing workloads, enterprises often experience less than desirable costs and performance with this approach. You need to adapt applications to cloud-native concepts to realize the full value.” CIOs also often make the mistake of “not implementing enough automation,” says McCarthy, who is research vice president of cloud and edge infrastructure services for IDC. “Best practices in cloud include automating everything from the deployment of infrastructure and applications to management and security. Most outages or security breaches are the result of manual misconfigurations.” But perhaps the worst sin CIOs can commit, analysts across the spectrum agree, is failing to plan for the shift in culture and skills required to devise and implement a successful cloud strategy. The cloud functions differently than traditional IT systems, and a cloud strategy requires not only new skills but also a change in thinking about how to design and manage the environment, McCarthy says.


What is a business architect and how do you become one?

An enterprise architecture comprises different kinds of components (business strategy and outcomes, technology platforms and infrastructure, and security), and a business architecture encompasses how all these things come together to best serve the business. It is a component of enterprise architecture. My responsibility as a business architect is to manage the business architecture practice and its governance. I primarily focus on establishing standards and best practices for our team's deliverables and developing relationships within our organization. I also collect information on our business and map domains (including capabilities, value streams, information, and organization) according to the business architecture framework to gain insights. I think business architecture is foundational to organizations today. A strategy is a plan of action to achieve a goal. My team receives business ideas and potential projects that align with our organization's strategies and influence our performance as a leader in our market. 



Quote for the day:

"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer

Daily Tech Digest - July 19, 2021

IoT security: Development and defense

While IoT adoption continues to grow, the standards, compliance requirements and secure coding practices surrounding IoT have not advanced at the same rate. Recent high-profile software supply chain attacks have brought the issue of secure coding into sharp focus, prompting the Biden administration to issue an executive order addressing new requirements for federal agencies to only purchase and deploy secure software. This pivotal shift will have an immediate impact on global software development processes and lifecycles, especially when you consider the vast reach of U.S. federal procurement. Virtually all device manufacturers and software companies will be impacted directly as the administration begins to increase obligations on the private sector and establish new security standards across the industry. Specific to IoT, the order directs the federal government to initiate pilot programs to educate the public about the security capabilities of IoT devices, and to identify IoT cybersecurity criteria and secure software development practices for a consumer-labeling program.


Efficient unit-testing with a containerised database

The real problem is mixing two languages in one body of code. The dbUtil handle is just a boilerplate reduction device here. The raw SQL is still there. We still can’t test the complex individual statements separate from the simple yet crucial control logic captured in the if-statements, which depend solely on the state of the person object, not on the database. Sure, we can test this control logic fine if we mock out the calls to the database. The mock for dbUtil returns a prepared list of person objects, and we can verify the correct invocation of it for the two different conditions. That unavoidably leaves the SQL untested. If we want to test the execution of these statements, we need to run the entire code inside the for loop, this time using a real database. That test needs to set up the conditions for all three execution paths (condition 1, 1 and 2, or none), as well as verify what happened to the state after executing the void statements. It can be done, but we are of necessity testing both the Java and SQL realms here. That’s hardly the lean unit testing we’re looking for.
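The article's example is Java with SQL, but the dilemma reproduces in any language. The Python sketch below is a hypothetical analogue (the function, table and column names are invented): control logic and raw SQL live in one function, so testing it means exercising both realms against a real database, with in-memory SQLite standing in for the containerised database.

```python
import sqlite3

# Hypothetical analogue of the mixed Java/SQL code described above:
# an if-statement (control logic) wrapped around raw SQL statements.
def deactivate_if_expired(conn, person_id, today):
    row = conn.execute(
        "SELECT expires FROM person WHERE id = ?", (person_id,)
    ).fetchone()
    if row is not None and row[0] < today:   # control logic under test
        conn.execute("UPDATE person SET active = 0 WHERE id = ?",
                     (person_id,))

# A lightweight "real" database stands in for the containerised one.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE person (id INTEGER PRIMARY KEY, expires TEXT, active INTEGER)"
)
conn.executemany("INSERT INTO person VALUES (?, ?, ?)",
                 [(1, "2020-01-01", 1), (2, "2030-01-01", 1)])

deactivate_if_expired(conn, 1, "2022-12-20")
deactivate_if_expired(conn, 2, "2022-12-20")
active = dict(conn.execute("SELECT id, active FROM person"))
print(active)   # the expired person is deactivated, the other untouched
```

Mocking the `execute` calls would test only the if-statement; running against SQLite, as here, tests the SQL too, but at the cost of the setup boilerplate the article complains about.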


Ansible vs Docker: A Detailed Comparison Of DevOps Tools

Ansible is an open-source automation engine that helps in DevOps and comes to the rescue to improve your technological environment’s scalability, consistency, and reliability. It is mainly used for rigorous IT tasks such as configuration management, application deployment, intraservice orchestration, and provisioning. In recent times, Ansible has become the top choice for software automation in many organizations. Automation is one of the most crucial aspects of industries these days. Unfortunately, many IT environments are too complex and often need to scale too quickly for system administrators and developers to keep up manually. ... Docker is an open-source platform for developing, shipping, and running applications. It enables developers to package applications into containers, a set of standardized and executable components that combine the application source code with the operating system libraries and dependencies required to run that code in an executable environment. Containers can even be created without Docker, but the platform and user interface make it easier, simpler, and safer to build, deploy and manage containers. 


Delegation and Scale: How Remote Work Affected Various Industries

The basic goal of delegation of authority is to enable efficient organization. Just as no single individual in a company can do all of the tasks required to achieve a group's goals, it becomes arduous for the management to wield all decision-making authority as a business expands. This is because there is a limit to the number of people a manager can successfully monitor and make decisions for. When this threshold is reached, authority must be handed to subordinates. While centralization was still a possibility before the pandemic, this was no longer the case after back-to-back lockdowns and economic slowdowns. In such a situation, delegation came as a boon that not only kept the workflow active but also helped scale growth. ... Delegating gives your team greater confidence, makes them feel important, and allows them to demonstrate their abilities. This will result in mutual appreciation, with colleagues motivating one another to work harder and staying devoted to attaining the goals. 


Seeking a Competitive Edge vs. Chasing Savings in the Cloud

If companies do not make changes to their IT operations in response to a migration, finding savings can be more difficult, L’Horset says. “In the industry, there’s a lot of debate: Is cloud saving you money or not? Our research indicates that even at the basic level, yes it does,” he says. “The difference between the cost-savings, which you can get through cloud, and the value of innovation that you absolutely can and should get through cloud, is the fundamental reason you should go.” Roy Illsley, chief analyst with Omdia, the research arm of Informa Tech, says the cost benefits of cloud can be positive if the workload is variable in its resource requirements, its resource requirements match the cloud provider’s packaging of resources, or it requires high availability. "If the workload is stable in its resource requirements then on-premises is more cost effective," he says. Companies responding to the Accenture survey that did not list cloud as a top priority still saw significant cost-savings, says Jim Wilson, managing director of information technology and business research at Accenture Research. 


7 Ways AI and ML Are Helping and Hurting Cybersecurity

AI/ML is used in network traffic analysis, intrusion detection systems, intrusion prevention systems, secure access service edge, user and entity behavior analytics, and most technology domains described in Gartner's Impact Radar for Security. In fact, it's hard to imagine a modern security tool without some kind of AI/ML magic in it. ... Through social engineering and other techniques, ML is used for better victim profiling, and cybercriminals leverage this information to accelerate attacks. For example, in 2018, WordPress websites experienced massive ML-based botnet infections that granted hackers access to users' personal information; ... Ransomware is experiencing an unfortunate renaissance. Examples of criminal success stories are numerous; one of the nastiest incidents led to Colonial Pipeline's six-day shutdown and $4.4 million ransom payment; ... ML algorithms can create fake messages that look like real ones and aim to steal user credentials. In a Black Hat presentation, John Seymour and Philip Tully detailed how an ML algorithm produced viral tweets with fake phishing links that were four times more effective than a human-created phishing message.


Electronic signatures: please sign on the digital line

First, let’s look at the importance of content to a business. In simple terms, content is the inherent value of a company. It’s NASA’s designs for their new space station, AstraZeneca’s highly regulated pharmaceutical patents, and Oxfam’s humanitarian aid records. It’s the clinical trial results for the next breakthrough vaccine, or the blueprint for the innovative new approach to flooding solutions. Content is the entire work of an organisation and is completely unique for every company. Content is the database of its most valuable insights. But to effectively realise this value, organisations need to find a single place for their content. Separating content between different silos and applications creates friction, which can stand in the way of employees accessing and sharing information, inhibiting innovation and productivity. Applications in today’s content-driven world are often judged by their ease of integration with other technologies. As a result, businesses are turning to single platforms where content can be securely stored and managed, while all compliance requirements are met and all teams have the opportunity to collaborate on the content, both internally and externally.


Protect your smartphone from radio-based attacks

An IMSI catcher is equipment designed to mimic a real cell tower so that a targeted smartphone will connect to it instead of the real cell network. Various techniques may be employed to do this, such as masquerading as a neighboring cell tower or jamming the competing 5G/4G/3G frequencies with white noise. After capturing the targeted smartphone’s IMSI (the ID number linked to its SIM card), the IMSI catcher situates itself between the phone and its cellular network. From there, the IMSI catcher can be used to track the user’s location, extract certain types of data from the phone, and in some cases even deliver spyware to the device. Unfortunately, there’s no surefire way for the average smartphone user to know that they’re connected to a fake cell tower, though there may be some clues: perhaps a noticeably slower connection or a change of band in the phone’s status bar (from LTE to 2G, for example). Thankfully, 5G in standalone mode promises to make IMSI catchers obsolete, since the Subscription Permanent Identifier (SUPI) – 5G’s IMSI equivalent – is never disclosed in the handshake between smartphone and cell tower. 


The value of data — a new structural challenge for data scientists

Some companies with data scientists in place have difficulty operationalising their skills. Looking at the volumes of data processed by organisations and their varied structures and architectures, it is not always imperative to have a data scientist among one's data experts. For companies managing an astronomical amount of data, across multiple channels and with a complex structure, the expertise of a data scientist will prove beneficial in modelling data, querying it and making predictions. One of the first questions to ask is therefore related to data and business needs, and to organising the structure according to the organisation's setup and its data strategy. Companies have also realised that having a data scientist was not the answer to their data value problems. This is partly due to a lack of understanding of the environment surrounding data. A data scientist may understand the data, but not its purposes, environments or business applications. Let's take the example of a marketing department working on implementing AI to accelerate its web ROI. 


Interview With Prof B Ravindran, Head, Robert Bosch Centre For Data Science & AI

Interpretability of deep learning models is essential for the widespread adoption of these techniques in the medical image diagnosis community. Deep learning models have been phenomenally successful at beating the state of the art in common medical image diagnosis tasks like segmentation and screening applications, e.g. classification of diabetic retinopathy and chest X-ray scans, among others. While these successes have created huge interest in adopting these techniques in clinical practice, a huge barrier to adoption is the lack of interpretability of these models. Convolutional Neural Networks with hundreds of layers are the workhorse for medical image diagnosis. While the initial layers are typically edge detectors and shape detectors, it is nearly impossible to explain or interpret the feature maps as one goes deeper into the network. In order for clinicians to trust the output from these networks, it is essential that a mechanism for explaining the output be present. In addition, black-box techniques will make it hard for clinicians to justify the diagnosis and follow-up procedures.
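One simple family of explanation mechanisms is occlusion sensitivity: mask regions of the input and observe how much the model's output drops. The Python sketch below is a toy illustration only; the "model" is an invented stand-in, not a real CNN, but the same procedure applies to any image classifier.

```python
# Occlusion sensitivity: occlude each patch of the input and record how far
# the model's score falls. Large drops mark regions the model relies on.

def toy_model(image):
    """Stand-in classifier: responds only to the top-left 2x2 patch."""
    return sum(image[r][c] for r in range(2) for c in range(2))

def occlusion_map(model, image, patch=2):
    base = model(image)
    h, w = len(image), len(image[0])
    heat = [[0.0] * (w // patch) for _ in range(h // patch)]
    for pr in range(h // patch):
        for pc in range(w // patch):
            occluded = [row[:] for row in image]   # copy, then zero one patch
            for r in range(pr * patch, (pr + 1) * patch):
                for c in range(pc * patch, (pc + 1) * patch):
                    occluded[r][c] = 0.0
            heat[pr][pc] = base - model(occluded)  # score drop = importance
    return heat

image = [[1.0] * 4 for _ in range(4)]
print(occlusion_map(toy_model, image))   # only the top-left patch matters
```

For a clinician, the resulting heat map offers a rough answer to "which part of the scan drove this diagnosis", which is exactly the kind of justification the article says black-box models lack.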



Quote for the day:

"Honor bespeaks worth. Confidence begets trust. Service brings satisfaction. Cooperation proves the quality of leadership." -- James Cash Penney