Daily Tech Digest - January 22, 2020

This new startup aims to make developers love security

Two trends are driving this: the accelerating speed of development teams and the failings of traditional perimeter and agent-based security solutions in cloud-native environments. On the first trend, Vadlamani writes that infrastructure-as-code allows engineers to specify their infrastructure composition in a declarative language, letting them use the same versioning and release management workflows as for their source code. It greatly simplifies the work associated with deployment, testing and rollbacks. It allows them to be truly agile, spinning up new services in rapid succession to respond to changing business needs, and massively reduces the "busy work" associated with setting up the right environment and providing the runtime for their software. While that's great for development, it potentially creates new security issues that traditional security solutions are a poor fit to solve. Even the best security teams may struggle with threat detection and incident response in this cloud-native world. Perimeter defenses don't really work in this environment, and deploying agents across these ephemeral workloads is difficult, forcing security teams to manually manage changing policies, certify deployments, and respond to alerts.
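A declarative infrastructure definition of the kind described above might look like the following Terraform-style sketch. The resource, image ID, and values are purely illustrative, not taken from the article:

```hcl
# Illustrative infrastructure-as-code sketch (hypothetical service and values).
# The whole environment is declared in text, so it can be versioned, reviewed,
# and rolled back with the same workflows as application source code.
resource "aws_instance" "api_server" {
  ami           = "ami-0abcdef1234567890" # hypothetical machine image ID
  instance_type = "t3.micro"
  count         = 3 # scale services up or down by editing one number

  tags = {
    Service = "orders-api"
    Env     = "staging"
  }
}
```

Because the desired state lives in version control, a rollback is simply a revert and re-apply, which is what makes rapid, repeatable deployments (and the security challenges that come with them) possible.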


Microsoft discovers new sLoad 2.0 (Starslord) malware  

According to a Microsoft report from December 2019, sLoad had become one of the few malware strains that ported its entire host-server communication system to the Windows BITS service. For those unfamiliar with the term, Windows BITS is the default system through which Microsoft sends Windows updates to users all over the world. The BITS service works by detecting when the user is not using their network connection and utilizing this downtime to download Windows updates. But the BITS service is not entirely reserved for the Windows Update process. Other apps can tap into BITS and use it to schedule tasks and network operations to take place when the computer's network connection goes idle. The sLoad authors appear to be some of the biggest fans of this service. Microsoft says that the malware's entire network stack was configured to work via the Windows BITS service of an infected host. The malware would set up BITS scheduled tasks that would execute at regular intervals. These tasks would be used to talk with its C&C server, download secondary malware payloads, and even send data from an infected host back to the C&C server.


AI Will Give Rise To FinTech 2.0 And Longevity Banks

In the next few years, age-friendly FinTech companies and Longevity Banks will develop new financial products designed for clients who are planning to live extra long lives and want to remain high functioning and financially stable throughout. Clients of Longevity Banks will have more time to accumulate wealth, will have a longer investment horizon, and will benefit from compounding. Financial services innovators have an opportunity to enhance the financial lives of a billion people by designing new solutions and adapting existing products and services. ... The Longevity AI Consortium at King’s College London is developing sophisticated methods for translating advanced AI for Longevity solutions including novel applications of life data for insurance companies, pension funds, healthcare companies, and government bodies. This year the Consortium is planning to expand to Switzerland, Israel, Singapore, and the US. Progressive investment banks, pension funds, and insurance companies are developing new business models, and are using AI to improve the quality of the analytics used to formulate them.


How Google’s influence on AI is becoming key to business success

Google is the obvious version of that, especially as they’re so far ahead, but some of the good practices we can adopt to make things better for Google (such as consistency and clean, structured data) will be useful throughout business transformations. The past decade alone has seen huge leaps in Google’s capabilities and focus on AI, with the arrival of Google Assistant in 2016 and the Neural Matching algorithm introduced in 2018 to deliver more diverse search results by analysing language on a deeper level than previous algorithms. For the first time, Google could match words to concepts and figure out what a user wanted from a looser search. Google’s commitment only looks set to grow, with the company’s co-founder, Larry Page, taking a particularly close interest in AI, revealing that: “Google will fulfil its mission only when its search engine is AI-complete.” Google is firmly committed to AI: Larry Page is committed, and the investment is viable in both the short term, through the ability to out-analyse the competition, and the long term, where the consequences of being the first to create anything resembling true AI are vast.


A brave new workless world


Daniel Susskind, an Oxford professor and former government advisor, believes that work “is so entrenched in our psyches that there is often an instinctive resistance to contemplating a world with less of it, and an inability to articulate anything substantial when we actually do.” His argument in A World without Work, a useful and farsighted book on the subject, is threefold: that within our lifetime, automation will result in insufficient work to go around; that this structural technological unemployment, if ignored, would make our already unfair world vastly more unequal; and that to prevent this outcome, governments’ approach to labor policy needs to be entirely rethought. Of these three strands, Susskind’s first is his most convincing. To be sure, he acknowledges, workers have regularly panicked unnecessarily about being replaced by machines. But this time, he argues, the threat is real. His best evidence of the frightening pace at which AI is developing comes through attempts to build robots to play chess and Go. For years, scientists followed an approach of trying to copy human thought and behavior.


The Human Screenome Project will capture everything we do on our phones

One of the biggest obstacles to the project’s success is likely to be that it raises fears around privacy. Having an app quietly record your activity every five seconds is a hard sell. If the past few years have shown anything, it’s that even the most inane activities online are tracked. That information is sold to advertisers at best, or to hackers and disinformation campaigns at worst. The Cambridge Analytica scandal highlighted how personality tests shared between acquaintances on Facebook were weaponized in the 2016 American election, for example. And consider what passes across our phone screens every day: bank account information; emails carrying personal data; car-sharing routes detailing addresses of destinations; meal delivery orders; texts with our loved ones; photos and videos of children; even pornography, cryptocurrency exchanges, and illicit activity. “It’s a lot of sensitive information,” Reeves concedes. His team has amassed around 30 million screenshots from volunteers in the US as well as China and Myanmar.


Modern Android App Architecture with JetPack and Dropbox Store


Dropbox recently took ownership of the open-source Store library to revamp it and bring it closer to the current Android developer ecosystem. Originally developed at the New York Times, Store has been rewritten in Kotlin on the foundations provided by Coroutines and Flow. Along with Google's JetPack collection of libraries, Dropbox Store provides a solution to create modern Android apps. When Google introduced JetPack, it set an ambitious goal for it: accelerating development of high-quality apps for the Android platform. Two key ideas drove the design of JetPack towards that goal. On the one hand, JetPack aims to leverage advanced Kotlin features to reduce the boilerplate code programmers would otherwise have to write. On the other hand, it also provides higher-level abstractions on top of those found in the Android SDK, such as Fragments and Activities, to let developers express complex tasks in a simpler way. JetPack includes a number of components that can be used independently of one another and cover four main concern areas: Foundation, Architecture, Behavior, and User Interface.


How remote work rose by 400% in the past decade


The report found that the rise in remote work's popularity is thanks to the evolution of supporting technologies, including powerful mobile devices, ultra-fast internet connections, and the proliferation of cloud-based storage and SaaS solutions. "The rise of cloud-based SaaS software has been instrumental to the growth of remote work," de Lataillade said. "Employees can now instantly connect and collaborate with colleagues around the world at any time." Employees definitely took advantage: The majority (78%) of employees said they work remotely some of the time; more than half (58%) said they work remotely at least once a month; and, 36% of respondents said they work remotely at least once a week, the report found. While 36% might not seem like a huge percentage, it's a significant jump from 10 years ago. In 2010, the US Census Bureau found that only 9.5% of employees worked remotely at least once a week, indicating that the number of people working remotely on a weekly basis has grown by nearly 400% in the last decade, according to the report.


Microsoft and Google just can't agree on proposed ban on facial recognition


Speaking at a conference in Brussels on Monday, Pichai said it was important for governments to tackle regulatory questions over facial recognition and, more broadly, AI "sooner rather than later", and that the ban can be "immediate but maybe there's a waiting period before we really think about how it's being used".  ... "Accountability is an important part of our AI principles. We want our systems to be accountable and explainable and we test it for safety," Pichai told the think tank Bruegel, which organized the conference. "I think inevitably doing that we assume it will involve human agency and humans to review it, and we specifically mention we want these systems to be accountable to society at large. And I think regulation should play a role in that as well." The European Commission acknowledges in its proposal that a temporary ban on facial recognition would "be a far-reaching measure that might hamper the development and uptake of this technology", therefore it would prefer to use existing regulatory instruments available under GDPR.


The Role of Developers in Digital Transformation

Firstly, efficient code delivered in a timely manner should be the goal of every developer. Testing and QA will, of course, reveal issues, and no one is perfect, but developers must be mindful of the impact inefficient or inaccurate code can have on a business. Workarounds may get you past a problem, but this is not best practice and leaves the business exposed should a client pick up on it, especially if the workaround causes more serious issues further down the line. Fewer bugs or defects means scrum meetings can naturally focus more on strategy and new initiatives that might help grow the business. Organizations should therefore establish best practices and coding guidelines to reduce the temptation for workarounds, ensuring code releases are reliable and ready for the production environment. Developers must also understand the need to work with the designs given to them by design teams, as these will align with customer expectations. Being able to dynamically incorporate client feedback into the development process is also key, particularly for those working with continuous delivery and continuous integration pipelines.



Quote for the day:


"Everyone carries a bucket of water and a bucket of gas in life. A leader has learned to throw the right one at the right time." - Orrin Woodward


Daily Tech Digest - January 21, 2020

How low-code helps CIOs accelerate digital transformation

As digital transformation has become the main agenda, CIOs are using technology strategically and leveraging digital opportunities. With an estimated 40% of technology spending in 2019 (more than $2 trillion) assigned to digital transformation initiatives, adoption of emerging technology has become the biggest objective for enterprises. The app economy plays a crucial role in driving digital transformation and business innovation. CIOs have to consider the people, platforms, and processes that will cater to the increasing demand for modern applications. The increasing demand for enterprise applications has led to the increasing adoption of low-code platforms in the Application Development & Delivery (AD&D) market. Enterprises are working towards leveraging agile practices and incorporating development techniques to create a minimum viable product (MVP). CIOs and IT leaders have to determine which practices, technologies, and skills are required to achieve modernisation.



.NET Core: Writing Really Obvious Code with Enumerated Values in gRPC Web Services


gRPC services support using enumerated values (enums) when creating the .proto file that drives your gRPC service and the clients that access it (for more on how that works, see the column I wrote on creating gRPC services and clients). Since the definitions of the messages that you send to and receive from a gRPC service are converted into C# classes, defining enums in your .proto file gives you the same ROCing benefits that defining enums in your code does. ... If you prefer Pascal-cased names in your code then you'll need to deploy underscores strategically. To get CreditLimit as the name of your enumerated value, you'll need to name the field using an underscore before "limit" in your .proto file (e.g. Credit_Limit, CREDIT_LIMIT, or credit_limit would do the trick). One last note on the default value: A client can't tell the difference between a property that's been set to the default value for your enum and a property that hasn't been set at all. A best practice, therefore, is to make the default value for your enum (the one in position 0) a "no value available" option and never use it. That way a client can tell when the property hasn't been set.
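Both pieces of advice above can be sketched in a small .proto fragment. The message and enum names here are illustrative, not from the column:

```protobuf
syntax = "proto3";

// Illustrative enum; names are hypothetical.
enum LimitType {
  // Position 0 is the wire default in proto3. Reserving it as an
  // "unspecified" value lets a client tell "never set" apart from a
  // deliberately chosen value.
  LIMIT_TYPE_UNSPECIFIED = 0;
  // Underscore-separated names are Pascal-cased in the generated C#
  // code: CREDIT_LIMIT (or Credit_Limit, or credit_limit) => CreditLimit.
  CREDIT_LIMIT = 1;
  DAILY_LIMIT = 2;
}

message Customer {
  int32 id = 1;
  LimitType limit_type = 2;
}
```

If a received Customer has limit_type equal to LIMIT_TYPE_UNSPECIFIED, the client knows the sender never set the property, because the service's code never assigns that value deliberately.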


Solving the Big Data, Small Returns Problem in 2020


All technicians are humans, but not all humans are technicians. If we are going to build a new world, we should build it from the ground up and from a human-centric angle. We need to flip the model of thinking from data and tech-first to use case first. To make this possible, we finally need to get around to answering three basic, yet wildly complicated questions: What data do we have? Where is it? And how do we get value from it? We’ve learned that having more data doesn’t equate to having better insights. So we need to collect data specific to our questions. We still have to work with legacy architectures and infrastructures that have been cobbled together over time, and data in various forms from different sources that were never designed to work together in harmony. So we need to be meticulous about where we keep our data and how we organize it, so that it’s visible and accessible. And as far as getting value from data, we need to put the human element back into analytics. The analytics will only be as good as the person that asks challenging questions. That person should not have to have a technical background to do so.


Why The Digital Economy Is Set For A Correction

There’s certainly an expectation that one of the consequences of Digital is that things just get cheaper and consequently, we can consume more of them. But how do these things become cheaper and what are the consequences of falling real prices? Two things, above all else, have contributed to the decline in real prices for the digital-meets-physical category: taxation and algorithmically driven labour efficiency. Tax minimisation is facilitated by an international tax regime dating back to the 1920s, when it was reasonable to tax corporations based on physical presence. This doesn’t really work in a world of transfer pricing, where rents can be extracted from subsidiaries in high-tax locations for the use of corporate intangible assets such as brands, patents, and software, thus minimising profits. Taxation will, eventually, get sorted. France and, surprisingly, the UK seem to be leading the way on this. The decline of unit labour inputs is another matter. If we think about the design of, for example, a work management system used in a warehouse, its major purpose is to avoid employee downtime.


The Move to Multiple Public Clouds Creates Security Silos


Often when organizations migrate from on-premises to public cloud environments, security teams want to continue to use the same approach for protecting applications and data. But use of a public cloud, especially multiple public clouds, introduces new attack vectors that require better visibility into what is happening across the entire ecosystem. Security tools offered by public cloud vendors are often a popular choice to fill the gap following migration. The majority of respondents who said that their organizations used public cloud environments indicated that they selected native security tools, or a combination of native tools with third-party solutions, to secure their public cloud. One possible reason organizations adopt a heterogeneous approach to securing public clouds is that public cloud vendors are not cybersecurity experts and typically provide best-of-breed security tools rather than a 360-degree holistic security solution.


Can an AI be an inventor? Not yet.

For Abbott, the fact that we are not at the point where machines are routinely inventors is part of the point: society, he argues, needs to figure this out early. He acknowledges that AI doesn’t just spring into existence—it must be coded and trained and fed data—but that doesn’t necessarily mean everything an AI creates can or should be traced back to humans. Hundreds or thousands of people might be involved in programming IBM’s supercomputer Watson with general problem-solving capabilities, but “if Watson then applies those capabilities and solves a particular problem in a way that results in a patent, it’s not clear that anything any of those people have done qualifies them to be an inventor,” Abbott says. But if humans can’t be listed as inventors because they weren’t intimately involved, and the AI can’t be listed as an inventor either, then the invention may not be patentable at all. This, Abbott suggests, could be problematic. It could prevent companies from investing money in AI technologies and prevent breakthroughs in important areas like drug discovery.


Why employees can pose the biggest cloud migration challenge
While IT departments can guarantee corporate technology is working as it should, they can’t always control the people using it or what devices they may wish to use. So, steps need to be taken to ensure that whatever the device used by employees, they do not become easy pickings for the cybercriminals who pose a threat to the corporate network. The first step is to educate the workforce on those threats. With people being asked for multiple passwords when accessing online accounts these days, it’s common for employees to choose something that’s easy to remember. But easy to remember also means easy to guess. It’s common to hear of hackers successfully cracking passwords by using personal information they have siphoned from social media – whether that’s your favourite football team or the names of your children. It’s advisable for IT departments to work with HR to alert employees to the dangers of weak passwords – along with other cyber-attack techniques, such as phishing on email.
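The guessing technique described above is trivial to automate. A minimal Python sketch (the rules and personal facts below are illustrative, not from the article) shows why a favourite football team or a child's name makes a weak password:

```python
# Illustrative sketch: derive candidate passwords from publicly known
# personal details, the way an attacker scraping social media might.
def candidate_guesses(facts):
    """Build obvious password guesses from known facts about a person."""
    guesses = set()
    for fact in facts:
        base = fact.lower().replace(" ", "")
        guesses.add(base)
        guesses.add(base.capitalize())
        # Common "hardening" suffixes that add almost no real strength.
        for suffix in ("1", "123", "2020", "!"):
            guesses.add(base + suffix)
    return guesses

def is_easily_guessed(password, facts):
    """True if the password appears in the attacker's derived wordlist."""
    return password.lower() in {g.lower() for g in candidate_guesses(facts)}

# Hypothetical scraped facts: football team, child's name, pet's name.
facts = ["Arsenal", "Sophie", "Rex"]
print(is_easily_guessed("arsenal123", facts))                   # True
print(is_easily_guessed("correct horse battery staple", facts)) # False
```

A real attacker's wordlist would be far larger, which is exactly why passwords built from personal information fail and why awareness training is worth the IT and HR effort.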


Google CEO Sundar Pichai: This is why AI must be regulated

Microsoft's recent calls for government regulation have focused on the use of facial-recognition technology in public spaces, arguing that if left unchecked it will increase the risk of biased decisions and outcomes for groups of people already discriminated against. The timing of Pichai's post is unlikely to be a coincidence. Euractiv reporters last week published a leaked European Commission proposal touting a three- to five-year ban on facial-recognition technology by public and private-sector organizations in public spaces until regulators can develop solid methods for assessing the risks of the technology and risk-management approaches. "This would safeguard the rights of individuals, in particular against any possible abuse of the technology. It would be necessary to foresee some exceptions, notably for activities in the context of research and development and for security purposes (subject to a decision issued by a relevant court)," the Commission wrote.



"The short answer is that Rust solves pain points present in many other languages, providing a solid step forward with a limited number of downsides," explains Jake Goulding on Stack Overflow's blog. Goulding is the co-founder of Rust consultancy Integer 32, so he has a vested interest in Rust's success, but he's also not alone in taking a shine to the young language. Microsoft is experimenting with Rust to reduce memory-related bugs in Windows components. Every single bug costs Microsoft on average $150,000 to patch and in 2018 there were 468 memory issues it needed to resolve. Over the past decade, more than 70% of the security patches it has shipped addressed memory-related bugs. Rust concepts are also being used in Microsoft's recently open-sourced Project Verona, an experimental language for safe infrastructure programming that could help Microsoft securely retain legacy C and C# code.  Mozilla Research describes Rust as a "systems programming language that focuses on speed, memory safety, and parallelism". It's often seen as an alternative to systems programming languages like C and C++ that developers use to create game engines, operating systems, file systems, browser components, and VR simulation engines.


5 IT Operations Cost Traps and How to Avoid Them

At first glance, centralization contradicts the spirit of DevOps and Agile. Agile teams want to be self-sufficient. They want to have all needed skills on their team so they don’t depend on external, centralized help to deliver their sprints. While such self-sufficiency is a guiding principle, DevOps teams always rely on some centralized teams. Hopefully, no DevOps team considers building their own data centers or trying to manage the OS level with all virus scanning and patch management by themselves. So, the real questions are: What must be sourced to a centralized team for cost, compliance, or other reasons? In which areas are project or product teams free to choose to do the work themselves, even if there is a centralized team for this topic? Figure 2 below illustrates this ecosystem of standard services. Ultimately, every company and IT organization has to ensure that teams, Agile or not, perform activities and make decisions in line with overall company goals and the CIO’s strategy for IT. They define the boundaries within which all Agile or non-Agile and DevOps or old-fashioned development and operations teams act.



Quote for the day:


"Leave every person you interact with feeling better about themselves; feeling loved & appreciated."  --Wright Thurston


Daily Tech Digest - January 19, 2020

Get Your Enterprise Ready for 5G

5G is an opportunity to re-imagine your business and to think about what you could do in your company if you weren't constrained by limited bandwidth and slow data transfer speeds. In healthcare, the elimination of communications constraints could mean a broader ability to deploy telemedicine and telesurgery to remote areas. In manufacturing, unleashing the potential of communications could bring an endless opportunity to manage all types of Internet of Things (IoT) appliances and robotics in factories around the world. In cities, unbridled communications could deliver limitless ways to manage traffic grids and fleets of autonomous vehicles. However, in other business cases, what you're already doing today with 4G, or even with 2G or 0G, might be enough. The discussion about present, short-term future and long-term business directions, and the communications that are needed to support them, should occupy the CIO, other C-level executives, corporate technology experts and boards of directors.



Cyber-Physical Systems – The new and emerging systems of intelligence


With edge devices – pieces of hardware that control data flow at the boundary between two networks – becoming more powerful, miniaturised and inexpensive, there is an opportunity to bring AI, machine learning (ML) and real-time decision making closer to where data is produced. This involves building geo-distributed models that are privacy-aware and adapting decision-making algorithms based on context. Edge computing systems will form the basis for the smooth functioning of CPS, especially in time-sensitive tasks where even milliseconds matter, such as remote robotic surgeries or self-driving cars. They provide the much-needed, real-time insights to these systems so that they can operate and adapt in real-time. The Internet of Things (IoT) and smart devices have become an inseparable part of our everyday lives and many physical devices and everyday objects are now connected. In fact, according to IHS Markit there will be more than 125 billion connected devices globally by 2030. However, as an increasing number of devices are integrated into enterprise networks, it is important to ensure that the existing systems are ready to yield the expected benefits and minimise risk.


The top 9 big data and data analytics certifications for 2020

Data and big data analytics are the lifeblood of any successful business. Getting the technology right can be challenging but building the right team with the right skills to undertake data initiatives can be even harder — a challenge reflected in the rising demand for big data and analytics skills and certifications. If you're looking to get an edge on a data analytics career, certification is a great option. ... The number of data analytics certs is expanding rapidly. ... The Certification of Professional Achievement in Data Sciences is a non-degree program intended to develop facility with foundational data science skills. The program consists of four courses: Algorithms for Data Science, Probability & Statistics, Machine Learning for Data Science, and Exploratory Data Analysis and Visualization. ... The Certified Analytics Professional (CAP) credential is a general analytics certification that certifies end-to-end understanding of the analytics process, from framing business and analytic problems to acquiring data, methodology, model building, deployment and model lifecycle management. It requires completion of the CAP exam and adherence to the CAP Code of Ethics.


Financial Advisors Hate Bitcoin. Their Reasons Will Drive You Crazy

In the U.S., all financial advisors have fiduciary duty. This means they have to manage your money in a way that benefits you. If they don’t, you can sue them. You can do what you want with your own money. Buy all the bitcoin you want. Cow pies, lawn darts, options, credit default swaps, silver dollars, hammers, whatever you want to buy, no matter how risky or useless, you go for it. When you give money to financial advisors, they have to follow certain rules. They can’t mess around with crazy stock tips or risky off-shore investment schemes. ... In fact, crime is the number one reason 75 percent of all investors say they avoid bitcoin. Most people worry about getting hacked or think somebody will use bitcoin for terrorism or illegal activities. On top of that (and maybe because of it), most advisors don’t know how bitcoin works. Cryptocurrency isn’t covered in their professional certifications. ... Bitcoin has no central issuer, no government, and no business managing its use. Bitcoin transactions are pseudonymous, peer-to-peer, and settled instantly. 


Four priorities for the evolution of IT in 2020


IT efficiency is crucial to the success of digital transformation initiatives, and there is increased pressure on IT departments to deliver more, faster. However, IT can no longer keep up with the demands of the business; little over a third (36 per cent) of IT professionals were actually able to deliver all projects asked of them last year. In order to reduce this growing IT delivery gap, we’ll see IT move away from trying to deliver all IT projects themselves in 2020. The IT team’s role will evolve towards changing, operating and securing core IT assets, along with building and managing reusable APIs that expose the functionality within those core assets for the rest of the business to consume in creating the solutions they need. Essentially, IT begins to create new building blocks (APIs) that can empower both technical users and the broader lines of business to innovate and build new technology solutions without compromising the core IT estate of the business. With API-led connectivity and organisations educating teams on the power of integration, IT will empower companies to digitally transform and innovate faster than ever before, shifting from an “all doing” to an “enabling” organisation and avoiding becoming a constraint to business expansion.


Visa's plan against Magecart attacks: Devalue and disrupt

Visa's plan to devalue payment card data involves the rollout of new technologies like the Visa Token Service and Click To Pay systems. The Visa Token Service is a new payment mechanic through which payment card numbers and details are replaced by a token. This token validates the transaction against Visa's servers, but it's useless to attackers as it doesn't contain any data cybercriminals can use to sell or clone cards. This novel tokenization system will be coupled with the new Click To Pay technology that Visa and fellow card providers have been working on for the past few years, and which they recently began rolling out across the US. With Click To Pay, multiple card providers have banded together to create a common "Click to Pay" button that vendors can add to their online stores. Users only have to enter their card details once, and then click the button to buy products across the internet, without having to re-enter card details on each store. Since users don't have to enter card details on online stores, there's nothing Magecart hackers can steal. Both technologies were created to simplify online shopping, but they both happened to come along at the right time to help fight off Magecart attacks.
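The general idea behind tokenization (not Visa's proprietary implementation, which is not public) can be sketched in a few lines of Python. The vault, token format, and card number below are purely illustrative:

```python
import secrets

# Illustrative token vault: merchants handle only tokens, never card numbers.
# A generic sketch of tokenization, not the Visa Token Service itself.
class TokenVault:
    def __init__(self):
        # token -> primary account number (PAN); lives only server-side
        self._vault = {}

    def tokenize(self, pan):
        """Issue an opaque token for a card number."""
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = pan
        return token  # contains no card data an attacker could sell or clone

    def authorize(self, token):
        """Validate a transaction by resolving the token server-side."""
        return self._vault.get(token) is not None

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # well-known test PAN
print(vault.authorize(token))        # True: the issuer resolves the token
print(vault.authorize("tok_bogus"))  # False: a fabricated token maps to nothing
```

Real schemes go further than this sketch: tokens are also bound to a specific merchant or device, so even an intercepted token cannot be replayed elsewhere.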


Microsoft: Application Inspector is now open source, so use it to test code security


The static source-code analyzer aims to help developers handle potential security issues that arise through code reuse when incorporating open-source components, such as software libraries, into a project. "Reuse has great benefits, including time to market, quality, and interoperability, but sometimes brings the cost of hidden complexity and risk," write Guy Acosta and Michael Scovetta, members of Microsoft's Customer Security and Trust team. "You trust your engineering team, but the code they write often accounts for only a tiny fraction of the entire application. How well do you understand what all those external software components actually do?" As they note, modern web applications often have hundreds of third-party components that contain tens of thousands of lines of code, which were written by thousands of contributors. And typically developers who use those components rely on the author's description, which Microsoft argues is neither reliable nor sufficient to meet its responsibility for shipping secure code, which includes external components.


Natural disasters are increasing in frequency and ferocity. Here's how AI can come to the rescue

Once an advancing cyclone or hurricane is identified, for example, geo-spatial, weather and previous disaster data could be used to predict how many people will be displaced from their homes and where they will likely move. Such insights could help emergency personnel identify how much aid (water, food, medical care) will be needed and where to send it. AI algorithms could instantaneously assess flooding, building and road damage based on satellite images and weather forecasts, allowing rescuers to distribute emergency aid more effectively and identify those still in danger and isolated from escape routes. McKinsey’s Noble Intelligence is just one example of an initiative trying to harness AI’s potential to support humanitarian causes. For instance, the team is developing an algorithm that will reduce the time it takes to assess damage to buildings such as schools from weeks to minutes, using a combination of satellite, geo-spatial, weather and other data.


Does the World Need a Cryptocurrency Robo Advisor?


Robo advisors as a service are used on a global scale, though the scene differs markedly between regions, for instance between the US market and Europe. The US retail market has shown much more interest and trust in using these computer programs to manage its money. This alone has made the US the source of innovation for robo advisors, given the competition between heavyweight financial institutions trying to take a bite of the market share, such as Vanguard and Charles Schwab, and very bright startups such as Betterment, Wealthfront and Acorns. ... One challenge that remains for the market and the ETP providers is maintaining liquidity for the indices they launch. Market liquidity is thin across cryptocurrencies, especially alternative coins (all non-bitcoin coins). Specialized parties, called market makers, use sophisticated tools to provide offers on both sides of the order book. Such a tool, also called a market making bot, makes sure these coins or indices have sufficient liquidity to attract investors or financial advisors.
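As a rough illustration of what a market making bot does, the sketch below quotes both sides of the book around a mid price. The function name, spread figure and sizes are invented for illustration; real bots also manage inventory, risk limits and exchange connectivity.

```python
def make_quotes(mid_price: float, spread_bps: float = 50, size: float = 1.0):
    """Quote a bid and an ask around a mid price (illustrative sketch).

    spread_bps is the full bid-ask spread in basis points; half goes on
    each side, so buyers and sellers always find a resting offer.
    """
    half_spread = mid_price * spread_bps / 10_000 / 2
    bid = round(mid_price - half_spread, 2)
    ask = round(mid_price + half_spread, 2)
    return {"bid": (bid, size), "ask": (ask, size)}

quotes = make_quotes(8500.0)  # e.g. a crypto index level
print(quotes)
```

Continuously re-posting quotes like these on both sides of the order book is what keeps a thinly traded coin or index liquid enough for investors to enter and exit.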


Bipartisan group of senators introduces legislation to boost state cybersecurity leadership

In introducing the legislation, Hassan highlighted the ongoing nationwide ransomware attacks on cities and government entities. These types of attacks, which recently crippled the government of New Orleans, involve an individual or group locking up a system and demanding a ransom to give the user access again. “Cyberattacks can be devastating for communities across our country, from ransomware attacks that can block access to school or medical records to cyberattacks that can shut down electrical grids or banking services,” Hassan said in a statement. “The federal government needs to do more to ensure that state and local entities have the resources and training that they need to prevent and respond to cyberattacks.” Hassan added that the new bill “would take a big step forward in improving communication between the federal government, states, and localities, as well as strengthening cybersecurity preparedness in communities across the country.”



Quote for the day:



"The led must not be compelled; they must be able to choose their own leader." -- Albert Einstein


Daily Tech Digest - January 18, 2020

EU mulls 5-year ban on facial recognition tech in public spaces

People walk past a poster simulating facial recognition software at the Security China 2018 exhibition on public safety and security in Beijing, China October 24, 2018.
The EU Commission said tough new rules may have to be introduced to bolster existing regulations protecting Europeans’ privacy and data rights. “Building on these existing provisions, the future regulatory framework could go further and include a time-limited ban on the use of facial recognition technology in public spaces,” the EU document said. During that ban of between three and five years, “a sound methodology for assessing the impacts of this technology and possible risk management measures could be identified and developed.” Exceptions to the ban could be made for security projects, as well as research and development, the paper said. The document also suggested imposing obligations on both developers and users of artificial intelligence and that EU countries should appoint authorities to monitor the new rules. The Commission will seek feedback on its white paper before making a final decision, officials said.



Huawei and 5G: Why the UK's decision is getting tougher every day


There are serious issues for the UK to consider here. These 5G networks will at some point underpin everything from smart cities to augmented-reality surgery. They have to be secure and unbreakable. An outage of a 5G network controlling an automated factory or motorway full of self-driving cars could be disastrous, especially if it could be triggered at will by a foreign state. Espionage is another, more obvious and realistic fear. No nation would want its most sensitive data to be read by another. And few would dispute that the Chinese state has regularly used cyber espionage against other governments and businesses. So, first, there is the fundamental issue: can Huawei's equipment be trusted as part of the UK's critical infrastructure? It's a question that the UK's intelligence agencies and technical experts have been pondering long and hard. Up to now their answer has been that, so long as Huawei's kit is limited to the outer reaches of these new 5G networks, the risk is manageable. Huawei's equipment has long been used in UK networks without incident, and the country of origin is not the only, and not even a primary, factor when it comes to assessing security.


Forecast: the top 6 cybersecurity trends for 2020

Application Programming Interfaces (APIs) have become a vital component in modern IT infrastructures. They allow data to be readily shared between applications as well as opening access to external parties. While they offer significant benefits, they also create vulnerabilities that can be exploited by cybercriminals and incidents are set to rise during 2020. APIs are inherently insecure and offer an enticing entry point into an organisation’s IT infrastructure. The problem is particularly relevant in supply chains where data is shared between multiple parties. When access is provided to core systems via APIs, it becomes difficult – if not impossible – to ensure all links are secure at all times. ... Operational Technology (OT) is the hardware and software that manages devices within an organisation’s infrastructure. Most OT was designed years ago and was never intended to be networked or linked to the public internet. Fast forward to 2020 and OT is increasingly being connected to IT networks to allow remote monitoring and management.


How AI Is Manipulating Economics to Create Appreciating Assets


Think about that statement for a second…you’re buying an appreciating asset, not a depreciating asset. And what is driving the appreciation of that asset? It’s likely courtesy of Tesla’s FSD (Full Self-Driving) Deep Reinforcement Learning Autopilot brain. Tesla cars become “smarter” and consequently more valuable with every mile each of the 400,000 Autopilot-equipped cars is driven. Imagine a mindset of leveraging Deep Reinforcement Learning with new operational data to create products (vehicles, trains, cranes, compressors, chillers, turbines, drills) that appreciate with usage because the products are getting more reliable, more predictive, more efficient, more effective, safer and consequently more valuable. That’s H-U-G-E! An asset that appreciates in value through usage and learning is yet another example of how a leading organization can exploit the unique characteristics of digital assets that not only never deplete or wear out but can be used across an unlimited number of use cases at a near zero marginal cost.


Keeping up with disruptors through hybrid integration


We’re living in a period where information is key, and where companies in every industry are inundated with data from all sides. And this is only set to rise, with IDC predicting that the global datasphere will grow from 33 zettabytes in 2018 to 175 zettabytes by 2025. In terms of how this is stored, many organisations have initiated cloud-first policies, meaning no new data should be stored in their data centres. The reasons for this drive to the cloud are numerous given the number of business benefits. For example, the cloud provides unlimited storage and accessibility from anywhere in the world. While some companies already do everything in the cloud, the vast quantities of data collated by heritage organisations is stored across multiple data sources. It is therefore likely that these organisations will always have some systems stacked in heritage servers as a result of the costs involved, the data’s complexity and the inability to replicate it in the cloud. This means there is a need to integrate data and applications stored on-premise, in the cloud and between the two.


UK’s phone and internet bulk data surveillance unlawful, says EU court opinion


The Advocate General opinion argues that member states cannot use national security exemptions to escape from the safeguards of European law, when they impose legal obligations on telephone and internet companies to retain their customers’ data. Access to communications data must be subject to prior review by a court or an independent administrative authority committed both to safeguarding national security and defending citizens’ fundamental rights, and requests for data must be made in specific terms, the AG wrote. Data retention by telephone companies and internet service providers should be limited to specific categories of data that are essential for the prevention and control of crime and the safeguarding of national security, and each category of data should be held for a defined time.


New phishing attack hijacks email conversations: How companies can protect employees

Although the level of conversation hijacking in domain-impersonation attacks is low compared with other types of phishing attacks, they're personalized. That makes them effective, hard to detect, and costly, according to Barracuda. After impersonating a domain, cybercriminals begin the process of conversation hijacking. By infiltrating an organization, attackers will compromise email accounts and other sources. They then spend time monitoring the compromised accounts and reading emails to understand the business and learn about any deals, payment processes, and other activities. This step is also where they can snoop on email conversations between employees, external partners, and customers. Attackers will leverage the information they've picked up from the compromised accounts to devise convincing messages sent from the impersonated domain to trick employees into wiring money or updating and sharing payment information. The entire process of impersonating a domain, monitoring compromised accounts, and hijacking conversations can be expensive and time-consuming.


Mojo Vision is putting an augmented reality screen on a contact lens

The Mojo Lens is a contact lens with an augmented reality display.
Mojo Lens promises to deliver the useful and timely information people want without forcing them to look down at a screen or lose focus on the people and world around them. In terms of mass production, Mojo’s Invisible Computing platform won’t be ready for a while, but the prototypes are coming together. ... “It’s a rigid, gas-permeable lens,” he said. “It is super comfortable because it sits on the white part of your eye.” That’s like the hard contact lenses some people wear because they find the soft ones uncomfortable. The harder lens rests on your eye, rather than on your cornea (that is, it rests on the white part of your eye, rather than the part you see with). Mojo Vision plans to tailor each contact lens to fit the wearer’s eyes. “We want it to sit perfectly like a puzzle piece, and it doesn’t rotate and it doesn’t slip,” Sinclair said. “And that’s … one of the secrets that makes this whole thing work, and why anyone who’s trying to do this … with the soft contact lens is probably going to be miserable, because normal contact lenses are always moving around and sliding around and slipping and rotating.”


It’s the end for Windows Server 2008 support

Server 2008 is based on the Windows Vista codebase, which should be reason alone to jettison it. But Windows Server 2016 and Windows Server 2019 are built on Windows 10, which means apps heavily dependent on the OS ecosystem might be hard to move since the internals are so different. “I do work with folks that are still running Windows Server 2008. They understand the ramifications of EOL for support. But most are in a predicament where they aren’t able to move the applications for a number of reasons, including application compatibility, location, etc.," Crawford says. For those apps that are challenging to move, he recommends isolating the system as much as possible to protect it, and putting in a plan to do what is needed to the applications to prepare them for movement as quickly as possible. Microsoft offers and recommends Azure migration, so Server 2008 apps can run in an Azure instance while they are modernized for Server 2019 and then deployed on premises. Migration should be the paramount effort, because if you are running Server 2008 then you're using hardware that's at least eight years old and potentially 12 years old.


What is Perfect Forward Secrecy? A Guide for 2020

In short, the PFS acronym stands for "perfect forward secrecy," which is a relatively recent security feature for websites. It aims to prevent future exploits and security breaches from compromising current or past communication, information or data by isolating each transaction's encryption. Traditionally, encrypted data would be protected by a single private encryption key held by the server, which could be used to decrypt all the historic communication with that server. This presents a potential security risk down the line, as an attacker can spend weeks, months or years listening in to encrypted traffic, storing the data and biding their time until that key is obtained. ... Perfect forward secrecy solves this problem by removing the reliance on a single server private key. Rather than using the same encryption key for every single transaction, a new, unique session key is generated every time a new data transaction occurs. In effect, this means that even if an attacker manages to get their hands on a session key, it will only be useful for decrypting the most recent transaction, rather than all the data they may have collected in the past.
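The idea can be sketched with a toy ephemeral Diffie-Hellman exchange. The parameters below are deliberately tiny and insecure, chosen only so the sketch is self-contained; real TLS uses vetted 2048-bit groups or elliptic curves such as X25519. The point is that each handshake draws fresh keys, so no single long-lived key can unlock recorded traffic.

```python
import secrets

# Toy Diffie-Hellman parameters (far too small for real use).
P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a prime; illustration only
G = 5

def ephemeral_keypair():
    """Fresh private/public pair, generated per session and then discarded."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def run_handshake():
    """Both sides derive the same session key; neither side reuses keys,
    so compromising one session never exposes another (forward secrecy)."""
    a_priv, a_pub = ephemeral_keypair()
    b_priv, b_pub = ephemeral_keypair()
    return pow(b_pub, a_priv, P), pow(a_pub, b_priv, P)

k1_client, k1_server = run_handshake()
k2_client, k2_server = run_handshake()
print(k1_client == k1_server, k1_client != k2_client)
```

The first comparison shows the two parties agree on a session key; the second shows a new handshake yields a different key, which is exactly the property that defeats the record-now-decrypt-later attacker described above.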



Quote for the day:


"The cost of leadership is self-interest." -- Simon Sinek


Daily Tech Digest - January 17, 2020

Dell Optiplex 7070 Ultra: Modularity at a price


The main trick with the Optiplex 7070 Ultra, and the reason it is designed as a thin brick, is that it fits in a specially designed monitor stand that attaches to Dell monitors. This feature is touted as being a desktop space saver, which it certainly is, but do not think that it is a cableless affair. We tested this Optiplex with a Dell UltraSharp 24 USB-C monitor -- which is a serviceable, thin-bezel 1920x1080 monitor that retails for AU$340, and if it had a higher resolution, it would be outstanding -- and found the Optiplex to be a half-way house between a regular desktop and an all-in-one. For instance, a USB-C cable was still needed to make the connection between the unit and the monitor, both devices needed their own power cables and bricks, and connecting headphones meant reaching behind the monitor to find the audio jack and hoping they have a long enough lead to allow you to relax in your seat. Consolidating things like power connections would put it much closer to the realm of an all-in-one, while probably making it increasingly complex, but simple changes like adding reachable ports and audio jacks into the stand to face the user would help with everyday usability.



Silicon’s Final Days? An Exclusive Chat With Nobel Prize Winner Sir Konstantin Novoselov

Novoselov, who grew up in a very heavy engineering environment, adds that the Nobel has opened opportunities in terms of collaboration within the industry itself and has “promoted huge interest”. “As we see now that interest paid back in terms of creation of new applications.” Today, graphene powers many disruptive technologies and holds the potential to open up many more new markets, particularly next-generation electronics: faster transistors, semiconductors, bendable phones, to name a few. But what is graphene, you ask? Graphene was originally observed in electron microscopes in 1958 and as Novoselov explains, it’s both an interesting and very simple material. “It’s only carbon atoms,” he explains. “Carbon is one of the lightest, and one of the simplest atoms you can think about.” Graphene is, to date, the strongest and thinnest material known to science. In fact, it is 100 times stronger than steel despite its almost 100% transparency and flexibility. The material has also proved to be a good thermal and electrical conductor, also known to have unique quantum properties.


Scottish police roll out controversial data extraction technology


“We’re committed to providing the best possible service to victims and witnesses of crime. This means we must keep pace with society. People of all ages now lead a significant part of their lives online and this is reflected in how we investigate crime and the evidence we present to courts,” said deputy chief constable Malcolm Graham. He added that digital devices are increasingly involved in investigations, placing ever higher demand on digital forensic examination teams. “Current limitations, however, mean the devices of victims, witnesses and suspects can be taken for months at a time, even if it later transpires that there is no worthwhile evidence on them,” said Graham. “By quickly identifying devices which do and do not contain evidence, we can minimise the intrusion on people’s lives and provide a better service to the public.”


How to protect your organization and employees from conversation hijacking

Cybercriminals use a variety of tricks to try to convince unsuspecting users to reveal sensitive and valuable information. Phishing is a well-known and general method. A more specific and direct technique gaining traction is conversation hijacking. By impersonating employees or other trusted individuals and inserting themselves in a message thread, criminals try to obtain money or financial information. But there are ways to protect your company and employees from this type of attack, according to a new report from Barracuda Networks. Here's how the process typically works, according to Barracuda. Cybercriminals start by impersonating an organization's domain. Through domain impersonation or spoofing, attackers send emails to employees with phony domain names that appear legitimate or create websites with altered names. Phony domain names can be concocted and registered by slightly adjusting certain characters in the actual name or changing the Top-Level-Domain (TLD), for example, replacing .com with .net.
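A simple, hypothetical filter for the two impersonation tricks described above, a changed TLD and one or two altered characters, might look like the sketch below. The trusted domain and thresholds are invented for illustration; production email security products use far richer signals such as registration age, homoglyph maps and sender reputation.

```python
from difflib import SequenceMatcher

LEGIT = "barracuda.com"

def looks_like_impersonation(sender_domain: str, legit: str = LEGIT,
                             threshold: float = 0.85) -> bool:
    """Flag a domain that nearly matches a trusted one: a swapped character
    (barracvda.com) or a changed TLD (barracuda.net). Illustrative only."""
    if sender_domain == legit:
        return False
    name = sender_domain.rpartition(".")[0]
    legit_name = legit.rpartition(".")[0]
    if name == legit_name:          # same name, different TLD
        return True
    similarity = SequenceMatcher(None, sender_domain, legit).ratio()
    return similarity >= threshold  # one or two altered characters

print(looks_like_impersonation("barracuda.net"),
      looks_like_impersonation("barracvda.com"),
      looks_like_impersonation("example.org"))
```

The first two domains trip the check (TLD swap and a single-character substitution), while an unrelated domain passes cleanly.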


Network automation with Python, Paramiko, Netmiko and NAPALM


Network automation with Python and automation libraries can enable simplified communication with network devices. In this article, we take a look at three network automation libraries: Paramiko, Netmiko and NAPALM, or Network Automation Programmability Abstraction Layer with Multivendor support. Each library builds on its predecessor to provide greater layers of abstraction that enable users to build more efficient automation systems. Paramiko is a low-level Secure Shell (SSH) client library. We can use it to programmatically control connecting to a network device's command-line interface (CLI) over a secure SSH connection. With the library, users send commands a person would normally type and parse the results of each command's execution, also known as screen scraping. The Python script below uses the Paramiko library to query a Cisco Catalyst 3560 switch for its Address Resolution Protocol (ARP) table. It is the first step of a script to identify the switch port where a device is connected.
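A minimal sketch along those lines is shown below. The host and credentials are placeholders, and it assumes the device accepts `exec_command` (some IOS devices require an interactive shell via `invoke_shell` instead); the sample output and parsing are what screen scraping a Catalyst's `show ip arp` typically looks like.

```python
import re

def parse_arp_table(output: str):
    """Screen-scrape Cisco 'show ip arp' output into (ip, mac, interface)."""
    entries = []
    for line in output.splitlines():
        m = re.match(r"Internet\s+(\S+)\s+\S+\s+([0-9a-f.]+)\s+ARPA\s+(\S+)",
                     line.strip())
        if m:
            entries.append(m.groups())
    return entries

def fetch_arp_table(host, username, password):
    """Connect over SSH and run the command a person would normally type.
    Host and credentials are placeholders."""
    import paramiko  # imported here so the parser works without it
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=username, password=password,
                   look_for_keys=False)
    _, stdout, _ = client.exec_command("show ip arp")
    output = stdout.read().decode()
    client.close()
    return parse_arp_table(output)

SAMPLE = """Protocol  Address     Age (min)  Hardware Addr   Type   Interface
Internet  10.0.0.1            5   001a.2b3c.4d5e  ARPA   Vlan10
Internet  10.0.0.7            -   0050.56ab.cdef  ARPA   Vlan10"""
print(parse_arp_table(SAMPLE))
```

Mapping each parsed MAC address to a port via `show mac address-table` would be the natural next step in locating a device.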


Artificial Intelligence System Learns the Fundamental Laws of Quantum Mechanics

In Chemistry, AI has become instrumental in predicting the outcomes of experiments or simulations of quantum systems. To achieve this, AI needs to be able to systematically incorporate the fundamental laws of physics. An interdisciplinary team of chemists, physicists, and computer scientists led by the University of Warwick, and including the Technical University of Berlin and the University of Luxembourg, has developed a deep machine learning algorithm that can predict the quantum states of molecules, so-called wave functions, which determine all properties of molecules. The AI achieves this by learning to solve fundamental equations of quantum mechanics as shown in their paper ‘Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions’ published in Nature Communications. Solving these equations in the conventional way requires massive high-performance computing resources (months of computing time), which is typically the bottleneck to the computational design of new purpose-built molecules for medical and industrial applications.


California’s IoT cybersecurity bill: What it gets right and wrong

The most significant issue to be addressed is the law’s ambiguity: it requires all connected devices to have “a reasonable security feature” (appropriate to the nature of the device and the information it collects) that is designed to protect the user’s data from unauthorized access, modification, or disclosure. Beyond that vague prescription, the law only specifically states that each connected device must also come with a unique hard-wired password, or it must otherwise require a user to set their own unique password before using the device. Some experts maintain that meeting the password requirements is all that’s needed to satisfy the regulation; in effect, the password is the “reasonable security feature.” If this interpretation is validated, it’s wholly insufficient for securing the IoT – especially for those connected systems that reside in our appliances, vehicles, and municipal infrastructures.
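One plausible reading of the password clause can be sketched as a check like the following. This is an illustration of the rule's logic, not legal guidance; the function name, default-password list and fleet data are all invented.

```python
COMMON_DEFAULTS = {"admin", "password", "12345", "root", "default"}

def satisfies_password_rule(device_password: str, fleet_passwords: set) -> bool:
    """Rough reading of the law's password clause (hypothetical sketch):
    a device passes only if its preprogrammed password is not a shared
    factory default and is unique to that device within the fleet."""
    if device_password.lower() in COMMON_DEFAULTS:
        return False            # shared factory default
    if device_password in fleet_passwords:
        return False            # not unique to this device
    return True

fleet = {"x7Kq-9mPe", "Tr4c-88Lo"}  # passwords already issued to other units
print(satisfies_password_rule("admin", fleet),
      satisfies_password_rule("x7Kq-9mPe", fleet),
      satisfies_password_rule("zV2n-41Qd", fleet))
```

Note how little this check actually covers: a device can pass it while still shipping with unpatched firmware or an exposed debug port, which is the critics' point about treating the password alone as the "reasonable security feature."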


Facial recognition is real-life ‘Black Mirror’ stuff, Ocasio-Cortez says

Because facial recognition is being used without our consent or knowledge, she suggested, we may be mistakenly accused of a crime and have no idea that the technology has been used as the basis for the accusation. That’s right, the AI Now Institute’s Whittaker said, and there’s evidence that the use of facial recognition is often not disclosed. That lack of disclosure is compounded by our “broken criminal justice system,” Ocasio-Cortez said, where people often aren’t allowed to access the evidence used against them. Case in point: the Willie Lynch case in Florida. A year ago, Lynch, from Jacksonville, Florida, asked to see photos of other potential suspects after being arrested for allegedly selling $50 worth of crack to undercover cops. The police search had relied on facial recognition: the cops had taken poor-quality photos of the drug dealer with a smartphone camera and then sent them to a facial recognition technology expert who matched them to Lynch.


Enterprises spend more on cloud IaaS than on-premises data-center gear

The major segments with the highest growth rates over the decade were virtualization software, Ethernet switches and network security. Server share of the total data center market remained steady, while storage share declined. "The decade has seen a dramatic increase in computer capabilities, increasingly sophisticated enterprise applications and an explosion in the amount of data being generated and processed, pointing to an ever-growing need for data center capacity," said John Dinsdale, chief analyst at Synergy Research Group, in a statement. However, more than half of the servers now being sold are going into cloud providers’ data centers and not those of enterprises, Dinsdale added. "Over the last ten years we have seen a remarkable transformation in the IT market. Enterprises are now spending almost $200 billion per year on buying or accessing data center facilities, but cloud providers have become the main beneficiaries of that spending."


Microsoft opens up Rust-inspired Project Verona programming language on GitHub


As Parkinson explained, Project Verona aims to help secure code in unsafe languages like C and C++ that still exists in a lot of Microsoft's legacy code, which Microsoft can't afford to waste but would like to protect better. "We're going to run some C and C++, stuff we don't trust," Parkinson said at the talk. "We're going to put it in a box and we know there is this region of objects, we have to be very careful with it, but there's a group of things going on there and we can build some pervasive sandboxing there. So there can be sandboxed libraries that we can embed in our sandboxed Verona program." The GitHub page for Project Verona outlines some of the high-level questions the group is working on that will be fleshed out in forthcoming peer-reviewed articles. ... "Project Verona is a research project that is not affecting engineering choices in the company," it states. "The Project Verona team is connected to the people using all the major languages at the company, and want to learn from their experience, so we can research the problems that matter."



Quote for the day:


"Real leadership is being the person others will gladly and confidently follow." -- John C. Maxwell


Daily Tech Digest - January 16, 2020

How to get started with CI/CD

Continuous integration and continuous delivery require continuous testing, because the goal is to deliver high quality and secure applications and code to end users. Continuous testing is often deployed as a set of automated regression, performance, and other tests that are executed within the pipeline. CI and CD together (CI/CD) encompass a culture, a set of operating principles, and a collection of practices that accelerate the software development process. The implementation is also known as the CI/CD pipeline and is considered one of the best practices for devops teams. Industry experts say more organizations are implementing CI/CD as they look to enhance the design, development, and delivery of software applications to be used internally or by customers. “We’re definitely seeing a rise in the use of CI/CD,” says Sean Kenefick, vice president and analyst at research firm Gartner. “I personally get questions about continuous development, testing, and release all of the time.”
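Stripped to its essentials, a pipeline is an ordered list of stages (build, automated tests, security checks, release) that halts on the first failure. A toy sketch of that control flow, with invented stage names and trivially passing steps:

```python
def run_pipeline(stages):
    """Toy CI/CD pipeline: run each stage in order, stop on first failure."""
    results = []
    for name, step in stages:
        ok = step()
        results.append((name, ok))
        if not ok:
            break  # never ship past a failed stage
    return results

stages = [
    ("build", lambda: True),
    ("unit tests", lambda: all(x + x == 2 * x for x in range(5))),
    ("security scan", lambda: True),
]
print(run_pipeline(stages))
```

Real pipeline runners (Jenkins, GitLab CI, GitHub Actions and the like) add triggers, artifacts, parallelism and environments on top of this same stop-on-failure skeleton.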



Beware of this sneaky phishing technique now being used in more attacks


Cyber criminals are leaning hard on this attack technique as a means of compromising businesses, according to new research from Barracuda Networks. Analysis of 500,000 emails showed that conversation hijacking rose by over 400% between July and November last year. While conversation-hijacking attacks are still relatively rare, the personal nature means they're difficult to detect, are effective and potentially very costly to organisations that fall victim to campaigns. For cyber criminals conducting conversation-hijacking attacks, the effort involved is much greater than simply spamming out phishing emails in the hope that a target clicks, but a successful attack can potentially be highly rewarding. In most cases, the attackers won't directly use the compromised account to send the malicious phishing message – because the user could notice that their outbox contains an email that they didn't send. However, what conversation hijackers do instead is attempt to impersonate domains, using techniques like typo-squatting – when a URL is the same as the target company's, save for one or two slightly altered characters.


11 Golden Rules For Android App Development


One of the golden rules of Android application development is a responsive user interface. It engages users with highly intuitive apps that enhance their experience and cater to their requirements. A responsive UI is built by setting the viewport correctly and fixing the width so that everything on the screen adjusts to the screen size. Moreover, additional elements such as images, videos or frames should be organized so that they best fit all screen sizes. ... Prototypes can be the right choice for showcasing the power of different technologies. In the world of digitalization, few people want to read a long write-up, but most will respond to a digital presentation. After you identify the approach, you should build a prototype with the basic functionality and present it to potential buyers so that they can understand its benefits. The prototype helps attract potential customers, as they can use the live project and better understand the scope of it.


Introduction to Gaps and Islands Analysis

One of the most significant challenges we face when analyzing data is pattern recognition. We seek to find ways in which our data deviates from the norm or conforms to a given norm. The goal is to identify tools that can be used to predict future behavior and make sense out of large volumes of data. Understanding boundaries and where a pattern begins or ends allows us to draw meaningful conclusions regarding our data. In terms of data, boundaries are more often seen as gaps or islands within any data set. Being able to efficiently locate gaps and islands enables us to use this data to gain meaningful insight into a system. We can identify winning and losing streaks, measure the strength of a system over time, find missing or duplicate data, and a variety of other interesting metrics. Within a data set, an island of data is any ordered sequence where each row is in close proximity to the rows around it. For some data types and analysis, “close proximity” will mean consecutive.
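In Python, consecutive islands can be found with the classic value-minus-index trick (SQL implementations commonly use `ROW_NUMBER()` the same way): every member of a consecutive run shares the same offset between its value and its position, so grouping by that offset yields the islands, and the spaces between islands are the gaps.

```python
from itertools import groupby

def find_islands(values):
    """Group an ordered sequence of integers into (start, end) islands of
    consecutive values; the spaces between islands are the gaps."""
    values = sorted(values)
    islands = []
    # Consecutive values share the same (value - index) offset.
    for _, group in groupby(enumerate(values), key=lambda p: p[1] - p[0]):
        members = [v for _, v in group]
        islands.append((members[0], members[-1]))
    return islands

# Order IDs with two gaps (4 is missing, and so are 8-9):
print(find_islands([1, 2, 3, 5, 6, 7, 10, 11]))
```

The same grouping immediately supports the metrics mentioned above: each island is a streak, its width is the streak length, and missing IDs fall in the gaps between island boundaries.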


The Flutter Architecture


The Flutter SDK allows you to build Android, iOS, web, and desktop apps from a single codebase. This is done using platform-specific features as well as media queries, and it enables developers to ship applications faster. Flutter also offers close-to-instant feedback with the hot reload feature, enabling you to iterate quickly on your application. In this piece, we’ll cover the fundamental concepts you need in order to start working with Flutter. Flutter’s core technologies are Dart, a programming language developed by Google, and Skia, a 2D graphics rendering library. The language has been optimized for building user interfaces, which makes it a good fit for the Flutter framework. It is fairly easy to pick up, especially if you have a background in JavaScript and object-oriented programming generally. In Flutter, you define your user interface using widgets. In fact, everything in Flutter is a widget. Your application itself is a widget made up of several sub-widgets. All the widgets form what is known as a widget tree.


Diligent Engine: A Modern Cross-Platform Low-Level Graphics Library

Graphics APIs have come a long way from a small set of basic commands allowing limited control of the configurable stages of early 3D accelerators to very low-level programming interfaces exposing almost every aspect of the underlying graphics hardware. The next-generation APIs, Direct3D12 by Microsoft and Vulkan by Khronos, are relatively new and have only started getting widespread adoption and support from hardware vendors, while Direct3D11 and OpenGL are still considered the industry standard. New APIs can provide substantial performance and functional improvements, but may not be supported by older platforms. An application targeting a wide range of platforms therefore has to support Direct3D11 and OpenGL as well. New APIs will not give any advantage when used with old paradigms. It is entirely possible to add Direct3D12 support to an existing renderer by implementing the Direct3D11 interface through Direct3D12, but this will give zero benefit.
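The cross-platform approach the article describes can be pictured as application code written against an abstract rendering interface, with one backend per graphics API. The sketch below is in Python purely for illustration (the real library is C++, and every name here is hypothetical, not Diligent Engine's actual API).

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a backend-agnostic rendering interface.
# Each graphics API gets its own implementation of the same contract.

class RenderDevice(ABC):
    """Common interface implemented by each graphics-API backend."""

    @abstractmethod
    def create_buffer(self, size: int) -> str: ...

    @abstractmethod
    def draw(self, vertex_count: int) -> str: ...

class D3D11Device(RenderDevice):
    def create_buffer(self, size):
        return f"D3D11 buffer ({size} bytes)"
    def draw(self, vertex_count):
        return f"D3D11 draw of {vertex_count} vertices"

class VulkanDevice(RenderDevice):
    def create_buffer(self, size):
        return f"Vulkan buffer ({size} bytes)"
    def draw(self, vertex_count):
        return f"Vulkan draw of {vertex_count} vertices"

def render_frame(device: RenderDevice) -> str:
    # Application code targets only the abstract interface, so the same
    # renderer runs on whichever backend the platform supports.
    device.create_buffer(1024)
    return device.draw(3)

print(render_frame(D3D11Device()))   # D3D11 draw of 3 vertices
print(render_frame(VulkanDevice()))  # Vulkan draw of 3 vertices
```

The point of the abstraction is that the application, not the backend, owns the rendering logic; if the interface were merely Direct3D11 re-expressed through Direct3D12, the old paradigm would carry over and, as the article notes, yield zero benefit.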


Tolerable security risk is a spectrum

All enterprises are different. Each company stores and manages different types of data sets. They have different applications and processes in place. Those in specific industries, such as healthcare and finance, face compliance restrictions that can be a nightmare. The notion is simple: everyone has different security needs and different data to protect, so they should sit at different points on the security spectrum. For instance, in my earlier example, if the breached company were a tire manufacturer, spending four times the previous year’s security budget may be overspending, not aligning with where it sits on the spectrum, and just being reactionary. Yes, I’m making sweeping generalizations. Most tire manufacturers don’t deal with personally identifiable information the way that healthcare organizations do. Nor do they have to keep up with stringent auditable logging, as is required by most banks. Moreover, the data is probably fairly innocuous, considering that the database information is about customers that are just a bunch of tire retailers, data that could easily be found on the website. Also, they don’t pay with credit cards, so none of that information is stored.


Web developers: Microsoft Blazor lets you build native iOS, Android apps in C#, .NET

Microsoft announced Blazor in early 2018 but still considers it an experimental web UI framework from ASP.NET that aims to bring .NET applications to all browsers via WebAssembly.  "It allows you to build true full-stack .NET applications, sharing code across server and client, with no need for transpilation or plugins," Microsoft explains. Microsoft is experimenting with Blazor and Mobile Blazor Bindings to cater to developers who are familiar with web programming and "web-specific patterns" and want to create native mobile apps. The idea behind releasing the mobile bindings now is to see whether these developers would like to use the "Blazor-style programming model with Razor syntax and features" as opposed to using XAML and Xamarin.Forms. However, the underlying UI components of Mobile Blazor Bindings are based on Xamarin.Forms. If the feedback is positive, Microsoft may end up including it in a future version of Visual Studio, according to Lipton.


'Cable Haunt' Modem Flaw Leaves 200 Million Devices at Risk  

The research team has dubbed such attacks Cable Haunt and says "an estimated 200 million cable modems in Europe alone" are at risk. They say every cable modem they have tested has been at risk, although some internet service providers have now developed and deployed firmware that mitigates the problem. Broadcom says it issued updated firmware code to fix the flaw eight months ago. "We have made the relevant fix to the reference code and this fix was made available to customers in May 2019," a spokeswoman tells Information Security Media Group. Service providers who have issued a patch will have based it on Broadcom's code updates. The vulnerability, originally codenamed "Graffiti," was discovered and has been disclosed by Alexander Dalsgaard Krog, Jens Hegner Stærmose and Kasper Kohsel Terndrup of Danish cybersecurity consultancy Lyrebirds, together with independent security researcher Simon Vandel Sillesen. Has the flaw been abused by attackers in the wild? "Maybe," the researchers write on the Cable Haunt site.


DRaaS decisions: Key choices in disaster recovery as a service


Self-service DRaaS involves the customer planning, buying, configuring, maintaining and testing disaster recovery services. And, although options for automation are improving, the IT team will typically need to be available to invoke the DR plan and run the recovery process. The benefits are flexibility and, often, lower cost. The business can choose exactly which mix of recovery services, backup and recovery software, and even raw storage it needs. A self-service model can lend itself to mixed environments, with multiple cloud data stores and application-based availability and DR tools. ... Managed DRaaS is the most comprehensive, but also the most expensive, option. The main benefit is that in-house IT teams can hand off DR operations entirely to the third party. This reduces the burden on skilled staff. And, although a managed service is typically more expensive than other DR options, it can be money well spent for a comprehensive service and peace of mind.



Quote for the day:


"The speed of the leader is the speed of the gang." -- Mary Kay Ash