
Daily Tech Digest - September 19, 2020

Why we need XAI, not just responsible AI

There are many techniques organisations can use to develop XAI. As well as continually teaching their system new things, they need to ensure that it is learning correct information and does not use one mistake or piece of biased information as the basis for all future analysis. Multilingual semantic searches are vital, particularly for unstructured information. They can filter out the white noise and minimise the risk of seeing the same risk or opportunity multiple times. Organisations should also add a human element to their AI, particularly if building a watch list. If a system automatically red flags criminal convictions without scoring them for severity, a person with a speeding fine could be treated in the same way as one serving a long prison sentence. For XAI, systems should always err on the side of the positive. If a red flag is raised, the AI system should not give a flat ‘no’ but should raise an alert for checking by a human. Finally, even the best AI system should generate a few mistakes. Performance should be an eight out of ten, never a ten, or it becomes impossible to trust that the system is working properly. Mistakes can be addressed, and performance continually tweaked, but there will never be perfection.
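The watch-list recommendation above can be sketched in a few lines. This is an illustrative example, not code from the article: the severity scores, offence names and threshold are invented, but it shows the key design point that a red flag is escalated to a human reviewer rather than producing a flat 'no'.

```python
# Hypothetical severity scores: higher means more serious.
SEVERITY = {
    "speeding_fine": 1,
    "fraud": 7,
    "violent_crime": 10,
}

def screen(convictions, threshold=5):
    """Return ('clear', []) or ('human_review', flagged_items).

    The system never auto-rejects: anything at or above the
    threshold is routed to a person for checking.
    """
    flagged = [c for c in convictions if SEVERITY.get(c, 0) >= threshold]
    if not flagged:
        return ("clear", [])
    return ("human_review", flagged)
```

With this scoring, a person with only a speeding fine is cleared automatically, while a fraud conviction triggers human review instead of an automatic rejection.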


What classic software developers need to know about quantum computing

There are many different parts of quantum that are exciting to study. One is quantum computing, using quantum mechanics to do any sort of information processing; another is communication itself. And maybe the third part that doesn't get as much media attention but should is sensing: using quantum computers to sense things much more sensitively than you could classically. So think about sensing very small magnetic fields, for example. The communication aspect is just as important, because at the end of the day it's important to have secure communication between your quantum computers as well. So this is something exciting to look forward to. ... So the first tool that you need, and one of the most important tools, is the one that gives you access to the quantum computers. If you go to quantum-computing.ibm.com and create an account there, we give you immediate access to several quantum computers, which, every time I say it, just blows my mind, because four years ago this wasn't a thing. You couldn't go online and access a quantum computer. I was in grad school because I wanted to do quantum research and needed access to a lab to do this work.


Why Darknet Markets Persist

"There are two main reasons here: the lack of alternatives and the ease of use of marketplaces," researchers at the Photon Research Team at digital risk protection firm Digital Shadows tell Information Security Media Group. At least for English-speaking users, such considerations often appear to trump other options, which include encrypted messaging apps as well as forums devoted to cybercrime or hacking. And many users continue to rely on markets despite the threat of exit scams, getting scammed by sellers or getting identified and arrested by police. Another option is Russian-language cybercrime forums, which continue to thrive, with many hosting high-value items. But researchers say that, even when armed with translation software, English speakers often have difficulty coping with Russian cybercrime argot. Many Russian speakers also refuse to do business with anyone from the West. ... Demand for new English-language cybercrime markets continues to be high because so many existing markets get disrupted by law enforcement agencies or have administrators who run an exit scam. Before Empire, other markets that closed after their admins "exit scammed" have included BitBazaar in August, Apollon in March and Nightmare in August 2019.


Open Data Institute explores diverse range of data governance structures

The involvement of different kinds of stakeholders in any particular institution also has an effect on what kinds of governance structures would be appropriate, as different incentives are needed to motivate different actors to behave as responsible and ethical stewards of the data. In the context of the private sector, for example, enterprises that would normally adopt a cut-throat, competitive mindset need to be incentivised for collaboration. Meanwhile, cash-strapped third-sector organisations, such as charities and non-governmental organisations (NGOs), need more financial backing to realise the potential benefits of data institutions. “Many [private sector] organisations are well-versed in stewarding data for their own benefit, so part of the challenge here is for existing data institutions in the private sector to steward it in ways that unlock value for other actors, whether that’s economic value from say a competition point of view, but then also from a societal point of view,” said Hardinges. “Getting organisations to consider themselves data institutions, and in ways that unlock public value from private data, is a really important part of it.”


5 supply chain cybersecurity risks and best practices

Falling prey to the "it couldn't happen to us" mentality is a big mistake. But despite clear evidence that supply chain cyber attacks are on the rise, some leaders aren't facing that reality, even if they do understand techniques to build supply chain resilience more broadly. One of the biggest supply chain challenges is leaders thinking they're not going to be hacked, said Jorge Rey, the principal in charge of information security and compliance for services at Kaufman Rossin, a CPA and advisory firm in Miami. To fully address supply chain cybersecurity, supply chain leaders must realize they need to face the risk reality. The supply chain is a veritable smorgasbord of exploit opportunities -- there are so many information and product handoffs in even a simple one -- and each handoff represents risk, especially where digital technology is involved but easily overlooked. ... Supply chain cyber attacks are carried out with different goals in mind -- from ransom to sabotage to theft of intellectual property, Atwood said. These cyberattacks can also take many forms, such as hijacking software updates and injecting malicious code into legitimate software, as well as targeting IT and operational technology and hitting every domain and any node, Atwood said.


Moving Toward Smarter Data: Graph Databases and Machine Learning

Data plays a significant role in machine learning, and formatting it in ways that a machine learning algorithm can train on is imperative. Data pipelines were created to address this. A data pipeline is a process through which raw data is extracted from the database (or other data sources), is transformed, and is then loaded into a form that a machine learning algorithm can train and test on. Connected features are those features that are inherent in the topology of the graph. For example, how many edges (i.e., relationships) to other nodes does a specific node have? If many nodes are close together in the graph, a community of nodes may exist there. Some nodes will be part of that community while others may not. If a specific node has many outgoing relationships, that node’s influence on other nodes could be higher, given the right domain and context. Like other features being extracted from the data and used for training and testing, connected features can be extracted by doing a custom query based on the understanding of the problem space. However, given that these patterns can be generalized for all graphs, unsupervised algorithms have been created that extract key information about the topology of your graph data, which can then be used as features for training your model.
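The simplest connected features mentioned above, a node's degree and its number of outgoing relationships, can be sketched without a graph database. This is a minimal illustration with an invented edge list; real pipelines would typically use a graph database or a library such as networkx.

```python
from collections import defaultdict

def connected_features(edges):
    """Map each node in a directed edge list to simple topology-derived
    features: total degree and out-degree (outgoing relationships)."""
    degree = defaultdict(int)
    out_degree = defaultdict(int)
    for src, dst in edges:
        degree[src] += 1
        degree[dst] += 1
        out_degree[src] += 1
    return {n: {"degree": degree[n], "out_degree": out_degree[n]}
            for n in degree}

# Tiny example graph: a -> b, a -> c, b -> c
feats = connected_features([("a", "b"), ("a", "c"), ("b", "c")])
```

Here node "a" has two outgoing relationships while "c" has none, the kind of signal that, fed into a model alongside conventional features, lets training exploit graph topology.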


Dark Side of AI: How to Make Artificial Intelligence Trustworthy

Malicious inputs to AI models can come in the form of adversarial AI, manipulated digital inputs or malicious physical inputs. Adversarial AI may come in the form of socially engineering humans using an AI-generated voice, which can be used for any type of crime and considered a “new” form of phishing. For example, in March of last year, criminals used AI synthetic voice to impersonate a CEO’s voice and demand a fraudulent transfer of $243,000 to their own accounts. Query attacks involve criminals sending queries to organizations’ AI models to figure out how they work, and may take the form of a black-box or white-box attack. Specifically, a black-box query attack determines the uncommon, perturbed inputs to use for a desired output, such as financial gain or avoiding detection. Some academics have been able to fool leading translation models by manipulating the output, resulting in an incorrect translation. A white-box query attack regenerates a training dataset to reproduce a similar model, which might result in valuable data being stolen. One such example: a voice recognition vendor fell victim to a new, foreign vendor counterfeiting its technology and then selling it, which enabled the foreign vendor to capture market share based on stolen IP.
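A black-box query attack can be illustrated with a toy model. Everything here is invented for illustration: the "model" is a trivial threshold rule, and the attacker can only call it and observe outputs, probing perturbed inputs until one evades detection.

```python
def model(x):
    # Hypothetical fraud detector: flags transactions of 1000 or more.
    return "flagged" if x >= 1000 else "clear"

def black_box_attack(start, step=1, max_queries=100):
    """Query the model with slightly perturbed inputs until one evades it.

    The attacker never sees the model's internals, only its answers,
    which is what makes this a black-box attack.
    """
    x = start
    for _ in range(max_queries):
        if model(x) == "clear":
            return x          # found an input that avoids detection
        x -= step             # perturb and query again
    return None

evading_input = black_box_attack(1005)   # walks down to 999
```

Real attacks perturb far richer inputs (images, text, audio) against far more complex models, but the loop of query, observe, perturb is the same.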


DDoS attacks rise in intensity, sophistication and volume

The total number of attacks increased by over two and a half times during January through June of 2020 compared to the same period in 2019. The increase was felt across all size categories, with the biggest growth happening at opposite ends of the scale – the number of attacks sized 100 Gbps and above grew a whopping 275% and the number of very small attacks, sized 5 Gbps and below, increased by more than 200%. Overall, small attacks sized 5 Gbps and below represented 70% of all attacks mitigated between January and June of 2020. “While large volumetric attacks capture attention and headlines, bad actors increasingly recognise the value of striking at low enough volume to bypass the traffic thresholds that would trigger mitigation to degrade performance or precision target vulnerable infrastructure like a VPN,” said Michael Kaczmarek, Neustar VP of Security Products. “These shifts put every organization with an internet presence at risk of a DDoS attack – a threat that is particularly critical with global workforces reliant on VPNs for remote login. VPN servers are often left vulnerable, making it simple for cybercriminals to take an entire workforce offline with a targeted DDoS attack.”


Group Privacy and Data Trusts: A New Frontier for Data Governance?

The concept of collective privacy shifts the focus from an individual controlling their privacy rights, to a group or a community having data rights as a whole. In the age of Big Data analytics, the NPD Report does well to discuss the risks of collective privacy harms to groups of people or communities. It is essential to look beyond traditional notions of privacy centered around an individual, as Big Data analytical tools rarely focus on individuals, but on drawing insights at the group level, or on “the crowd” of technology users. In a revealing example from 2013, data processors who accessed New York City’s taxi trip data (including trip dates and times) were able to infer with a degree of accuracy whether a taxi driver was a devout Muslim or not, even though data on the taxi licenses and medallion numbers had been anonymised. Data processors linked pauses in taxi trips to regularly scheduled prayer times to arrive at their conclusion. Such findings and classifications may result in heightened surveillance or discrimination for such groups or communities as a whole. ... It might be in the interest of such a community to keep details about their ailment and residence private, as even anonymised data pointing to their general whereabouts could lead to harassment and the violation of their privacy.


Analysis: Online Attacks Hit Education Sector Worldwide

The U.S. faces a rise in distributed denial-of-service attacks, while Europe is seeing an increase in information disclosure attempts - many of them resulting from ransomware incidents, the researchers say. Meanwhile, in Asia, cybercriminals are taking advantage of vulnerabilities in the IT systems that support schools and universities to wage a variety of attacks. DDoS and other attacks are surging because threat actors see an opportunity to disrupt schools resuming online education and potentially earn a ransom for ending an attack, according to Check Point and other security researchers. "Distributed denial-of-service attacks are on the rise and a major cause of network downtime," the new Check Point report notes. "Whether executed by hacktivists to draw attention to a cause, fraudsters trying to illegally obtain data or funds or a result of geopolitical events, DDoS attacks are a destructive cyber weapon. Beyond education and research, organizations from across all sectors face such attacks daily." In the U.S., the Cybersecurity and Infrastructure Security Agency has warned of an increase in targeted DDoS attacks against financial organizations and government agencies.



Quote for the day:

"One of the most sincere forms of respect is actually listening to what another has to say." -- Bryant H. McGill

Daily Tech Digest - July 16, 2020

The Advancements In Real World Artificial Intelligence

Ongoing advances in artificial intelligence have come primarily in areas where data scientists can mimic human perception, for example recognising objects in images or words in acoustic signals. Learning to recognise patterns in complex signals such as audio streams or images is extremely powerful; powerful enough that many people wonder why we aren't using deep learning techniques everywhere. Moving forward, as teams align on their goals and on methods for using AI to achieve them, deep learning will become part of every data scientist's toolbox. Consider this: we will be able to incorporate object recognition into a system using a pre-trained AI model, but in the end we will understand that deep learning is simply another tool to use when it makes sense. Now let's explore how AI is benefitting mankind and serving fields such as marketing, finance and banking in the real world. Marketing is a way to promote your products to attract more customers. In the early 2000s, if we searched an online store for an item without knowing its exact name, finding it could become a nightmare.


Using the new normal to break from the past and innovate

Arguably, the big reason for the failure of online sales efforts by traditional automakers was the standard way of selling vehicles as good enough, and the effort and investment required to create an online channel wasn't perceived as worthwhile when no one (aside from Tesla) offered a similar capability. People had been buying cars through dealers for a century, and designing and implementing the technology, relationships, marketing, and execution required to create an effective online sales channel was perceived as throwing money at fixing a process that wasn't broken. A time-tested business model centered around driving customers into an enclosed space full of strangers and ideally getting them to sit in an even smaller space with a stranger, with no idea when that space was last cleaned, suddenly doesn't look that great during a pandemic brought about by a virus that spreads primarily through human proximity. Suddenly, dealer networks that saw vehicle delivery as "too expensive" and online or phone purchasing as distractions were able to implement these practices in a matter of weeks.


Businesses express concerns around ethical risks for their AI initiatives

“As organizations become more invested in AI, it is imperative that they have a common framework, principles and practices for the board, C-suite, enterprise and third-party ecosystem to proactively manage AI risks and build trust with both their business and customers,” said Irfan Saif, principal and AI co-leader, Deloitte & Touche. “Our study results show that while early adopters of AI are still bullish, their competitive advantage may be waning as barriers to adoption continue to fall and more creative use of the technology grows.” “In the era of pervasive AI, where capabilities are readily available, organizations should go beyond efficiency and push boundaries to create new AI-powered products and services to be successful.” — Nitin Mittal, principal and AI co-leader, Deloitte Consulting. As purchasing barriers have dropped and AI is more available, choosing the right technology is more important than ever. Those AI adopters surveyed tend to “buy” their capabilities rather than “build” them. To become smarter consumers, companies should evaluate the landscape, find the most advanced AI and integrate those technologies into their infrastructure.


Tech Sector Job Interviews Assess Anxiety, Not Software Skills

“Technical interviews are feared and hated in the industry, and it turns out that these interview techniques may also be hurting the industry’s ability to find and hire skilled software engineers,” says Chris Parnin, an assistant professor of computer science at NC State and co-author of a paper on the work. “Our study suggests that a lot of well-qualified job candidates are being eliminated because they’re not used to working on a whiteboard in front of an audience.” Technical interviews in the software engineering sector generally take the form of giving a job candidate a problem to solve, then requiring the candidate to write out a solution in code on a whiteboard – explaining each step of the process to an interviewer. Previous research found that many developers in the software engineering community felt the technical interview process was deeply flawed. So the researchers decided to run a study aimed at assessing the effect of the interview process on aspiring software engineers. For this study, researchers conducted technical interviews of 48 computer science undergraduates and graduate students. Half of the study participants were given a conventional technical interview, with an interviewer looking on.


Taking the Pain Out of Buying and Selling Data

Narrative’s SaaS-based application provides a platform to connect buyers and sellers. On the buy side, it helps companies acquire and integrate second- and third-party data, typically for the purpose of AI or analytics. On the sell side, companies that license Narrative’s software have a mechanism for reaching multiple buyers in an orderly and streamlined fashion. There’s a lot of work that goes into buying and using data, on both sides of the equation, according to Jordan. There are all the usual questions about the format the data takes (CSV, Parquet, JSON, etc.) and the units of measurement. Once data scientists or analysts have studied a sample of the outside data and decided that it will work for their particular activity, then data engineers are called in to build the ETL pipelines to move the data, which can often take months. On top of the logistical questions, there are legalities that must be taken into account. Buyers and sellers both must take measures to assure that they’re not violating regulations for their particular geography. Finance teams typically get involved to obtain usage data and make the payments. And if anything changes to the data or the contract, all the engineers, analysts, data scientists, lawyers, and finance folks get to drop whatever they’re doing and revisit the matter.


The Twitter mega-hack. What you need to know

There are a number of ways in which online accounts can get hijacked. These include, for instance: You might have made the mistake of reusing your Twitter password elsewhere on the net. If the other place suffers a data breach, a hacker might try to use that same password against your Twitter account. Two-factor authentication can help protect against this, but the best advice of all is to never reuse passwords; You might have had your password stolen from you via a phishing attack or keylogging malware. Two-factor authentication can also help protect against this. In addition, password managers and security software can also provide a layer of defence; You might have mistakenly told someone your password. Passwords should be secret. It’s hard to believe, however, that someone is big enough buddies with Bill Gates, Kanye West, Uber, and the rest to have had their passwords discussed over a candlelit dinner; Your account could be hijacked by a third-party app that is compromised. If the app had access to your Twitter account it could post tweets without your permission. An attack just like that happened to my Twitter account a few years ago.


How Chase is using AI to update banking

In response to the COVID-19 crisis, the U.S. government launched the Paycheck Protection Program (PPP) a couple of months back to ensure money continues to roll into the workforce — this, in turn, led to significant paperwork for banks, which have had to deal with a mountain of applications. The Small Business Administration (SBA) reportedly had to process 75 years’ worth of loan applications in just two months, which gives some idea as to the scale of this undertaking. Faced with such an unprecedented challenge, one that affected the lives and livelihoods of literally millions of Americans, Chase had to come up with a way of classifying documents that its customers were uploading as part of the PPP application process. It did so with a view toward helping its business banking division and underwriters wade through as many applications as possible. “They needed a way to understand what documents our customers were uploading, which we hadn’t yet tagged every single document as part of our workflow,” Nudelman explained. “So instead, after the fact, we worked with the people building the process and technology to use natural language processing (NLP) to ensure the documents that have been uploaded were tagged appropriately, which helps the underwriters’ ability to process those applications, getting customers their loans faster.”


Are we at the tipping point for global biometric payment card adoption?

Well, according to analysis from Goode Intelligence, there are several hurdles to overcome before biometric payment cards can be shipped to users in their millions – including cost and scheme certification. Despite being hailed as the future-tech solution to end our use of cash and cards, mobile payments haven’t reached anywhere near the expected level of public adoption in the UK. As of 2019, only around 19% of the UK population used mobile payments. Of course, the fact that Apple, giants in the payment app space, launched a physical credit card last year, and that Google is set to follow suit is further proof of the customer demand for bank cards over mobile payments. Therefore, it’s clear that the majority of the population still prefer the ease and familiarity of contactless cards. In fact, IDEX research found that six-in-ten (60%) UK consumers would not give up their debit card in favour of mobile payments, so it’s crucial that banks continue to evolve smart bank cards for the next generation of payments. Of course, cost caused by the manufacturing complexity of biometric payment cards has long been seen as the main barrier to mass adoption. 


Open Data Institute releases funding for ethical data sharing projects

Open Data Manchester (ODM) is also set to receive funding, but differs in its focus on helping hundreds of small-scale energy and eco-efficiency cooperatives share data among their members. “With regards to data, cooperatives are in quite a unique space because they’re intrinsically democratic organisations, so there will be some kind of representation or governance process where every member’s view should be represented at a board level, which means that you’ve already got an environment of enhanced trust,” said Julian Tait, chief executive at ODM, adding that the relationship most people have with their current energy providers is “slightly begrudging” and one of “general dislike”. “If you’ve got an environment where you’re sharing data within the cooperative, they can understand my energy requirements [and]… you can start to design more responsive energy systems – that’s a bit harder to do, or it’s done very opaquely, in regards to the large energy providers.” He added that the funding will help ODM work with Carbon Cooperative to design how a data cooperative could look.


Establishing Change Agents within Organisations Using Shu-Ha-Ri

How this can help us to achieve mastery of agile or business agility can be explained with a simple example. Let us take stand-ups, for instance. Shu: We need to make sure that teams start doing stand-ups and communicate the three basics of the stand up: what was done yesterday, what will be done today, and are there any impediments? We need to make sure that teams continue to follow this until they become good at it. Ha: At this stage, teams can come up with certain deviations, like adding "any other business" as a fourth thing or completely changing it to walking the board style to fulfill their requirements. Ri: This is the stage where the flow of information happens naturally and teams do not even need to think before doing stand-ups. This is the stage where this becomes an in-built thing for the team. So with these learning stages or paths, we can see organizations leaning towards agility by getting into the heart of agile by first collaborating to understand the vision and motive, then delivering with actual intent, and then introspecting and improving based on their needs. And when people in an organization start reaching towards the Ri stage, they are then ready to do different things.



Quote for the day:

"Leadership involves finding a parade and getting in front of it." -- John Naisbitt

Daily Tech Digest - May 20, 2020

How IT and Security Leaders Are Addressing the Current Social & Economic Landscape


Despite the security and overall organizational preparedness concerns, IT and security leaders share some notes of encouragement. The majority (68%) of IT leaders agree that their technology infrastructure was prepared to adequately address employees working from home. On an even brighter note, 81% of security leaders believe that their existing security infrastructure can adequately address the current working from home demands, and 67% feel that their security infrastructure is fully prepared to handle the associated range of risks. As more and more individuals are getting their jobs done from home, 71% of IT leaders say that the current situation has created a more positive view of remote workplace policies and will likely impact how they plan for office space, tech staffing and overall staffing in the future. In order to address the new work environment due to COVID-19, 44% of IT leaders will need to acquire new technology solutions and services.



Hackers Hit Food Supply Company

DarkOwl said its analysis shows the attackers have managed to steal some 2,600 files from Sherwood. The stolen data includes cash-flow analysis, distributor data, business insurance content, and vendor information. Included in the dataset are scanned images of driver's licenses of people in Sherwood's distribution network. The threat actors posted screenshots of a chat they had with Coveware, a ransomware mitigation firm that Sherwood had hired to help deal with the crisis. The conversation shows that Sherwood has been dealing with the attack since at least May 3, according to DarkOwl's research. The screenshots also suggest that Sherwood at one point was willing to pay $4.25 million and later $7.5 million to get its data back. In an emailed statement, a Sherwood spokeswoman said the company does not comment on active criminal investigations. ... According to DarkOwl, on Monday the attackers updated Happy Blog with news of their plan to next auction off personal data belonging to Madonna.


5 Ways to Detect Application Security Vulnerabilities Sooner to Reduce Costs and Risk

Human error is always a security concern, especially when it comes to credentials. Just consider how many times you’ve heard of developers committing code only to later realize they’d accidentally included a password. These errors can lead to high-cost consequences for organizations. There are many tools that scan for secrets and credentials that can be accidentally committed to a source code repository. One example is Microsoft Credential Scanner (CredScan). Perform this scan in the PR/CI build to identify the issue as soon as it happens, so that exposed credentials can be changed before they become a problem. Once an application is deployed, you can continue to scan for vulnerabilities through the following automated continuous delivery pipeline capabilities. Unlike SAST, which looks for potential security vulnerabilities by examining an application from the inside—at the source code—Dynamic Application Security Testing (DAST) looks at the application while it is running to identify any potential vulnerabilities that a hacker could exploit.
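The kind of check such a scanner performs in a PR/CI build can be sketched as simple pattern matching over committed text. The patterns below are simplified stand-ins invented for illustration; real tools such as CredScan use much richer rule sets and entropy heuristics.

```python
import re

# Illustrative patterns: a quoted password assignment and the shape of
# an AWS access key ID. Real scanners ship hundreds of such rules.
SECRET_PATTERNS = [
    re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
    re.compile(r"AKIA[0-9A-Z]{16}"),
]

def scan_for_secrets(text):
    """Return (line_number, line) pairs that look like committed credentials."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits

sample = 'db_user = "app"\npassword = "hunter2"\n'
findings = scan_for_secrets(sample)   # flags line 2
```

Wired into the pull-request build, a non-empty result would fail the check and surface the offending line before the credential ever reaches the main branch.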


For me, it is that asynchronous programming is such a paradigm shift in a system architecture that it should be analyzed very differently from a “synchronous” system. We analyzed response times but never thought how many concurrent requests there would be at any point because, in a synchronous system, the calling system is itself limited in how many concurrent calls it can generate, because of threads getting blocked for every request. This is not true for asynchronous systems, and hence a different mental model is required to understand causes and outcomes. Any large software system (especially in the current environment of dependent microservices) is essentially a data flow pipeline, and any attempt to scale that does not expand the most bottlenecked part of the pipeline is useless in increasing data flow. We thought of pushing a huge amount of data through our pipeline by making Armor alone asynchronous, and failed to distinguish a matter of Speed (doing things faster) from a matter of Volume (doing a lot of it at the same time).
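The bottleneck point above can be captured in a few lines: a pipeline's sustainable throughput is the minimum of its stage throughputs, so making any non-bottleneck stage asynchronous (and thus much faster) changes nothing. The stage names and rates below are invented for illustration.

```python
def pipeline_throughput(stage_rates):
    """Requests/second the whole pipeline sustains = its slowest stage's rate."""
    return min(stage_rates.values())

# Hypothetical three-stage pipeline, in requests/second.
stages = {"ingress": 500, "armor": 200, "database": 50}

before = pipeline_throughput(stages)   # capped by the database stage

# Making the middle service asynchronous raises its rate enormously...
stages["armor"] = 10_000
after = pipeline_throughput(stages)    # ...but the cap does not move
```

The asynchronous middle stage now happily accepts far more concurrent work than the database can absorb, which is exactly the failure mode the excerpt describes: volume piles up at the bottleneck instead of flowing through faster.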


The downside of resilient leadership


Where does resilience come from? It’s a muscle that can be developed early on through a strong family life or a mentor relationship, or from positive experiences that help ready children and young adults for life’s tests in later years. But resilience is often also forged at young ages through adverse experiences that force children to rely on what psychologists call an “internal locus of control,” a concept developed in the 1950s by American psychologist Julian Rotter. When challenged, these young people decide that they are going to be in charge of their own fate and not let their circumstances define them. ... One of the messages these future leaders told themselves, or that was hammered into them by a parent, was “don’t be a victim.” Nobody would wish tough circumstances on another person, and yet it was in the moments of being tested that they discovered what they were made of. Adversity built a quiet confidence in them, because they went through tough times and knew they could do it again.


Why the cloud journey is hard

Conway’s Law states: “The structure of any system designed by an organisation is isomorphic to the structure of the organisation,” which means software or automated systems end up shaped like the organisational structure they’re designed in or designed for, according to Wikipedia. This could be why some organisations find it difficult to fully embrace cloud adoption as certain legacy organisational structures just don’t fit into a more demanding agile oriented cloud environment. Nico Coetzee, Enterprise Architect for Cloud Adoption and Modern IT Architecture at Ovations, elaborates: “Every company that embarks on its cloud journey can count on some deliverables not going as planned. There are many reasons for the failure of certain modernisation projects and cloud journeys, but it might come as a surprise to hear that the most common reason could be as simple as traditional structures.” If we go back to Melvin E Conway’s research on ‘How do committees invent?’ from 1967, there are some key insights.


Executive AI Fluency – Ending the Cycle of Failed AI Proof-of-Concept Projects

Executives cannot understand AI in a purely conceptual fashion. They need practical use cases for the types of AI projects they are brainstorming – and it is even better (at least initially) to have examples from their own industry or related industries. One example of a strong AI use case in banking is fraud detection. Some banks and AI vendors report having lowered their rate of false-positive results for financial fraud using predictive analytics solutions. A wide range of use cases allows leadership to better spot where AI opportunities might lie within the company and decide which of the many possible projects deserve the most attention. Banking leaders should be able to expect a chatbot solution to provide their customers with basic answers to common, simple questions. Bank leadership should not expect their chatbot to handle complex conversations, or to draw on rich context from previous email or phone conversations with the client. The technology is simply not at that level today. In this way, working with AI is more strategic than the “plug and play” nature of traditional IT solutions.
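To make the fraud-detection use case concrete, here is a minimal, purely illustrative Python sketch (the transactions, amounts, score function and threshold are all made up) of why a scored model can cut false positives compared with a blunt amount rule:

```python
# Illustrative only: compare a blunt amount rule with a scored threshold
# on hypothetical labelled transactions of the form (amount, is_fraud).
transactions = [
    (12_000, False), (150, False), (9_500, True),
    (30_000, True), (11_000, False), (500, False),
]

def blunt_rule(amount):
    # Flag every transaction over a fixed amount.
    return amount > 10_000

def risk_score(amount):
    # Stand-in for a predictive model's fraud probability.
    return min(amount / 40_000, 1.0)

def false_positive_rate(flagger):
    # Fraction of legitimate transactions the flagger wrongly flags.
    legit = [amount for amount, is_fraud in transactions if not is_fraud]
    return sum(1 for amount in legit if flagger(amount)) / len(legit)

print(false_positive_rate(blunt_rule))                     # 0.5
print(false_positive_rate(lambda a: risk_score(a) > 0.6))  # 0.0
```

A real system would learn the score from many features rather than amount alone; the point is only that a graded score lets the bank trade false positives against missed fraud, which a single cut-off cannot.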


US Treasury Warning: Beware of COVID-19 Financial Fraud

FinCEN notes that medical-related fraud scams, including fake cures, tests, vaccines and services, may require customers to pay via a pre-paid card instead of a credit card; require the use of a money services business or convertible virtual currency; or require that the buyer send funds via an electronic funds transfer to a high-risk jurisdiction. The agency notes that scams involving nondelivery of medical-related goods often occur through websites, robocalls or on the darknet. Scams involving price gouging include cases where individuals have been selling surplus items or newly acquired bulk shipments of goods - such as masks, disposable gloves, isopropyl alcohol, disinfectants, hand sanitizers, toilet paper and other paper products - at inflated prices, FinCEN explains. "Payment methods vary by scheme and can include the use of pre-paid cards, money services businesses, credit card transactions, wire transactions, or electronic fund transfers," it notes. ... "FinCEN is correct in its assertion that there will be a huge increase in all types of cybercrimes, especially related to medical scams and related cyberattacks," says former FBI agent Jason G. Weiss.


How the UK pensions industry is paving the way for open data sharing ecosystems

While some questions remain over how the regulatory standards from the pensions dashboard and Open Banking (a separate regulation focused on building transparency and open sharing into the banking industry) can be applied to a wider Open Finance initiative, the pension dashboard’s architecture — federated digital identity, User-Managed Access (UMA), and interoperability through secure Open APIs — provides a viable model for Open Finance. Crucially, these technologies conform to open standards, meaning the architecture that underpins them can be updated and synced with any new technology, preventing the formation of any legacy systems and allowing for consistent innovation. When adopted across the financial services ecosystem, they would create a variety of secure, trustworthy, and user-friendly tools that would empower users to engage more meaningfully with their finances. Picture it: financial advisors and brokers could deliver important financial advice more completely, immediately, and visibly through the kind of seamless user experiences that are currently the preserve of digital native sectors.


NCSC discloses multiple vulnerabilities in contact-tracing app

The encryption vulnerability in the beta app has arisen because the app does not encrypt proximity contact event data, and the data is not independently encrypted before it is sent to the central servers. This, said Levy, means that when data is transferred to the back-end, it is only protected by the transport layer security (TLS) protocol, so that if Cloudflare was compromised in some way, cyber criminals could access that data. He pointed out that this was something else that was sacrificed at first because of the need for speed. Finally, Levy noted some ambiguities and errors in statements made about the beta app. Among these was a statement that “the infrastructure provider and the healthcare service can be assumed to be the same entity”. This suggests that the NCSC trusts the network bridging the gap between user devices and the central NHS servers in the same way as it trusts the whole of the NHS, which is clearly not the case.



Quote for the day:


"You must learn to rule. It's something none of your ancestors learned." -- Frank Herbert


Daily Tech Digest - September 16, 2019

What is Computer Vision And The Amazing Ways It’s Used In Business

Many car manufacturers, from Ford to Tesla, are scrambling to get their version of the autonomous vehicle into mass production. Computer vision is a critical technology that makes autonomous vehicles possible. The systems on autonomous vehicles continuously process visual data, from reading road signs to seeing vehicles and pedestrians on the road, and then determine what action to take. Computer vision in medicine helps in diagnosing disease and other ailments and extends the sight of surgeons during operations. There are now smartphone apps that allow you to diagnose skin conditions using the phone's camera. In fact, 90 percent of all medical data — X-rays, scans and the like — is image-based, and much of this data can now be analyzed using algorithms. Digital marketing: by using computers to sort through and analyze millions of online images, marketers can bypass traditional demographic research, still target marketing to the right online audience, and do this work dramatically quicker than humans could. Marketers even use computer vision to ensure ads are not placed near content that is contradictory or problematic for their audience.



Research explores economic benefits of full-fibre and 5G at local level


“Knowledge-intensive sectors are shown to benefit most,” said the report. “Education and health sectors have also been shown to experience larger-than-average productivity impacts of increased connectivity… [and] there is also a likelihood for full-fibre and 5G in particular to lead to productivity improvements in industrial and manufacturing settings.” Therefore, an area with a particularly high density of knowledge workers will benefit more than an area with a relatively low density. Likewise, an area with a high concentration of manufacturing businesses, such as the West Midlands, where the UK’s first regional testbed for 5G is taking place, will benefit more than an area with a low concentration. “Many reports already estimate the benefits that full-fibre and 5G can bring to the UK economy,” said BSG CEO Matthew Evans. “But what does it mean for Manchester, Merthyr Tydfil or the Midlothian hills?”


Brain hack devices must be scrutinised, say top scientists

In future, "people could become telepathic to some degree" and being able to read someone else's thoughts raises ethical issues, experts said. This could become especially worrying if those thoughts were shared with corporations. ... Among the risks highlighted by the report was the idea of thoughts or moods being accessed by big corporations, as well as the bigger question of whether such devices fundamentally change what it means to be human. Dr Tim Constandinou, director of the Next Generation Neural Interfaces (NGNI) Lab at Imperial College London and co-chair of the report, said: "By 2040 neural interfaces are likely to be an established option to enable people to walk after paralysis and tackle treatment-resistant depression, they may even have made treating Alzheimer's disease a reality. "While advances like seamless brain-to-computer communication seem a much more distant possibility, we should act now to ensure our ethical and regulatory safeguards are flexible enough for any future development. "In this way we can guarantee these emerging technologies are implemented safely and for the benefit of humanity."


Microservices Migration Use Cases

By migrating to microservices, IT will enable your teams to become more innovative, freeing them from the mundane daily tasks of supporting and developing on a legacy system that simply cannot compete in today's competitive world. The other primary benefit customers see is scale — an elastic environment that auto-scales takes the worry out of slow performance during critical events or peak traffic seasons. This could be a retail outlet during Black Friday/Cyber Monday, or an insurance company during a natural disaster or macro-economic changes that cause a flurry of activity on Wall Street. We create value on mobile apps, with external development providing an entry point into the data center to consume our APIs. We enable hundreds to thousands of microservices through a self-service platform that lets developers publish new services and new versions as needed. All of this is automated, allowing the platform team to set boundaries on what teams can do.


It’s not easy going green – but the Internet of Things can help


Only through cross-system communication are compliance and energy efficiency possible. Much of the IoT’s value lies in its ability to integrate the various, complex components and IT systems that make up any modern building or facility. When building systems can ‘talk’ with each other, the resilience of the infrastructure is strengthened. This provides access to a greater volume of intelligence, leading to more robust compliance and better use of resources. An IoT-connected system enhances an organisation’s pursuit of greater energy efficiency, where the rapid collection of, and reaction to, massive amounts of information is essential. For example, having IoT devices and sensors integrated with a heating, ventilation and air conditioning system means that organisations can collect real-time data on energy consumption and device health. Armed with this information, organisations are empowered to take a fresh look at their current practices, generate business change and create efficiencies that cut costs and emissions. From an energy management perspective, Schneider Electric’s PowerLogic ION9000 is the ideal connected solution.
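As a toy illustration of that collect-and-react loop, the sketch below (device names, readings and baselines are all hypothetical) aggregates recent sensor samples per device and flags any device drifting above its rated draw:

```python
# A minimal sketch of the collect-then-react loop: aggregate sensor
# readings per device and flag any device whose average draw exceeds
# its rated baseline by more than a tolerance.
from statistics import mean

readings = {  # device id -> recent power draw samples in kW (made up)
    "hvac-1": [4.1, 4.3, 4.2],
    "hvac-2": [6.8, 7.1, 7.4],   # drifting above its baseline
}
baselines = {"hvac-1": 4.5, "hvac-2": 5.0}  # rated draw in kW

def devices_to_inspect(readings, baselines, tolerance=1.1):
    flagged = []
    for device, samples in readings.items():
        if mean(samples) > baselines[device] * tolerance:
            flagged.append(device)
    return flagged

print(devices_to_inspect(readings, baselines))  # ['hvac-2']
```

In a real deployment the readings would stream in continuously and the reaction (an alert, a maintenance ticket, a setpoint change) would be automated rather than printed.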


Open source and open data

First and foremost, our primary mission is “to organize the world’s information and make it universally accessible and useful.” Certainly one obvious way to make information universally accessible and useful is to give it away! Second, making these materials available stimulates scientific research outside of Google. We know we can’t do it all, and we spend a lot of time reading, understanding and often extending work done by others, some of which has been developed using tools and data we have provided to the research community. This mix of competition and cooperation among groups of researchers is what pushes science forward. Third, when we hire new employees, it’s great if they can hit the ground running and already know and use the tools we have developed. Familiarity with our software and data makes engineers productive from their first day at work. There are many more reasons to share research data, but these three alone justify the practice. We aren’t the only internet company to appreciate the power of open data, code, and open research.


New NetCAT CPU side-channel vulnerability exploitable over the network

The culprit is Intel’s Data Direct I/O (DDIO) technology, which gives peripheral devices such as network cards direct access to the processor’s internal cache to achieve better performance, lower power consumption, and higher data throughput. Before DDIO, these devices exchanged data with the CPU through RAM, whose latency can be a bottleneck. DDIO was designed with Ethernet controllers and fast datacenter networks in mind to allow servers to handle 10-gigabit Ethernet (10 GbE) connections and higher. The technology was first introduced in 2011 in the Intel Xeon E5 and Intel Xeon E7 v2 enterprise-level processor families. CPU attacks like Spectre and Meltdown and their many variants have used the CPU cache as a side channel to infer sensitive data. Researchers from the VUSec group at Vrije Universiteit Amsterdam have now shown that DDIO’s cache access can be exploited in a similar manner. In a new paper released today, the researchers described an attack dubbed NetCAT which abuses DDIO over the network to monitor access times in the CPU cache triggered by other clients connected to the same server over SSH (Secure Shell).


NHSX emphasises need for ethical patient data access


“NHS and care organisations have an obligation to protect patient data, but in my view, they also have the obligation to make best use of it,” she said. “Collaborations need to benefit everyone involved – patient lives are at stake.” Donnelly also mentioned that “citizen juries” are currently taking place to debate how patient data should be used and what constitutes a fair partnership between the NHS and researchers, charities and industry on uses of patient and operational data from the NHS. “By testing different commercial models against the principles on which our citizens are not prepared to compromise, we hope to reach a consensus on what good looks like and how best we achieve the promised benefits.” In July, a programme was launched by Public Health England and NHSX with the aim of ushering in a “new era of evidence-based self-care”, with patients increasingly expected to allow access to their personal data.


Gartner sees blockchain as ‘transformational’ across industries – in 5 to 10 years

"Once it has been combined with the Internet of Things (IoT) and artificial intelligence (AI), blockchain has the potential to change retail business models forever, impacting both data and monetary flows and avoiding centralization of market power," Gartner said. As a result, Gartner believes that blockchain has the potential to transform business models across all industries — but the opportunities demand that enterprises adopt complete blockchain ecosystems. Without tokenization and decentralization, most industries will not see real business value. The journey to create a multi-company blockchain consortium is inherently awkward, Gartner said. "Making wholesale changes to decades-old enterprise methodologies is hard to achieve in any situation. However, the transformative nature of blockchain works across multiple levels simultaneously (process, operating model, business strategy and industry structure), and depends on coordinated action across multiple companies."


Rethinking Flink’s APIs for a Unified Data Processing Framework


Flink’s existing API stack consists of the Runtime as the lowest-level abstraction of the system, responsible for deploying jobs and running Tasks on distributed machines. It provides fault tolerance and network interconnection between the different Tasks in the JobGraph. On top of Flink’s Runtime sit two separate APIs, the DataSet and DataStream APIs. The DataSet API has its own DAG (directed acyclic graph) representation for tying together the operators of a job, as well as operator implementations for different types of user-defined functions. The DataStream API has a different DAG representation as well as its own set of operator implementations. Both types of operators are implemented on disjoint sets of Tasks which are given to the lower-level Runtime for execution. Finally, we have the Table API / SQL, which supports declarative-style programming and comes with its own representation of logical operations and with two different translation paths for converting Table API programs to either the DataSet or DataStream API, depending on the use case and/or the type of sources that the program comes with.
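The layering can be caricatured in a few lines of Python. This is not Flink code: the "runtime", the operator chains and the translator below are stand-ins, meant only to show how an imperative front-end and a declarative front-end can lower to the same operator representation executed by one shared runtime:

```python
# A toy sketch of the layering: two front-end "APIs" produce the same
# operator chain, and one shared runtime executes it task by task,
# mirroring how Flink programs all lower to Tasks for the Runtime.
def run_dag(source, operators):
    # Shared runtime: push records through a chain of operator tasks.
    data = source
    for op in operators:
        data = [op(x) for x in data]
    return data

# Imperative, DataStream-style program: the user wires operators directly.
stream_program = [lambda x: x * 2, lambda x: x + 1]

def translate(expr):
    # Declarative, Table/SQL-style program: a (hypothetical) translator
    # turns the expression into the same operator representation.
    assert expr == "SELECT x * 2 + 1", "toy translator supports one query"
    return [lambda x: x * 2, lambda x: x + 1]

print(run_dag([1, 2, 3], stream_program))                  # [3, 5, 7]
print(run_dag([1, 2, 3], translate("SELECT x * 2 + 1")))   # [3, 5, 7]
```

The unification argument is visible even in the toy: once both front-ends emit the same operator chain, there is no need for two separate DAG representations or duplicated operator implementations underneath.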



Quote for the day:


"Courage is the ability to execute tasks and assignments without fear or intimidation." -- Jaachynma N.E. Agu


Daily Tech Digest - July 08, 2018

Why Big Data and AI are the Next Digital Disruptions?

Big data and Artificial Intelligence are two inextricably linked technologies, to the point that we can talk about Big Data Intelligence. Artificial Intelligence has become ubiquitous in companies in all industries where decision making is transformed by intelligent machines. The need for smarter decisions and Big Data management are the criteria that drive this trend. The convergence between Big Data and AI seems inevitable as the automation of smart decision-making becomes the next evolution of Big Data. Rising agility, smarter business processes and higher productivity are the most likely benefits of this convergence. The evolution of data management did not go smoothly. Much of the data is now stored on a computer, but there is still a lot of information on paper, despite the possibility of scanning paper information and storing it on disks or in databases. You just have to go to a hospital, an administration, a doctor’s office or any business to realize that a lot of information about customers, vendors, or products is still stored on paper. However, it is impossible to store terabytes of data produced by streaming video, text and images on paper.



Why today’s leaders need to know about the power of narratives

Effective narratives are defined by two characteristics. Firstly, they articulate the "why" - a higher purpose or common goal that helps actors overcome vested interests and form a shared identity. The first line in Satoshi Nakamoto’s eminent white paper that launched Bitcoin describes how "a purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution". Secondly, effective narratives establish cause-effect relationships that form the basis for working towards this goal. Chinese electric vehicle manufacturers and Tesla are rivals in retail markets, but also partners in propagating the idea that electric passenger vehicles are the best means for lowering carbon emissions. Narratives interact with the real world in that actors combine normative beliefs (the "why") and positive beliefs (the "how") into decisions which result in perceived outcomes that potentially trigger a change of the narrative itself. As such, narratives are categorically different from stories. Stories are self-contained, whereas narratives are open-ended.


Announcing Microsoft Research Open Data – Datasets

The goal is to provide a simple platform to Microsoft researchers and collaborators to share datasets and related research technologies and tools. Microsoft Research Open Data is designed to simplify access to these datasets, facilitate collaboration between researchers using cloud-based resources and enable reproducibility of research. We will continue to shape and grow this repository and add features based on feedback from the community. We recognize that there are dozens of data repositories already in use by researchers and expect that the capabilities of this repository will augment existing efforts. ... Datasets in Microsoft Research Open Data are categorized by their primary research area, as shown in Figure 4. You can find links to research projects or publications with the dataset. You can browse available datasets and download them or copy them directly to an Azure subscription through an automated workflow. To the extent possible, the repository meets the highest standards for data sharing to ensure that datasets are findable, accessible, interoperable and reusable; the entire corpus does not contain personally identifiable information. The site will continue to evolve as we get feedback from users.


Augmented Reality in Manufacturing is Ready for Its Closeup

Virtual/augmented reality (VR and AR) — using technology to see something that literally is not there — is coming to a manufacturing facility near you. It's actually already there: according to PwC, more than one in three manufacturers will implement virtual or augmented reality in manufacturing processes in 2018. Perhaps it will be something relatively simple, like what logistics giant DHL recently accomplished by introducing "Vision Picking", pilot programs in which workers wear smart glasses with visual displays of order-picking instructions, along with information on where items are located and where they need to be placed on a cart. The smart glasses freed pickers' hands of paper instructions and allowed them to work more efficiently and comfortably. ... "Digitalization is not just a vision or program for us at DHL Supply Chain, it's a reality for us and our customers, and is adding value to our operations on the ground," says Markus Voss, Chief Information Officer & Chief Operating Officer, DHL Supply Chain.


The Golden Record: Explained


While ‘big data’ appears to be the skeleton key that will unlock everything you want to know about your business, there’s more than meets the eye when it comes to understanding your data. Yes, clean data will unlock incredible value for your enterprise; inaccurate records, on the other hand, are a significant burden on productivity. This is why we all seek the “Golden Record”. The Golden Record is the ultimate prize in the data world: a fundamental concept within Master Data Management (MDM), defined as the single source of truth — one record that captures all the necessary information we need to know about a member, a resource, or an item in our catalogue, assumed to be 100% accurate. Its power is undeniable. ... The complexity of implementing a Master Data Management solution stems from defining the workflow that will connect our disparate data sets. First, we have to identify every data source that feeds into the dataset. Then, we must consider which fields we find to be the most reliable depending on their source. Finally, we must define the criteria that will determine when the data from one source should overwrite conflicting data from a secondary source in our MDM system.
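Those three steps can be sketched in Python. Everything here is hypothetical (the source names, the priority order, the fields), but it shows the common source-priority strategy for merging one member's records into a single golden record:

```python
# A minimal sketch of source-priority merging: walk sources from least
# to most trusted so that values from more reliable systems overwrite
# conflicting values from less reliable ones.
SOURCE_PRIORITY = ["crm", "billing", "web_signup"]  # most trusted first

records = {  # one member's record as seen by three (made-up) systems
    "crm":        {"name": "A. Smith",    "email": None,            "phone": "555-0100"},
    "billing":    {"name": "Alice Smith", "email": "a@example.com", "phone": None},
    "web_signup": {"name": "alice",       "email": "a@example.com", "phone": "555-0199"},
}

def golden_record(records, priority):
    merged = {}
    for source in reversed(priority):           # least trusted first
        for field, value in records.get(source, {}).items():
            if value is not None:               # missing values never overwrite
                merged[field] = value
    return merged

print(golden_record(records, SOURCE_PRIORITY))
# {'name': 'A. Smith', 'email': 'a@example.com', 'phone': '555-0100'}
```

Real MDM overwrite criteria are usually per-field (the CRM may be trusted for names but not emails) and may also weigh recency, which this sketch deliberately leaves out.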



Some IoT experts, taking a practical view, think the only requirement at the end-points should be to deliver secure identity, and no other complexity. Amir Haleem, CEO of Helium, which is building a decentralized network of wide-range wireless protocol gateways and a token to connect edge IoT devices, said adding complexity to end devices "is like a gigantic hurdle to people actually building things." Apart from anything else, there's the cost. "People get very sensitive about the bill of materials (BoM) when you start talking at a scale of millions or tens of millions," said Haleem. "You start proposing like a 60 cent addition to a BoM and all of a sudden that's a meaningful number." Haleem said it makes no sense for end devices, like sensors that track and monitor medicine or food supply chains, to actively participate in a blockchain, because these have to be power-efficient and cheap in an IoT setting. But delivering strong identity in the form of hardware-secured keys is essential, particularly in the face of recurring widespread vulnerabilities, botnets and the like.
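A minimal sketch of the "secure identity and nothing else" idea, using Python's standard hmac module: the device signs its readings with a per-device secret and the server verifies the signature. The key and payload here are made up, and on a real device the key would live in secure hardware rather than in code:

```python
# Sketch: an end device authenticates its readings with an HMAC over a
# per-device key, which is far cheaper than participating in a blockchain.
import hashlib
import hmac

DEVICE_KEY = b"per-device-secret"  # hypothetical; kept in secure hardware

def sign_reading(payload: bytes, key: bytes = DEVICE_KEY) -> str:
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_reading(payload: bytes, signature: str, key: bytes = DEVICE_KEY) -> bool:
    # Constant-time comparison to avoid leaking the expected digest.
    return hmac.compare_digest(sign_reading(payload, key), signature)

reading = b"temp=4.2C;shipment=vaccine-batch-17"
sig = sign_reading(reading)
print(verify_reading(reading, sig))       # True
print(verify_reading(b"temp=9.9C", sig))  # False: tampered payload rejected
```

This gives the gateway or back-end a cryptographic identity check per message while adding essentially nothing to the device's bill of materials beyond key storage.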


Can we have ethical artificial intelligence?

“Generally, the idea that needs to be adopted by the industry is an ethical design right from the very start. So, it’s no longer useful just to have ethical approval of a system once it’s done and deployed – it has to be considered from the beginning and it has to be continuously considered.” It’s clear that the problem with intelligent machines is people. Without careful checks and balances, we could find ourselves using data that is inherently biased to feed machines which would themselves become biased. And without serious consideration and action, we might also find ourselves at the whim of corporations and governments. Francois Chollet, an artificial intelligence researcher at Google, wrote in a recent blog post that AI poses a threat given the possibility of ‘highly effective, highly scalable manipulation of human behaviour.’ He also stated that continued digitization gives social media companies an ever-increasing insight into our minds, and ‘casts human behaviour as an optimization problem, as an AI problem: it becomes possible for [them] to iteratively tune their control vectors in order to achieve specific behaviours.’


The Answer to Disruptive Technology is “Education”

When disruptive technologies are addressed in education, they are usually considered in isolation. I increasingly come across discussions about “artificial intelligence,” “blockchain,” or “robots.” But the world is revolving more and more around these technologies working together. Disruptive technologies are accelerating each other’s development, creating new societal, economic, legal and commercial realities. For instance, disruptive digital technologies (operating together) are transforming the way business works. Instead of hierarchical and asset-heavy companies, we see flatter organizations/platforms with fewer assets and employees. Coordination of the assets and workers isn’t done by traditional managers, but digital technologies, sensors, and data analytics. Some even predict the end of the firm. ... Disruptors create growth by redefining performance that either brings a simple, cheap solution to the low end of a traditional market or enables “non-consumers” to solve pressing problems in their everyday lives. Employing “old world” ideas seems unlikely to work when pursuing the new.


8 Deep Learning Frameworks for Data Science Enthusiasts

The machine learning paradigm is continuously evolving. The key shift is towards developing machine learning models that run on mobile, in order to make applications smarter and far more intelligent. Deep learning is what makes solving complex problems possible. As put in this article, deep learning is basically machine learning on steroids. There are multiple layers to process features, and generally, each layer extracts some piece of valuable information. Given that deep learning is the key to executing tasks of a higher level of sophistication, building and deploying such models successfully proves to be quite the Herculean challenge for data scientists and data engineers across the globe. Today, we have a myriad of frameworks at our disposal that allow us to develop tools offering a better level of abstraction, along with the simplification of difficult programming challenges. Each framework is built in a different manner for different purposes. Here, we look at eight deep learning frameworks to give you a better idea of which framework will be the perfect fit or come in handy in solving your business challenges.


Process Simulation with the Free DARL Online Service

Trading is about transferring funds from one financial instrument to another and back again, in such a way as to have more of the first financial instrument when you've finished. In this case, the two financial instruments we will use are the pound sterling and the dollar. I'm going to start with a simulated £10,000 and trade in and out of the dollar. ... This data contains a date and the open, close, high and low exchange rates for each day. We're going to simulate trading at the close. In the file, this value is called "price". When you trade anything through an exchange, or use a high street foreign exchange, there are two sources of cost. There's normally a transaction fee and there's a "spread" which is an offset to the central rate. These are the sources of guaranteed profit to the brokers. Our simulation will have values for both of these. Simulating trading will require us to respond to a trading signal and to buy whichever currency we are told, to calculate and subtract charges, and to keep track of the value of our holding. In trading parlance, in currency trading, you can be "long" or "short" a particular currency. If we are holding sterling, we are long sterling and short the dollar, if we are holding dollars we are short sterling and long dollar.
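The mechanics described here (trade at the daily close, subtract a transaction fee, and cross a spread offset from the mid rate) can be sketched as follows. The exchange rates, fee and spread are invented, and the signals are hard-coded rather than produced by DARL:

```python
# A minimal sketch of the GBP/USD trading simulation: start long
# £10,000 and switch currencies at the close whenever the signal says
# so, paying a percentage fee and crossing a spread each time.
FEE_RATE = 0.001   # hypothetical 0.1% charge on every conversion
SPREAD = 0.0005    # hypothetical offset from the mid rate, always against us

def simulate(closes, signals, start_gbp=10_000.0):
    """Closes are USD per GBP; signals name the currency to be long."""
    holding, currency = start_gbp, "GBP"
    for price, signal in zip(closes, signals):
        if signal == currency:
            continue                                   # already long that side
        if signal == "USD":                            # sell GBP, buy USD
            holding = holding * (price - SPREAD) * (1 - FEE_RATE)
        else:                                          # sell USD, buy GBP
            holding = holding / (price + SPREAD) * (1 - FEE_RATE)
        currency = signal
    return currency, round(holding, 2)

closes = [1.30, 1.25, 1.28]
signals = ["USD", "USD", "GBP"]  # go short sterling, stay, come back long
print(simulate(closes, signals))  # ('GBP', 10128.09)
```

In this made-up run the pound weakened while we held dollars, so after fees and spread we finish with slightly more sterling than we started with; with different closes the same signals would lose money, which is exactly what the simulation is for.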



Quote for the day:


"Leaders think and talk about the solutions. Followers think and talk about the problems." -- Brian Tracy


Daily Tech Digest - December 27, 2017

Cloudops automation is the only answer to cloud complexity

The fact is that most enterprises deal with cloud operations—aka cloudops—using the native tools of their cloud providers. Although that is scalable when you’re just using one public cloud for everything, the reality is that you have to manage traditional systems built within the last 20 years, multiple public clouds, perhaps a private cloud, IoT devices, and data that exists everywhere (with no single source of truth). In other words, a huge mess. Automation does not save you from having this mess, but it helps a great deal. First, you need to understand the concept: when you automate cloudops, you’re really looking to remove the complexity by placing an abstraction layer between the complex array of systems and you, the person who needs to operate the technology. ... The trick is focusing on the broader management technology, and the automation that it provides, versus the cloud-native tools that won’t help you beyond a single public cloud.
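The abstraction-layer idea can be sketched as a thin routing facade. The providers, commands and return strings below are hypothetical stand-ins, not a real multi-cloud toolkit:

```python
# Sketch of a cloudops abstraction layer: one operations interface in
# front of several cloud-specific back-ends, so the operator never
# drives each provider's native tools directly.
class AwsOps:
    def restart(self, vm):
        return f"aws ec2 reboot {vm}"       # stand-in for the AWS call

class AzureOps:
    def restart(self, vm):
        return f"az vm restart {vm}"        # stand-in for the Azure call

class CloudOps:
    """Single entry point that routes requests to the right back-end."""
    def __init__(self):
        self.backends = {"aws": AwsOps(), "azure": AzureOps()}

    def restart(self, provider, vm):
        return self.backends[provider].restart(vm)

ops = CloudOps()
print(ops.restart("aws", "web-1"))    # aws ec2 reboot web-1
print(ops.restart("azure", "web-2"))  # az vm restart web-2
```

Adding a provider means adding one back-end class; the operator's interface, and any automation built on top of it, stays the same, which is the point the article is making about not being locked to one cloud's native tools.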


Unfortunately, while attempting to scale the lofty heights of open data and public goods, the Australian government has again failed to consider privacy implications — something that might be a concern for the 10 percent of Australians that were randomly selected to have their personal data "anonymised" and publicly released. This is no surprise; anonymising data is really, stupendously difficult. Data re-identification issues range from mildly embarrassing to serious and potentially life altering — a quick look at the Australian Medicare MBS data shows how unique some data can be. Ask yourself, what are the consequences of re-identifying the one girl in Queensland aged 5 to 15 who received "Pregnancy Support Counselling Services" — Medicare item number 4001 — in July 2016? What is embarrassing for some could be catastrophic for others.
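Uniqueness of the kind described is easy to demonstrate: count how many records share each combination of quasi-identifiers. The rows below are invented, but any combination that occurs exactly once identifies a single person and is therefore a re-identification risk:

```python
# Sketch of a uniqueness check on quasi-identifiers: any combination
# appearing exactly once in the release pinpoints one individual.
from collections import Counter

rows = [  # (state, age band, service code) - made-up released records
    ("QLD", "5-15", "4001"),
    ("NSW", "25-34", "23"),
    ("NSW", "25-34", "23"),
    ("VIC", "45-54", "104"),
    ("VIC", "45-54", "104"),
]

def unique_combinations(rows):
    counts = Counter(rows)
    return [combo for combo, n in counts.items() if n == 1]

print(unique_combinations(rows))  # [('QLD', '5-15', '4001')]
```

This is the intuition behind k-anonymity: a release is only as safe as its rarest quasi-identifier combination, and rare service codes in a narrow age band and region make some combinations vanishingly rare.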


There is one key definition of cyberwarfare, which is a digital attack that is so serious it can be seen as the equivalent of a physical attack. To reach this threshold, an attack on computer systems would have to lead to significant destruction or disruption, even loss of life. This is a significant threshold because under international law states are permitted to use force to defend themselves against an armed attack. It follows then that, if a country were hit by a cyberattack of significant scale, they would be within their rights to strike back using their standard military arsenal: to respond to hacking with missile strikes. So far this has never happened -- indeed it's not entirely clear if any attack has ever reached that threshold. That doesn't mean that attacks which fail to reach that level are irrelevant or should be ignored: it just means that the country under attack can't justify resorting to military force to defend itself. 


How Technology Is Changing Leadership

This generation of CEOs is notably different from 10 years ago – and more tech-savvy than their predecessors. As the pace of technology change has picked up, CEOs are seeing new business opportunities but are under pressure to provide a better customer experience based on a new set of technologies ranging from data analytics and IoT to cloud computing and robotic process automation. At the same time, consumers are more demanding and have higher expectations for technology to be part of their lives. “If the CEO isn’t thinking about how to leverage these disruptors to help drive top-line growth in their business and products, they are going to be left behind.” For many leaders, these technological innovations are making it possible to create more participatory organizations. Collaboration is the cornerstone of modern leadership. Rather than being “in charge,” collaborative leaders blur the lines between “boss” and “worker” and focus on team building, creative thinking


The 4 Top Security Concerns On The Minds Of Millennials

Compared to older generations, millennials are more aware of various threats, and are better able to distinguish between different levels of threats online. For example, millennials are about as cautious as baby boomers when it comes to anticipating an online banking cyberattack, with 19 percent of boomers and 14 percent of millennials believing their bank could be breached. But the generations split on social media, where 63 percent of boomers think social media is especially vulnerable to cyberattacks, compared to 45 percent of millennials. Millennials would rather learn about current threats, and increase their knowledge, than work blindly  ...  Finally, millennials tend to be more trusting of external organizations, putting their faith in major brands that have established a reputation for themselves. This makes them less worried and less active when a breach is announced, and makes them more likely to lean on external vendors to solve internal security concerns.


Marketplace for artificial intelligence services emerging in 2020

Executives across many industries are realizing they need to allow users to securely and efficiently access data without having to request authorization from multiple systems, and to build infrastructure so their teams are fully equipped to handle big data analytics. A hybrid approach allows companies to obtain the cost savings of the cloud while protecting their intellectual property and data on-premises. Digital representations of physical structures, or digital twins, have been used for years in complex 3D renderings. But innovations in data analytics and IoT have pushed advances in 3D modeling to augment business strategies and decision-making in the enterprise. In 2018, more organizations will implement digital twins to visualize complex technologies and achieve new efficiencies with an increasingly digital approach.


Hacking Visual Studio

There is a long history of extensible IDEs, probably originating from Alan Kay’s Smalltalk design. There is a range of reasons for wanting to add features: you might be tempted to create a custom color scheme, a tool to automate a mundane task, or one for moving files around. You might even need complete support for a new programming language!  ... Some features of the IDE are not accessible from these frameworks, but you can always use them if you fully understand the caveats. I’ll mention these later in this article. To use these features, you first may need to take a look at how the creators of the IDE write their code, or maybe even call their internal methods! Some of them are written in C++, but the most interesting bits are in .NET, so you won’t need to resort to anything more complex than disassembling .NET assemblies.


DDoS attacks increased 91% in 2017 thanks to IoT

DDoS-for-hire services have lowered the barriers to entry for criminals to carry out these attacks, in terms of both technical ability and cost, Ashley Stephenson, CEO of Corero, said in a press release. Now, almost anyone can systematically attack and attempt to take down a company for less than $100. And in terms of IoT risks, earlier this year the Reaper botnet targeted known vulnerabilities in IoT devices and hijacked them, including internet-connected webcams, security cameras, and digital video recorders. Each time a device is infected, it spreads the malware to other vulnerable devices, expanding the botnet's reach. "Cyber criminals try to harness more and more Internet-connected devices to build ever larger botnets," Stephenson said in the release. "The potential scale and power of IoT botnets has the ability to create Internet chaos and dire results for target victims."


Cloud migration: How to know if you’re going too fast or too slow

Although a few enterprises are slow to start—and some have yet to start—their migrations to cloud, many enterprises are blasting forward, with the funding and support to cloud-enable most of their enterprise IT by 2020. While there may appear to be a party going on that you’ve not been invited to, my advice to enterprises is to proceed to the cloud at your own measured pace. Indeed, while the growth numbers are impressive, I can’t help but think that some enterprises are moving so fast to the cloud that they are bound to make some very costly mistakes, such as not dealing with security, governance, and operations properly for cloud-based systems. I’ve been making a nice living over the last year fixing these. But the larger danger is that you’re not taking advantage of what public cloud services can offer enterprise IT—and your business.


2018 - Top 10 Transformative Technology Trends


As AI, intelligent apps, and analytics become more embedded in our lifestyle, we are beginning to create technology that augments everyday human activities. We will see designers integrating and augmenting our everyday activities such as seeing (facial recognition), reading (semantic and sentiment analysis), listening/speaking (conversational interfaces, e.g. Alexa, Siri), and emotions (affective computing). It has been predicted that we will have 75 billion connected devices by 2020! These devices will not only be connected, but will be cooperating as part of an intelligent ecosystem. These smart devices will be constantly gathering data, connecting and sharing in intelligent ways to solve the desired goals of underlying systems. As AI and intelligent things proliferate, expect to see a swarm of intelligent agents collaborating and replicating interactions in the real world.



Quote for the day:



"One of the tests of leadership is the ability to recognize a problem before it becomes an emergency." -- Arnold H. Glasow