Daily Tech Digest - December 28, 2019

Taiwanese Police Arrest Miner Accused of Stealing Millions in Power
Proof of Work was the original consensus mechanism used by Bitcoin and later adopted by the likes of Ethereum, Litecoin, and Dogecoin. PoW involves performing thousands of calculations per second to find the solution to a mathematical problem that is hard to solve but easy to verify. The Proof of Work system incentivizes miners by rewarding them with coins for each new block found. Although it remains an extremely fair and secure consensus mechanism, PoW has been criticized over the years. Much has been made, for example, of its high energy and resource requirements: the computational power needed for miners to solve complex mathematical puzzles ahead of their peers is huge. Critics lose sight of the fact that this is a feature and not a bug: the difficulty of cheating Proof of Work is what makes it so robust, and why the Bitcoin network is so valuable. Even the most well-funded adversary would struggle to obtain the hashpower necessary to control the network and double-spend coins.
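The "hard to solve but easy to verify" asymmetry can be sketched in a few lines of Python. This is a toy hash puzzle in the spirit of Bitcoin's, not the actual Bitcoin block format; the data string and difficulty are illustrative.

```python
import hashlib

def mine(data: str, difficulty: int) -> int:
    """Search nonces until SHA-256(data + nonce) starts with
    `difficulty` hex zeros -- the hard, brute-force half."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(data: str, nonce: int, difficulty: int) -> bool:
    """Verification is a single hash: cheap compared to mining."""
    digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = mine("block header", 4)          # thousands of hash attempts
print(verify("block header", nonce, 4))  # -> True, from one hash
```

Raising `difficulty` by one multiplies the expected mining work by 16 while leaving verification cost unchanged, which is why honest verification scales but cheating does not.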


DevOps in the enterprise requires focus on security, visibility

In this episode of Test & Release, Pariseau, who writes for SearchSoftwareQuality and SearchITOperations, discusses technology topics that will matter in 2020. She also shares experiences from containers, cloud and DevOps conferences such as KubeCon and DevSecCon, where diverse leaders related the many challenges associated with DevOps and Agile transformation. Success for DevOps in the enterprise starts with small wins and a consistent march toward improvement. "It's clear that enterprises have had to handle this digital transformation in phases," Pariseau said. "You have to eat the elephant one bite at a time." Take security in the SDLC. DevOps purists, she says, intended for business and security concerns to get rolled into the natural cadence of a lifecycle. However, as many teams struggle with pipeline complexities bringing DevOps to mainstream enterprise IT, those concerns took a back seat. Now, enterprises are putting security back into focus, as high-profile breaches carry potentially disastrous repercussions.


A decade of fintech megatrends
Forecasts that this sector will cross the $25 billion mark by 2025 seem grossly inadequate to me. Libra has awoken central banks, policy makers and regulators to the likelihood that a dominant, global, industry-led stablecoin may emerge. The FSB, BIS, and IOSCO are all focused on analysing the market impact of stablecoins, and central banks are reviewing their plans for digital fiat currencies. Libra may have fumbled in the early days with its own narrative, but its impact has been sensational. Following the ICO crash and the pullback of the bitcoin price in 2018, the sector has regrouped with an enterprise focus: new digital assets and derivatives, and a focus on exchange, custody and settlement infrastructure. Market leaders include R3 with its Corda platform and SIX, the Swiss stock exchange, which will partner to platform digital assets; a JP Morgan Coin for client payments; and Fidelity Digital Assets' platform for institutional clients. After Xi Jinping's comments, expect the Chinese government to push the development of blockchain technology, ahead of the application of cryptocurrencies, which are banned in China.


Remme's technology reduces reliance on passwords and human error to present a high-end security system that is simple to use without jeopardizing security. REMME uses blockchain to solve the issue of central servers that can be hacked, and to mitigate attacks such as phishing, server breaches, password theft, and password reuse. Users can utilize the free version of the system for up to 10,000 logins per month; up to 100,000 logins per month costs $199, which is inexpensive compared to its competitors. Remme is headquartered in Ukraine. It has been in existence since 2015, and its name is becoming well known in the industry. It serves a wide range of businesses, most of which have to safeguard their clients’ sensitive data, but anyone can use it, including small organizations and individuals. Remme has two essential strengths. Firstly, it uses new technology designed to be hack-proof, so it guarantees client data security and avoids possible damages or losses.


Tesla describes its solution in the patent application: “A data pipeline that extracts and provides sensor data as separate components to a deep learning network for autonomous driving is disclosed. In some embodiments, autonomous driving is implemented using a deep learning network and input data received from sensors. For example, sensors affixed to a vehicle provide real-time sensor data, such as vision, radar, and ultrasonic data, of the vehicle’s surrounding environment to a neural network for determining vehicle control responses. In some embodiments, the network is implemented using multiple layers. The sensor data is extracted into two or more different data components based on the signal information of the data. For example, feature and/or edge data may be extracted separate from global data such as global illumination data into different data components. The different data components retain the targeted relevant data, for example, data that will eventually be used to identify edges and other features by a deep learning network. ..."
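The patent's core idea of splitting sensor data into a global component and a feature/edge component can be sketched as a simple low-frequency/residual decomposition. This is purely illustrative of the concept, not Tesla's actual pipeline; the 1-D sample data stands in for real sensor signals.

```python
def decompose(signal):
    """Split a signal into a global component (the low-frequency
    part, akin to global illumination) and a feature component
    (the residual detail where edges live). Illustrative only."""
    global_component = sum(signal) / len(signal)
    feature_component = [s - global_component for s in signal]
    return global_component, feature_component

samples = [0.2, 0.9, 0.4, 0.9, 0.1]     # pretend pixel intensities
g, features = decompose(samples)

# The components are fed to the network separately, but together
# they still describe the original data losslessly.
reconstructed = [g + d for d in features]
```

The point of separating components, as the patent describes, is that each one "retains the targeted relevant data", so downstream layers can consume edge-like detail independently of global conditions such as overall illumination.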


The Year of Magecart: How the E-Commerce Raiders Reigned in 2019

While the retail giant notified customers on Nov. 15, the company has yet to release details of the attack; how many customers were impacted by the breach, for example, remains unknown. Researchers, however, believe the intruders belong to a loose grouping of cybercriminal gangs known as Magecart groups, named for their habit of skimming financial details from shopping carts and, often, the Magento e-commerce platform. This particular group had upped its game: the attackers had tightly integrated their information-gathering code into two parts of the website and had knowledge of how Macy's e-commerce site functioned, security firm RiskIQ said in a Dec. 19 analysis. "The nature of this attack, including the makeup of the skimmer and the skills of the operatives, was truly unique," said Yonathan Klijnsma, head researcher with RiskIQ, in his analysis. "I've never seen a skimmer so meticulously constructed and able to play to the functionality of the target website." The Macy's breach is the latest success for the broad class of Magecart attackers.


In its traditional configuration using value functions or policy search, the RL algorithm essentially conducts a completely random search of the state space to find an optimum solution. That it is, in effect, a random search accounts for the extremely large compute requirement for training: the more sequential steps in the learning process, the greater the search and compute requirement. The new upside-down approach (UDRL) introduces gradient descent from supervised learning, which promises to make training orders of magnitude more efficient. Using rewards as inputs, UDRL observes commands expressed as a combination of desired reward and time horizon, for example "get so much reward within so much time" and then "get even more reward within even less time". As in traditional RL, UDRL learns by simply interacting with its state space, except that these unique, self-generated commands now drive learning based on gradient descent. In short, this means training occurs against trials that were previously considered successful (via gradient descent) as opposed to completely random exploration.
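The command-conditioned idea can be made concrete with a toy sketch. This is an illustration of the concept only, not the paper's implementation: the one-step environment is invented, and a nearest-neighbor lookup stands in for the gradient-descent-trained behavior function.

```python
# Toy sketch of upside-down RL. A "behavior function" maps a command
# (desired reward, horizon) to an action, fitted on past episodes.

def step(action):
    """One-step toy environment: the action taken *is* the reward."""
    return action

# 1) Explore and record episodes as (command, action) training pairs,
#    where the command records what the episode actually achieved.
replay = []
for action in range(10):
    achieved = step(action)
    replay.append(((achieved, 1), action))

# 2) Behavior function: supervised fit of action given command.
#    (A nearest-neighbor lookup stands in for gradient descent here.)
def behavior(command):
    desired_reward, _horizon = command
    return min(replay, key=lambda ex: abs(ex[0][0] - desired_reward))[1]

# 3) Command the agent, upside-down style: "get reward 7 in 1 step".
action = behavior((7, 1))
```

Because the training pairs come from trials that actually achieved each reward, the agent imitates its own past successes instead of searching randomly, which is the efficiency claim the passage describes.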


The patent, granted earlier this month after being filed back in March 2015, outlines a system that allows users to make bitcoin payments using an email address linked to a cryptocurrency wallet. "Bitcoin can be sent to an email address," the patent filing read, detailing the advantages of the technology. "No miner's fee is paid by a host computer system. Instant exchange allows for merchants and customers to lock in a local currency price. A tip button rewards content creators for their efforts. A bitcoin exchange allows for users to set prices that they are willing to sell or buy bitcoin and execute such trades." However, the system takes 48 hours to clear a transaction once the receiver has confirmed the payment, and there doesn't appear to be support for other major cryptocurrencies. The technology could mean a big step for mainstream adoption of bitcoin—something that's been a long-term goal of Coinbase's CEO Brian Armstrong.



An essential API test verifies that an API is capable of connecting, and that it is sending and receiving data. At some level, the QA team should include security testing; API messages must verify security at both ends of a data exchange. In addition to connectivity and security, verify database validity. If the APIs allow invalid data during an exchange, the database and applications are susceptible to failure from an unexpected source. Data validity is critical for API, database and application communication. To vet these areas, make sure to test error conditions as well. The API developer should share the error codes the system will generate when it rejects an incoming message for security or data issues, when messages are in the wrong format, and when the API endpoint is down or non-functional. The QA engineer should verify that the API returns the data the IT organization expects across systems. Many applications have integrated components, such as a web portal and a mobile app.
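The error conditions listed above can be exercised as a small test suite. This sketch is hypothetical: the status codes, the required-field schema, and the classifier function are assumptions for illustration, not any particular API's contract.

```python
# Hypothetical classifier for API responses, covering the error
# classes described: security rejection, malformed message, endpoint
# down, and invalid data. Codes and schema are assumptions.

def classify_response(status, payload):
    if status in (401, 403):
        return "security rejection"
    if status == 400:
        return "malformed message"
    if status >= 500:
        return "endpoint down or failing"
    if status == 200:
        # Data validity: reject payloads missing required fields.
        required = {"id", "data"}
        if not required.issubset(payload):
            return "invalid data"
        return "ok"
    return "unexpected status"

# A QA suite deliberately triggers each error condition:
assert classify_response(403, {}) == "security rejection"
assert classify_response(400, {}) == "malformed message"
assert classify_response(503, {}) == "endpoint down or failing"
assert classify_response(200, {"id": 1}) == "invalid data"
assert classify_response(200, {"id": 1, "data": []}) == "ok"
```

The useful habit is the same regardless of the real codes: enumerate every rejection path the API developer documents and assert on each one, rather than testing only the happy path.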


A CISO Offers Insights on Managing Vendor Security Risks

"You should absolutely be applying some third-party risk assessment methodology," Decker stresses in an interview with Information Security Media Group. "Look at these third-party organizations and understand what type of security practices they have in place. You need to understand what kind of data you're putting into those systems and how important these third-party suppliers are to your operations." For inherently high-risk vendors, he says, organizations should "have a corresponding level of scrutiny and control around how those vendors are actually applying security around your systems, or as an entry point into your environment." Organizations need to ensure that the terms and conditions that they include in their contracts with vendors "not only have some technical components about the data that's going into their environment, [but also] the components where they're connecting to, a back channel," he says. They not only need to specify what kinds of controls they want vendors to have in place, but also "make sure there are the appropriate liabilities that are truly accounted for in that contract," he adds.



Quote for the day:


"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg


Daily Tech Digest - December 27, 2019

Exposed databases are as bad as data breaches, and they're not going anywhere


If your data is exposed in an unsecured database, experts say you have to treat the situation the same way you would if the data had been stolen. "You need to engage proactively in minimizing your risk," said Eva Velasquez, president of the Identity Theft Resource Center. Medical service provider Tu Ora Compass Health said the same thing to nearly 1 million patients when it revealed that its poorly configured website had exposed patient health insurance data. Patients should "assume the worst" and act as though hackers had accessed the data, the company said. What's the worst that can happen? Stolen information makes it easier for identity thieves to pretend to be you. When combined with what you share on social media, for example, your medical record number could allow someone else to use your health insurance. The Identity Theft Resource Center hosts a service called Breach Clarity that helps you decide what steps to take after your data is compromised. The advice depends on what kind of information was involved. If your log-in credentials are exposed, you'll want to reset your passwords. If it's your Social Security number, you'll want to watch your credit report for signs that someone's opening up new lines of credit in your name.



Introduction to ELENA Programming Language

Methods in ELENA are similar to methods in C# and C++ (where they are called "member functions"). Methods may take arguments and always return a result (if no result is provided, the "self" reference is returned). The method body is a sequence of executable statements. Methods are invoked from expressions, just as in other languages. There is an important distinction between "methods" and "messages". A method is a body of code, while a message is something that is sent. A method is similar to a function; in this analogy, sending a message is similar to calling a function. An expression which invokes a method is called a "message-sending expression". ELENA terminology makes a clear distinction between "message" and "method". A message-sending expression sends a message to the object; how the object responds to the message depends on the class of the object. Objects of different classes will respond to the same message differently, since they will invoke different methods. Generic methods may accept any message with the specified signature.
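The message/method distinction is easiest to see in code. The sketch below uses Python rather than ELENA syntax: the message is the name being sent, and which method body runs depends entirely on the receiver's class.

```python
# Illustrating message vs. method (in Python, not ELENA syntax):
# "describe" is the message; the method that runs is chosen by the
# receiver's class at the moment the message is sent.

class Circle:
    def describe(self):          # a method: a body of code
        return "a circle"

class Square:
    def describe(self):          # same message name, different method
        return "a square"

def send(receiver, message):
    """A message-sending expression: dispatch by name at run time."""
    return getattr(receiver, message)()

print(send(Circle(), "describe"))  # -> a circle
print(send(Square(), "describe"))  # -> a square
```

The same message sent to objects of different classes invokes different methods, which is exactly the polymorphism the ELENA terminology is drawing attention to.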


Amazon now allows developers to combine tools such as Amazon QuickSight, Aurora, and Athena with SQL queries and thus access machine learning models more easily. In other words, developers can now access a wider variety of underlying data without any additional coding, which makes the development process faster and easier. Amazon’s Aurora is a MySQL-compatible database that automatically pulls the data into the application to run any machine learning model the developer assigns it. Then, developers can use the company’s serverless system known as Athena to obtain additional sets of data more easily. Finally, the last piece of the puzzle is QuickSight, Amazon’s tool used for creating visualizations based on available data. The combination of these three tools will provide a far more efficient approach to the development of machine learning models. During the announcement, Wood also mentioned a lead-scoring model that developers can use to pick the most likely sales targets to convert.


Ranking the obstacles involved in firewall management, 67% of those surveyed pointed to the initial deployment and tuning measures, 67% cited the process of implementing changes, and 61% referred to the procedure for verifying changes. Cost is another hurdle with firewalls. Depending on the size of the organization and the type of firewall, a single unit can cost anywhere from hundreds to thousands to tens of thousands of dollars and up. Some 68% of the respondents said they have a hard time receiving the necessary initial budget to purchase firewalls, while 66% bump into difficulty getting the funding to operate and maintain them. Tweaking the rules on a firewall is yet another taxing task. Changes to code, applications, and processes can occur fast and furiously, requiring frequent updates to firewall rules. But a single firewall update can take one to two weeks, according to the survey. And such changes can sometimes be trial and error. More than two-thirds of the respondents cited the difficulty of testing changes to firewall rules before deploying them. The lack of a proper testing platform can lead to misconfigured rules that break applications.


Hugh Owen, Executive Vice President, Worldwide Education at MicroStrategy asserts "Enterprise organizations will need to focus their attention not just on recruiting efforts for top analytics talent, but also on education, reskilling, and upskilling for current employees as the need for data-driven decision making increases—and the shortage of talent grows." Skills shortages show up everywhere, especially in AI. John LaRocca, Managing Director for Europe/NA Operations at Fractal Analytics, comments that "The demand for AI solutions will continue to outpace the availability of AI talent, and businesses will adapt by enabling more applications to be developed by non-AI professionals, resulting in the socialization of the process."  In that same vein, noted industry expert Marcus Borba, at Borba Consulting, remarks, in a report from MicroStrategy, that "the demand for development in machine learning has increased exponentially. This rapid growth of machine learning solutions has created a demand for ready-to-use machine learning models that can be used easily and without expert knowledge."


Google Publishes Its BeyondProd Cloud-native Security Model

In zero-trust networking, protection of the network at its outer perimeter remains essential. However, going from there to full zero-trust networking requires a number of additional provisions. This is by no means easy, given the lack of standard ways to do it, adds Brunton-Spall: "You can understand [it] from people who've done this, custom-built it. If you want to custom build your own, you should follow the same things they do. Go to conferences, learn from people who do it." Filling this gap, Google's white paper sets out a number of fundamental principles which complement the basic idea of no trust between services. Those include running code of known provenance on trusted machines, creating "choke points" to enforce security policies across services, defining a standard way to roll out changes, and isolating workloads. Most importantly, these controls mean that containers and the microservices running inside them can be deployed, communicate with one another, and run next to each other, securely, without burdening individual microservice developers with the security and implementation details of the underlying infrastructure.


What if we’re leading change all wrong? The book “Make it Stick: The Science of Successful Learning,” by Peter C. Brown, Henry L. Roediger III and Mark A. McDaniel highlights stories and techniques based on a decade of collaboration among eleven cognitive psychologists. The authors claim that we’re doing it all wrong. For example, we attempt to solve the problem before learning the techniques to do so successfully. Using the right techniques is one of the concepts that the authors suggest makes learning stickier. Rolling out data-management initiatives is complex and usually involves a cross-functional maze of communications, processes, technologies, and players. Our usual approach is to push information onto our business partners. Why? Well, of course, we know best. What if we changed that approach? This would be uncomfortable, but we are talking about getting other people to change, so maybe we should start with ourselves. Business relationship managers stimulate, surface, and shape demand. They’re evangelists for IT and building organizational convergence to deliver greater value. There’s one primary method to accomplish this: collaboration.


Setting Management Expectations in Machine Learning

Business leaders often forget that machine learning algorithms are not a panacea that can be thrust into a given use case and expected to magically deliver value on their own. Algorithms rely on large, accurate, datasets to train and generate predictions. Data science is just the end result of a long process of data collection, cleansing, and tagging that requires significant investment. That’s why it’s important to have a robust Data Governance strategy in place at your business. Unfortunately, management often forgets this. Having failed to make the necessary investments in Data Governance, they nonetheless expect their data scientists to “figure it out.” Even where management has made the necessary investments in Data Governance and you have access to a large, healthy, internal dataset, there are certain functions you will still have difficulty performing. These most prominently include anything that requires you to leverage customer data. The frequency of widespread breaches and scandals involving the misuse of data, along with the accompanying rise in government regulation, has made it more difficult than ever to leverage customer data within businesses’ ML systems.



"As more states follow California's lead and push forward with new privacy laws, we'll likely see increased pressure on the federal government to take a more proactive role in the privacy sphere," said Mary Race, a privacy attorney in California. The Senate Commerce Committee held a hearing in December to discuss two potential frameworks, both of which seek to set a federal standard and designate regulators to enforce the law. Lawmakers expressed bipartisan support for privacy laws, though no legislation has moved forward. Still, several key aspects of a prospective law were up for debate at the hearing. The Republican framework, submitted by Sen. Roger Wicker of Mississippi, would preempt state data privacy laws, and would limit enforcement to the FTC. Sen. Maria Cantwell of Washington, who submitted the Democratic bill, has said she's considering letting consumers directly sue companies; her bill would not supersede state laws. While federal law supersedes state law in general, many federal laws leave room for states to enact tougher requirements on top of the baseline set by US legislators.



How Data Subject Requests are at the heart of protecting privacy

Not only has data proliferated, but it’s also mutated into derivative forms. Customer data is often collected across multiple channels without being linked to a master identifier, and the definition of what is considered PII is continuing to change. The other reason the DSR search process is difficult is that many organizations still rely on questionnaires and spreadsheets for data discovery. These manual processes are inefficient at best, and incredibly inaccurate at worst. Consider that a single bank transaction might be replicated across 100 systems. Successfully fulfilling a DSR for that customer could require multiple people to manually search all those systems, and the accuracy and completeness may be questionable. Not only would the individual’s privacy be compromised, but the bank would also have to defend the results with regulators. In an age of big data and automation, relying on manual processes to fulfill privacy laws seems unbelievably arcane, if not impossible given the sheer volume of data companies have. Fortunately, many organizations are beginning to realize the complexity and importance of the DSR process and are looking to automate it.
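Automating the DSR search the passage calls for can be sketched simply: gather every record referencing a subject across all systems in one pass, instead of having multiple people work spreadsheets. The system names, record layout, and `customer_id` key below are hypothetical.

```python
# Sketch of automating a Data Subject Request search across systems
# that hold replicated customer data (all names/fields invented).

systems = {
    "crm":     [{"customer_id": "c42", "email": "a@example.com"}],
    "billing": [{"customer_id": "c42", "last_invoice": "2019-12-01"},
                {"customer_id": "c99", "last_invoice": "2019-11-15"}],
    "support": [{"customer_id": "c99", "ticket": "refund"}],
}

def fulfill_dsr(subject_id):
    """Collect every record referencing the subject, tagged by
    system, so the result is complete and defensible to regulators."""
    hits = {}
    for name, records in systems.items():
        found = [r for r in records if r.get("customer_id") == subject_id]
        if found:
            hits[name] = found
    return hits

report = fulfill_dsr("c42")   # finds records in "crm" and "billing"
```

Real deployments face the harder problem the text notes, namely records that are not linked to a master identifier, which is why commercial DSR tools invest heavily in identity resolution before any search like this can run.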



Quote for the day:


"People not only notice how you treat them, they also notice how you treat others." -- Gary L. Graybill


Daily Tech Digest - December 26, 2019

Decade in review: Reflections on the last 10 years in the tech industry

Technology has irreversibly gone from the sole province of the back office to a key element of most organizations' products and services, and oftentimes a strategic and competitive differentiator. This transition is furthering a trend from earlier in the decade, whereby technology in some organizations was splitting between core "keep the lights on" services in the back office, and technologies that powered products and transformational initiatives. In extreme cases, the CIO has become a utility player while other functions like marketing or product development get the preponderance of a company's technology spend. On the other extreme are CIOs who have become brokers of technology services that power marketing, product development, and digital transformation while pushing management of back-office systems to staff or an external vendor. As back-office systems increasingly become commodities that can be purchased from a cloud vendor, it appears that the operationally oriented CIO will become increasingly less important and disappear from the executive ranks at many companies.


Corporate IT training gets high profile in 2020


Multiple training methods may prove necessary, but Becirovic advised establishing a common platform for delivery -- instead of creating a series of one-off training vehicles. The company's Accenture Future Talent Platform plays that unified-platform role. "It's difficult to create a one-size-fits-all model for upskilling talent," Becirovic said of platform-building. "But the most effective approaches focus on 'learning anytime, anywhere' through digital technologies. Mobile- and tablet-based learning are gaining traction -- mobile learners study on average 40 extra minutes per week." Becirovic also cited the power of social media and collaboration. "Engaging learners through social collaboration enhances learning," he said, noting employees have been found to spend three times more time on social-enabled tools.


AI makes inroads in life sciences in small but significant ways: Lantern Pharma’s quest

Lantern is working on three different therapies that had been left aside after showing some progress in clinical trials but not winning approval. They address a variety of cancers, including prostate, ovarian, and non-small-cell lung cancer. Lantern is a tiny company with just nine full-time employees and three contractors. They are not going to run massive, multimillion-dollar drug trials on their own. But what they can do is test drugs in simulation before a trial happens and then partner with larger firms. The company's software platform is called "RADR," which stands for "Response Algorithm for Drug Positioning and Rescue." Not all of this is artificial intelligence. The process starts with choosing which of thousands of genes are likely responsive based on historical statistics about those genes, what is known as "feature selection." That process leads to a shortlist of tens of genes that may be responsive. Lantern takes tissue samples from prospective patients and tests the individuals' unique genetic profiles, looking for the combination of genes that represents a "signature" that may be predictive of drug response.
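The "feature selection" step described, ranking thousands of genes down to a shortlist of candidates, can be sketched as a simple correlation ranking. The gene names, expression values, and drug-response numbers below are invented for illustration; this is not RADR's actual algorithm.

```python
# Toy feature selection: rank genes by absolute correlation between
# expression and drug response, then keep a shortlist. Data invented.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

expression = {                       # gene -> expression per patient
    "GENE_A": [1.0, 2.0, 3.0, 4.0],
    "GENE_B": [4.0, 1.0, 3.0, 2.0],
    "GENE_C": [2.0, 2.1, 1.9, 2.0],
}
response = [0.9, 2.1, 2.9, 4.2]      # drug response per patient

def shortlist(k):
    ranked = sorted(expression,
                    key=lambda g: -abs(pearson(expression[g], response)))
    return ranked[:k]

top = shortlist(1)   # GENE_A tracks the response most closely
```

The same shape of computation, scaled from three genes to thousands and from a toy correlation to historical statistics, produces the "shortlist of tens of genes" that feeds the later signature-finding step.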



A Transformation Journey for a Distributed Development Organization

This vision dictates an Agile organization and setup. We’re aware there is no end to bringing better and faster customer value. This is an ongoing journey, where we strive for perfection, sometimes taking major steps, but mostly changing minor things. Those minor things add up and create a difference. One of the biggest challenges in our Agile organization is reusing some of the existing skill sets in teams, while hiring new teams in different locations for the new domains. This approach allows us to use the strengths of the company, but brings a challenge of creating one big team with a single goal. This also makes our experience stronger and more educational for other setups inside and outside the company. Basically, we reuse our existing multi-functional printer (MFP) related technical skills, as well as sales and support skills, and add newly hired teams for new functionalities that do not have any counterpart in the enterprise.


Running Android on PC: A Developer’s Overview


There are a number of different ways to do this, but keep in mind that you can’t run virtual machines on Windows Home; you need the Pro, Enterprise or Education editions. Memu Play is an application that runs an Android emulator; it’s targeted at games. It runs on your Windows PC and integrates mouse and keyboard input. It’s free but shows adverts. Despite uninstalling Hyper-V from my Windows 10 machine and enabling VT-x in the BIOS (and rebooting way too many times), I was unable to make Memu Play work (or play, if we want to be punny) because it claimed I was still running Hyper-V. On my wife’s laptop, it ran fine and was very slick. The website has links to many popular Android games that you can download and run. Like Memu Play, Bluestacks is another emulator focused on Android games; moreover, it claims a speed advantage over Android smartphones. It uses Android N (7.1.2). This potentially isn’t a problem, as my experience with Android compared to iOS is that there’s longer support for games on older OS versions. GenyMotion takes a different approach, with two offerings targeted at developers: desktop or cloud-based.


The Bug That Got Away

There are countless systems, big and small, that are just riddled with the things. As an engineer I know this very well, as I've contributed to my fair share of them. I've been a software engineer for over ten years and I've always considered myself to be thorough, especially when it comes to tracking down a bug: the research, the deep diving, and finally, the fix. As with any bug, one of the first steps to fixing it is being able to reproduce it. I spoke with our QA team and they weren't immediately able to reproduce it, but mentioned they would look into it further. Hours pass and I receive another message, something to the effect of: QA Person: "Rion, I just spun up a fresh new environment and I can reproduce the issue!" At this point, I'm excited. I had been fighting with this for over a day and I'm about to dive down the bug-fixing rabbit hole on the way to take care of this guy. I log into the new environment, and sure enough, QA was right! I can reproduce it! I should have this thing knocked out in a matter of minutes and my day is saved!


What is WireGuard? Secure, simple VPN still in development

For one, the WireGuard protocol does away with cryptographic agility -- the concept of offering choices among different encryption, key exchange and hashing algorithms -- as this has resulted in insecure deployments with other technologies. Instead, the protocol uses a selection of modern, thoroughly tested and peer-reviewed cryptographic primitives that result in strong default cryptographic choices that users cannot change or misconfigure. If any serious vulnerability is ever discovered in the crypto primitives used, a new version of the protocol is released, and there’s a mechanism for negotiating the protocol version between peers. WireGuard uses ChaCha20 for symmetric encryption with Poly1305 for message authentication, a combination that’s more performant than AES on embedded CPU architectures that don’t have cryptographic hardware acceleration; Curve25519 for elliptic-curve Diffie-Hellman (ECDH) key agreement; BLAKE2s for hashing, which is faster than SHA-3; and a 1.5 Round Trip Time (1.5-RTT) handshake that’s based on the Noise framework and provides forward secrecy.
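One of these primitives is easy to try directly: BLAKE2s ships in Python's standard `hashlib` module, including the keyed mode that lets it serve as a MAC without an HMAC construction. This only demonstrates the primitive itself, not WireGuard's handshake or key schedule; the input bytes and key are placeholders.

```python
import hashlib

# BLAKE2s, the hash WireGuard favors, is in Python's standard library.
digest = hashlib.blake2s(b"handshake transcript").hexdigest()

# Keyed mode: BLAKE2s doubles as a MAC, no separate HMAC needed.
mac = hashlib.blake2s(b"message", key=b"shared-secret").hexdigest()

print(len(digest))   # 32-byte digest -> 64 hex characters
```

A different key yields a completely different MAC over the same message, which is the property a protocol relies on to authenticate peers.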


How Amazon customer experience became e-commerce standard

Technology can partially bridge the divide and make it more of a fair fight, say analysts, e-commerce cloud vendors and retailers. Different parts of the Amazon customer experience can be replicated in order to meet consumer -- and, increasingly, business -- buying expectations for which Amazon has set the standard. Cloud e-commerce vendors such as BigCommerce, Shopify, Adobe Magento, Salesforce and Oracle must provide customers with payment processing, a shipping network and SEO to infuse their shopping sites with as many elements like the Amazon customer experience as they can. On top of all that, they also must enable the ability to sell on or off Amazon's marketplace. "Amazon set the bar for a lot of the Western world for integrated, end-to-end customer experience," said Des Cahill, Oracle head CX evangelist. "It's come into B2B as well; it's not just a B2C phenomenon. We build into our platform technologies and services that will enable our customers to deliver that same Amazon-like consistency and personalized experiences."


SaaSOps: The next step in Software as a Service evolution

SaaSOps is a result of the explosion of SaaS in the enterprise. The term is new, but the concept has been gaining momentum for quite some time. You may have heard it being referred to as everything from digital workplace ops, to IT operations, to SaaS administration, to cloud office management and end-user computing, just to name a few. But, ultimately, the gist is the same. SaaSOps is a set of disciplines—all the new responsibilities, processes, technologies, and people you need to successfully enable your organization through SaaS. ... SaaSOps ultimately unlocks the potential SaaS can have on any given organization: increased productivity, better collaboration, and a happier workforce. In a world where SaaSOps is widely adopted—which I predict will be in the next 3 to 5 years—users can achieve optimum levels of productivity through SaaS, and IT can effectively manage the proliferation of these best-in-breed applications. When companies first start their SaaS journey, adoption is low.


IT: Managing Choice, Change, Careers in 2020

Speaking of strategic value, if IT can’t deliver that, their role risks being diminished or outsourced altogether. This is why managing both choice and change are so important. Having a world-class understanding of cloud technology alone isn’t enough for success career-wise when these other elements are having such a strong impact on the business. Deep technology knowledge may have been sufficient with the PBX since it didn’t have much impact on the business beyond providing dial tone. There was nothing transformational about the PBX, so IT didn’t really need to be concerned beyond providing reliable telephony service. The landscape has shifted dramatically with the cloud, and that shift is key to how IT’s role is changing. To make that point, I’ll return once more to Krapf’s talk, where he shared some findings from Enterprise Connect’s 2018 Salary & Career Survey. Below is a comparison of the skillsets IT believes will be important for career success going forward, along with where their current skills are strongest.



Quote for the day:


"A healthy attitude is contagious but don't wait to catch it from others. Be a carrier." -- Tom Stoppard


Daily Tech Digest - December 25, 2019

Is AI the Antidote to Network Complexity?

In what might sound like the plot of a “Terminator” movie, Gold says ML and later AI will play a big part in making this complexity manageable. He said EXFO sees ML being used to learn and identify what normal network traffic looks like to better identify the root cause of an issue when it crops up. Needless to say, our IoT devices probably won’t be turning against us anytime soon. But while EXFO is preparing to launch an AI- and ML-based AIOps platform in the near future, other vendors like Masergy, Nyansa, and Vitria have already opened their arms to our AI overlords. In fact, the degree of visibility made possible by AI has implications for more near-term applications than 5G and large-scale industrial IoT, including gains in operational efficiency, SD-WAN, broadband performance, compliance testing, and improved customer experiences. ... Nyansa’s Voyance AIOps platform works by pulling in real-time and historical data from devices on the network. This information is then processed using a series of ML and AI algorithms designed specifically to solve network challenges.


WebAssembly hasn’t grabbed JavaScript developers

JavaScript developers are still getting their feet wet with WebAssembly, a survey of JavaScript developers has found. Survey data published in the State of JavaScript 2019 report issued this week indicates that WebAssembly has drawn plenty of attention but not many users among JavaScript developers. Only 8.6 percent – 1,444 developers – of the 16,694 persons who said they were aware of WebAssembly had actually used it. WebAssembly has been highly touted as a mechanism to speed up web applications and support the use of languages such as C, C++, and Rust for client-side and server-side web development. JavaScript is a compilation target for WebAssembly. The State of JavaScript 2019 report, produced by developers Sacha Greif and Raphael Benitte, is based on a survey of 27,717 JavaScript developers worldwide, who weighed in on everything from JavaScript language features and flavors to frameworks, utilities, editors, and testing tools.
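A quick sanity check of the survey arithmetic confirms that the quoted figures are internally consistent:

```python
# 1,444 WebAssembly users out of 16,694 aware respondents
# should work out to the quoted 8.6 percent
aware_of_wasm = 16_694
used_wasm = 1_444
share = used_wasm / aware_of_wasm * 100
print(round(share, 1))  # 8.6
```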


AI Is Transforming Real-Time Data Governance
By adding AI to the mix, businesses can detect anomalies. For instance, if there is a breach in a data center, management can train an AI-based solution to identify any cyber attack. For this purpose, it goes through machine learning algorithms and consumes voluminous amounts of data. As a result, when a cyber threat emerges, AI can pick out the pattern and notify the authorities in time before data is compromised. This also means that AI can add a lot of automation in the privacy, compliance, and security of data. Hence, companies can ensure that they have a 24/7 protector that, unlike human resources, can tirelessly monitor their data transmissions.  AI makes sure that data reaches the right user without getting intercepted by cybercriminals who may employ man-in-the-middle, spear phishing, ransomware, spyware, or any other cyber attack. Essentially, AI is democratizing data governance. For instance, AI is used in automated process discovery to analyze behavioral data that is generated during data processing. In this way, digital records are derived from behavioral data.
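As an illustration of the pattern-spotting idea described above, here is a deliberately simple anomaly check: a z-score test against a learned baseline. Real AI-based monitoring is far more sophisticated; the traffic numbers and threshold below are invented for the sketch:

```python
import statistics

def is_anomalous(history, value, threshold=3.0):
    """Flag `value` if it sits more than `threshold` standard deviations
    from the mean of the baseline -- a crude stand-in for the learned
    'normal' model the article describes."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return stdev > 0 and abs(value - mean) / stdev > threshold

# Invented traffic volumes (requests per minute) for the sketch
baseline = [100, 102, 98, 101, 99, 103, 97, 100]
print(is_anomalous(baseline, 5000))  # True  -- a spike worth alerting on
print(is_anomalous(baseline, 101))   # False -- within normal variation
```

The value of automating even a check this simple is the 24/7 property the article mentions: it runs on every observation without fatigue.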


There is no denying that JWT is a cool breeze and a relief from the feature insanity of OAuth. I once spent a week trying to understand OAuth, and I had to give up. There was simply no way I could wrap my brain around it. I could explain JWT to a 5-year-old child in less than 5 minutes. If OAuth is a scrapyard of madness and radioactive waste, JWT is a green field swimming in warm rays of sun after the morning dew has sprinkled the fresh grass. A JWT consists of three simple parts: a header describing the token, a payload that's the actual token, and a cryptographically secured signature, ensuring the token was created by a trusted source. All three components are Base64-encoded, separated by a ".", concatenated, and normally provided as a Bearer token in the Authorization HTTP header of your HTTP REST invocations — dead simple, in fact. This is secure because some sort of "secret" is used when creating the signature, which is the last part of the token. Without the secret, you might as well try to brute-force the unified theory of science. The thing is solid as a rock! Yet as simple as a cup of coffee on a Sunday morning.
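The three-part structure is simple enough to reproduce with nothing but the standard library. A minimal, illustrative HS256 sketch (the claims and secret are made up, and a real deployment should use a vetted JWT library rather than hand-rolled code):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{signature}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, body, signature = token.split(".")
    expected = b64url(
        hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    )
    # Constant-time comparison avoids timing side channels
    return hmac.compare_digest(signature, expected)

token = make_jwt({"sub": "user-123"}, b"shared-secret")
print(token.count("."))                        # 2 -- three dot-separated parts
print(verify_jwt(token, b"shared-secret"))     # True
print(verify_jwt(token, b"wrong-secret"))      # False
```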


Certainly, McLaren’s Applied Technologies arm is growing, using its knowledge of sensors, data collection and analytics to drive innovation in other industries. As well as being the sole supplier of batteries for Formula E, a class of motorsport that uses only electric cars, the business is also the main supplier of temperature and pressure sensors in F1. It also supplies its ECU to other F1 and Indy Car teams. “We are moving from engineering services,” says Neale, a nod towards increased digital innovation and development. “As well as connected technologies in other sports such as football and rugby, we are working in healthcare and the public transport sector.” ... “We need 5G now,” says Neale, claiming that this is the missing connectivity piece that will enable increased capability at the edge. He sees this as an essential development leap, the sort of advance that will create an acceleration in innovation and increased efficiency in operations.


“The evolution of business applications from monolithic constructs to flexible containerized workloads necessitates the evolution of the edge itself to move closer to the application data,” Uppal said. “This, in turn, requires the enterprise network to adjust and meet and exceed the requirements of the modern enterprise.” Such changes will ultimately make defining what constitutes the edge of the network more difficult. “With increased adoption of cloud-delivered services, unmanaged mobile and IoT devices, and integration of networks outside the enterprise (particularly partners), the edge is more difficult to define. Each of these paradigms extend the boundaries of today's organizations,” said Martin Kuppinger, principal analyst with KuppingerCole Analysts AG. “On the other hand, there is a common perception that there is no such perimeter anymore with statements such as “the device is the perimeter” or “identity is the new perimeter”. To some extent, all of this is true – and wrong. There still might be perimeters in defined micro-segments. But there is not that one, large perimeter anymore.”


AI strategy is still pretty new—and for many organizations, nonexistent. According to a recent IDC survey, half of responding businesses believed artificial intelligence was a priority, but just 25 percent had a broad AI strategy in place. ... And government is behind the private sector on the strategy curve. In a 2019 survey of more than 600 US federal AI decision-makers, 60 percent believed their leadership wasn’t aligned with the needs of their AI team. The most commonly cited roadblocks were limited resources and a lack of clear policies or direction from leaders. But a coherent AI strategy can attack these barriers while building a compelling case for funding. A winning plan establishes clear direction and policies that keep AI teams focused on outcomes that create significant impacts on the agency mission. Of course, strategy alone won’t realize all the benefits of AI; that will require adequate investments, a level of readiness, managerial commitment, and a lot of planning and hard work. But an effective AI strategy creates a foundation that promotes success.


Customer-driven open source is the future of software 

Despite the fact that open source has never been more broadly used, apparently we’re in an “open source sustainability” crisis. It’s the same “crisis” we’ve been in for the past 20 years, with persistent warnings that this can’t last. ... Red Hat CEO Jim Whitehurst has been agitating for customer-driven open source for well over a decade: “Ultimately, for open source to provide value to all of our customers worldwide, we need to get our customers not only as users of open source products but truly engaged in open source and taking part in the development community.” There are many reasons why such customer-driven (or user-driven) innovation might be best, but Linux veteran Matt Wilson put it this way: “If I can risk predicting the future, I think you’ll see a lot more new open source software emerging from companies that are building it and using it to solve their business problems. And it will be better because of a positive feedback loop of putting the software into an applied practice.”


How Big Data And AI Work Together

Big data services companies saw little use in earlier times, but they are now in demand, and this is because of the development of artificial intelligence and machine learning. Earlier, having big data was a problem, as there were no methods to interpret such a large amount and wide variety of data. That has changed now. Artificial intelligence and machine learning help a lot in interpreting all different types of data, be it in textual form or pictures, videos, etc. With the help of artificial intelligence and machine learning, enterprises are able to unlock a whole range of uses for the big data they hold and acquire continuously. Let’s understand a bit more. The relationship between artificial intelligence and big data, or big data analytics solutions, is reciprocal. Artificial intelligence is somewhat meaningless without big data. If artificial intelligence does not have any data, how or on what will it work? One can say that it is mainly made for this purpose, to tackle large amounts of data and extract meaningful insights from it.


DeepMind Builds Neural Networks that Simulate Imagination

Imagination is one of those magical features of the human mind that differentiate us from other species. From the neuroscience standpoint, imagination is the ability of the brain to form images or sensations without any immediate sensorial input. Imagination is a key element of our learning process, as it enables us to apply knowledge to specific problems and better plan for future outcomes. As we execute tasks in our daily lives, we are constantly “imagining” potential outcomes in order to optimize our actions. Not surprisingly, imagination is often perceived as a foundational enabler of planning from a cognitive standpoint. Incorporating imagination into artificial intelligence (AI) agents has long been an elusive goal of researchers in the space. Imagine AI programs that are not only able to learn new tasks but also to plan and reason about the future. Recently, we have seen some impressive results in the area of adding imagination to AI agents in systems such as AlphaGo.



Quote for the day:


"Leadership is the wise use of power. Power is the capacity to translate intention into reality and sustain it." -- Warren Bennis


Daily Tech Digest - December 24, 2019

The neural basis of confirmation bias

Researchers combined functional magnetic resonance imaging (fMRI) with the behavioral task. Participants’ blood oxygen level-dependent (BOLD) variables were examined through moderated mediation analysis, capturing a relationship between brain activity and multiple levels of performance, and testing whether the mediation is different for conditions of agreement and disagreement. When participants learned their partners agreed with their opinions, they significantly increased their bets, thus confirming they were confident with their decision. Participants only slightly decreased their wagers when their partners disagreed. The impact of the partner’s opinion was far greater when it confirmed the player’s judgment, and the partner’s opinion was more likely to be disregarded when it was contradictory — consistent with confirmation bias. The functional brain imaging data revealed a region whose activity modulation was associated with decision-making and memory. The posterior medial prefrontal cortex mediated the strength of confirming opinions over disconfirming opinions, and tracked agreements more closely than disagreements.



Building Tech Solutions And Giving Back

One of my favorite success stories is a lady that had gone through Coburn Place as a resident, which means she was abused and needed to find shelter for herself and her children. We were doing our Holiday Home party where we provide Christmas trees, ornaments and other holiday decorations to the residents to have for their apartments. She came in with a pink t-shirt, her hair was pink and she came up to me asking for pink lights for her Christmas tree. She had a great smile and her positive attitude was contagious. I asked her why she wanted pink lights and she said her cancer was gone and she wanted everything to be pink for the Holidays. For any person to have gone through what she did and to have that positive attitude and outward happiness to me, defines success. ... Focusing on what we know about our industry and using it to educate the next generation through their teachers will provide more connection between what the kids are learning and a potential career in technology.


4 tips to help keep your APIs safe

Even when enterprises do all the right things and make sure everything is protected, they can still be at risk of breaches or attacks thanks to third-party services. Waugh said a fair chunk of the breaches he sees are not direct attacks on a company system but a compromise of a third party that has access to that system to process data. "As an industry, we do a really poor job of understanding risk when it comes to third parties. As much as we work to keep ourselves secure, we have a very limited understanding of what third parties we have out there. How do we secure those?" he said. Companies should have a thorough understanding of third-party partners accessing their data, sending them security questionnaires, requesting certifications or demanding reports. But even this, Waugh said, can still leave companies vulnerable to attacks. Last year, India's national ID database, which has identity and biometric information like fingerprints and iris scans on more than 1.1 billion registered Indian citizens, was exposed through a vulnerable API. According to ZDNet's Zach Whittaker, a utility provider, Indane, had access to the Aadhaar database through an API, which the company relies on to check a customer's status and verify their identity.


Scientists Develop World’s First ‘Unhackable’ Encryption System

The chip designed by the researchers generates a one-time-only key when data is sent through it. The data is stored as light and passed through a specially designed chip that bends and refracts the light to scramble the information. The trick behind the tech is that the bending and refracting of light is unique every time, as it depends upon the data being sent through the chip. It would be safe to say that the chip is a physical realization of the OTP mechanism which is popularly used today to authenticate several services. A paper published in the Nature journal titled “Perfect secrecy cryptography via mixing of chaotic waves in irreversible time-varying silicon chips” says that the new technology exploits correlated chaotic wavepackets, which are mixed in inexpensive and CMOS-compatible silicon chips. The specially engineered chips can deliver 0.1 Tbit of different keys for every mm of the length of the input channel. According to Professor Andrea Di Falco from the School of Physics and Astronomy at St Andrews University, “It’s the equivalent of standing talking to someone using two paper-cups attached by string.”
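The one-time pad principle the chip physically realizes can be sketched in a few lines. This is only the classical OTP idea (XOR with a truly random, never-reused key of equal length), not a model of the photonic chip itself; the message is invented:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the key; XOR is its own inverse, so the
    # same function both encrypts and decrypts
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # one-time: must never be reused
ciphertext = otp(message, key)
print(otp(ciphertext, key) == message)  # True
```

The scheme is information-theoretically secure only as long as the key is random, as long as the message, and used exactly once, which is why key generation and distribution (the problem the chip attacks) is the hard part.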


How Can a Digital Twin Create a Seamless Workplace for Employees?

The way we work is changing. If you look at industry news headlines, you’ll see articles about the rise of shared working spaces, flexible work hours and remote working. Yes, Millennials and Gen Z’ers are big drivers of this change, but the conversation isn’t limited to the younger generations. In fact, in today’s economy, there is a multigenerational global talent war in many industries — like tech, finance and telecom — where workers of all ages and demographics are asking for flexible working arrangements, remote work and a more holistic perspective on productivity in exchange for their loyalty. Moreover, in our increasingly connected world, employees want their office environments to be as smart as their homes, cars and digital communities, with the ability to create a personalized experience. However, many traditional office buildings still operate in the “dark” with limited use of modern technology and little ability for employees to interact dynamically with their environment.


Slowing Data Security Tool Sprawl in a Hybrid Multicloud World

As infrastructure-as-a-service (IaaS), software-as-a-service (SaaS) and database-as-a-service (DBaaS) consumption becomes commonplace for enterprises, their data is becoming more dispersed than ever, making it extremely difficult for organizations to discover, visualize and protect their sensitive data across multiple environments. The same IBV study found that only 38 percent of organizations have the procedures and tools in place to operate a multicloud environment. Moreover, as data and workflows continue to move to the cloud, security teams are becoming inundated with security and compliance point tools, each designed to be used within specific environments and/or use cases. This is leading to what many refer to as “tool sprawl.” Tool sprawl can add significant operational complexity, not just in terms of security teams having to leverage disjointed dashboards and piecemeal reports, but it can lead to ineffective workflows and processes as well.


These AI-Powered Digital Health Devices Debut At CES 2020

Bruce Sharpe, Founder and CEO of Singular Hearing said that more than 466 million people worldwide are affected by some degree of hearing loss. “This staggering number is rapidly increasing. Modern hearing aids are miracles of miniaturization, customizability and flexibility. It's not their fault that noise has continued to be a problem,” said Sharpe. “Even powerful desktop computers were not good at handling noise. That is, until the last few years when good machine learning approaches came along.” “Now we finally have new, more effective solutions for noise, thanks to the perfect storm of new machine learning algorithms, the availability of better data sets, powerful GPUs for training, and smartphones in everyone's pocket that are capable of running the results,” added Sharpe. The Canadian startup’s new product, HeardThat, is an app that uses AI to tune out background noise, which enables users with hearing loss to hear speech more clearly.


How the CIO fought their way back from the edge of extinction


Some CIOs chose to draw a line in the sand. Their IT departments had a finite amount of resources and couldn't afford to implement dashboards for every person in every department. However, line-of-business employees weren't to be denied – and they had a crucial ally in the form of cloud computing. If the CIO wouldn't provision the tools they wanted, then workers chose to simply use their own departmental budgets to buy their own applications. Workers started circumnavigating IT departments entirely, establishing direct relationships with cloud vendors and signing up to their own software-licensing deals. The rise of shadow IT led many experts to speculate that the role of CIO was on borrowed time. After all, who needs a traditional IT director when the rest of the business can buy their own applications and run them on their own devices? The answer, in the end, turned out to be pretty much everyone. While it's easy to buy an application, it's not so easy to ensure the governance of a service is established, that the data is secure and that the software can be turned off – and the data extracted easily – when the contract is cancelled. It was here that CIOs excelled.


5 Ways Business Data Is Changing How People View Green Energy

While many companies are analyzing big data and choosing to adopt green energy, others believe that either there isn’t enough information to support it or it’s too confusing. Sustainability efforts are often difficult to measure because they affect society on a large scale. The specific implications, benefits and disadvantages are therefore hard to pinpoint and visualize. Further, the impact of these efforts is not always immediately obvious. As green energy has only become mainstream within the last few decades, there’s only so much evidence to support it. This information also comes in the form of different metrics and measurement systems. Businesses don’t know which metric is better than another and which would best fit their energy needs. Even leading brands need help deciding which ones will help them improve their efforts. While data to support green energy does exist, there’s still much to be understood about it. In the future, the green energy sector can gather more supporting evidence and information by continuing to monitor sustainability efforts.


IoT Security: How Far We've Come, How Far We Have to Go

Compounding the danger of IoT threats is the rise of nation-state attackers, who are targeting firmware at scale or leveraging connected devices in DDoS attacks. They don't have to attack a major entity in order to have far-reaching effects, either: as NotPetya demonstrated, a nation-state actor could target one single component supplier to have devastating consequences. Organizations' attitude toward IoT security is similar to their approach to smartphones several years back, Heiland says. Now, they're in the early stages of how they'll improve their business model and put together processes to stay secure. At the same time, standards and regulations are emerging to inform manufacturers how to build security into these devices from the start. A combination of poor device security and higher interest among attackers is driving businesses to pay more attention to the IoT. "The attack surface they're responsible for has grown so immensely," says Mike Janke, CEO of DataTribe, where a group of advisory CISOs uses the term "shadow IoT" to refer to the smartwatches, headphones, and tablets appearing on networks.



Quote for the day:


"If you don't make things happen then things will happen to you." -- Robert Collier


Daily Tech Digest - December 23, 2019

The Top Seven Technology Trends for 2020

Tokenomics Will Become More Important in Society, Regulation Follows Slowly
Tokens will become an essential tool for the organisation of tomorrow. In fact, tokens will be the fuel of the decentralised economy. They will define our future society thanks to their unique characteristics. First of all, tokens are programmable. This means that certain features or rules can be embedded in the token. These rules could be related to dividend release (the longer you hold a token, the more dividend you will get), voting rights (the longer and the more tokens you have, the more voting rights you will get) or other privileges. These rules can then become an effective way to incentivise ownership and ensure price stability. A second significant benefit of tokens will be that it will result in an explosion of liquidity in the world economy. Security tokens are tokens that allow the owner of that token a (future) stake in a company or asset such as a painting, a car or a building, whether it be in the form of dividends, revenue share, or a price appreciation.
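The "programmable token" idea, rules such as duration-weighted voting embedded in the token itself, can be illustrated with a toy model. The class, the rule, and the 10% annual bonus rate below are all hypothetical, invented purely to show the shape of such a rule:

```python
from dataclasses import dataclass

@dataclass
class TokenHolding:
    amount: int     # number of tokens held
    held_days: int  # how long they have been held

def voting_power(holding: TokenHolding, bonus_per_year: float = 0.1) -> float:
    """Hypothetical embedded rule: the longer and the more tokens you
    hold, the more voting rights you get (10% bonus per year held)."""
    return holding.amount * (1 + bonus_per_year * holding.held_days / 365)

# 100 tokens held for a year earn a 10% voting bonus
print(round(voting_power(TokenHolding(amount=100, held_days=365)), 2))  # 110.0
print(round(voting_power(TokenHolding(amount=100, held_days=0)), 2))    # 100.0
```

In a real token, such a rule would live in contract code on-chain rather than in off-chain application logic, but the incentive structure is the same.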



Twitter said it lacked evidence that malicious code was ever inserted into the app or that the vulnerability was exploited, but it admitted that doesn't mean it hadn't been exploited.  "We don't have evidence that malicious code was inserted into the app or that this vulnerability was exploited, but we can't be completely sure so we are taking extra caution," Twitter said in a blog post. The bug didn't affect its iOS app for iPhone users. It's notifying Android users through email notifications and app notifications. "We have taken steps to fix this issue and are directly notifying people who could have been exposed to this vulnerability either through the Twitter app or by email with specific instructions to keep them safe. These instructions vary based on what versions of Android and Twitter for Android people are using," Twitter said. A note sent to one Twitter user read: "Please update to the latest version of Twitter for Android as soon as possible to make sure your account is secure." The Twitter Support account clarified on Twitter that the issue was fixed in "version 7.93.4 (released Nov. 4, 2019 for KitKat) as well as version 8.18 (released Oct. 21, 2019 for Lollipop and newer)."


Why Cognitive Technology May Be A Better Term Than Artificial Intelligence

In general, most people would agree that the fundamental goals of AI are to enable machines to have cognition, perception, and decision-making capabilities that previously only humans or other intelligent creatures have. Max Tegmark simply defines AI as “intelligence that is not biological”. Simple enough, but we don’t fully understand what biological intelligence itself means, and so trying to build it artificially is a challenge. At the most abstract level, AI is machine behavior and functions that mimic the intelligence and behavior of humans. Specifically, this usually refers to what we come to think of as learning, problem solving, understanding and interacting with the real-world environment, and conversations and linguistic communication. However, the specifics matter, especially when we’re trying to apply that intelligence to solve very specific problems businesses, organizations, and individuals have. ... Since no one has successfully built an AGI solution, it follows that all current AI solutions are narrow.


Product Goals, not Sprint Goals

Product Goals will not all be the same size. Some will be easier than others to achieve. Some may take longer than others to achieve. For instance, some goals may be achieved within a Sprint, whereas others may take multiple Sprints to achieve. The point is that these Product Goals are worth achieving in terms of impact to your users, business and stakeholders. Sprint Goals, on the other hand, are goals crafted specifically for a Sprint. Let us consider an example: One of the outcomes (product goals) we were working towards was the ability of our customers to ‘view, maintain and pay invoices’. Without these three parts together, we had determined that our customers would not get a whole outcome that they could meaningfully use. Each part relied on the other. Yet, since we couldn’t fit all of it into one Sprint, we built them over one and a half. If the Product Goal ‘view, maintain and pay invoices’ were split into two Sprint Goals, ‘view and maintain invoices’ in the first Sprint and ‘pay invoices’ in the next, the second Sprint would get padded with other items that came next in the order, making its Sprint Goal a hodge-podge of parts of two Product Goals.


When should you consider moving to Kubernetes?
Migrating to Kubernetes works best if your company has already moved its platform to the cloud and has experience with containerisation, but is now beginning to have difficulties with scale or stability. A switch to Kubernetes can also be the right move if your team is having trouble managing your platform because it is spread across different cloud services. While most services offer their own container management tools, using a different one with its command-line interface for each service can quickly become cumbersome. With Kubernetes, you can use the same tool for every container, no matter the platform it’s on. This can make container management easier and reduce training time if your platform relies on a few different cloud services. Because Kubernetes containers can also be deployed like a VM template, they typically require little to no configuration across services. As a result, Kubernetes shouldn’t be any more complicated to set up than a cloud service’s proprietary container management tool.


Watch out for phishing scams this Christmas

No longer overrun with work and only responding to emails days later – at which point you’ve already discovered that the email from your colleague that you skimmed but didn’t have time to open is bogus – you might now instead idly click the attached link as it’s something to keep you busy. And once you’ve taken the scammer’s bait, you expose your organisation to a world of trouble. Some phishing scams contain links to websites that replicate a real site with the intention of nabbing your login details, whereas others contain attachments loaded with malware. Either way, falling victim could cause you and your organisation a massive headache. With access to your username and password, criminals can break into your account and steal sensitive information. They might also try to leverage their attack by imitating you in an email to a colleague, requesting information. As such, HR departments are a prime target for cyber criminals.


Use technology to accelerate through uncertainty


New solutions also help you harness data to generate insights that enable managers and senior leaders to make more objective, fact-based decisions, leading to better performance. For example, you can invest in supply chain technology that uses AI to generate more accurate demand forecasts and predict how and where the supply chain might break down in response to specific risks. But digital technologies have a customer-facing component, too, enabling business model innovation centered around new types of products and services. For example, companies increasingly offer mobile, cloud-based services rather than products, and customers no longer buy software outright but instead subscribe to software-as-a-service models. Even automakers, which have been selling cars the same way for decades, are rethinking how they do things. Most 4IR technologies are evolving rapidly, but they’re not risky. Rather, they’re market-tested solutions already in use across a range of industries.
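As a toy illustration of the forecasting idea (not the product the article alludes to), the sketch below applies simple exponential smoothing to a demand series. The weekly figures and the smoothing factor are invented for illustration; real AI-driven systems use far richer models, but the goal is the same: a more accurate view of upcoming demand.

```python
def exponential_smoothing(demand: list[float], alpha: float = 0.3) -> float:
    """Forecast the next period as a weighted blend of history.

    alpha controls how strongly recent observations dominate the
    forecast; higher alpha reacts faster to demand shifts.
    """
    forecast = demand[0]
    for observed in demand[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

weekly_units = [120, 130, 125, 150, 160]  # illustrative data only
next_week = exponential_smoothing(weekly_units)
```

Because the forecast is a convex combination of past observations, it always lands inside the historical range — a useful sanity check even for far more sophisticated models.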


Employee Engagement: 4 Keys to Delivering Exceptional Customer Experience

No matter how far removed a role may be from actual customer interaction, it still has something to do with customer experience. This is the truth of the new, customer-centric era of business, and it’s the way CIOs need to think about their organizations from top to bottom. According to 2019 State of the CIO research, 55% of CIOs are spending more time learning about customer needs as a way to create revenue-generating initiatives. How do you inspire employee engagement? The most engaged employees are those who feel their work really contributes to the success of your business. They’re the ones who feel they have a real impact on your products and services and the way customers engage with them. They’re people who embrace their purpose within your company, and are driven by it. With a customer-centric perspective, along with the right training and processes in place, every single one of your employees — from interns to the C-suite — can be a highly engaged employee. There simply needs to be an organization-wide sense of ownership of the customer experience. To take your company in that direction, follow these four steps.


2020 is when cybersecurity gets even weirder, so get ready


The continued expansion of the Internet of Things will greatly increase the number of devices and applications that security teams will have to protect. That's hard for teams that have been used to protecting just PCs and servers and now have to worry about everything from smart air-conditioning units or vending machines in the canteen, right through to power plants and industrial machinery. Half the battle for tech teams is likely to be just finding the stuff other parts of the business have accidentally connected to the web without realising it. The gradual rise of 5G, which also brings a new set of threats, is going to make this a bigger problem because these devices might be spread across a vast geography. As a result, tech teams may well find themselves spending less time at their desks and more time up ladders, poking around and playing find-the-unsecured-device, than they are used to. Ransomware is likely to get odder, too. This year has shown just how much effort criminal gangs are willing to put into catching out large organisations. The aim now is to score a huge payday by encrypting whole networks, not just a few PCs.
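Finding devices that have quietly joined the network often starts with a simple sweep. The sketch below, with a made-up subnet and port list, checks which hosts answer on common service ports; real inventories use dedicated scanners, and you should only scan networks you are authorised to audit.

```python
import ipaddress
import socket

def open_ports(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of ports on which the host accepts TCP connections."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:
                found.append(port)
    return found

def sweep(subnet: str, ports: list[int]) -> dict[str, list[int]]:
    """Probe every host in a subnet; responders may be unmanaged devices."""
    results = {}
    for host in ipaddress.ip_network(subnet).hosts():
        hits = open_ports(str(host), ports)
        if hits:
            results[str(host)] = hits
    return results

# Illustrative usage — subnet and ports are placeholders:
# inventory = sweep("192.168.1.0/29", [22, 80, 443, 1883])
```

Anything that answers on port 1883 (MQTT) or 80 but isn't in the asset register is a candidate for the "accidentally connected" category the article warns about.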


What Is Account Creation Fraud? Complete Guide to Detection and Prevention

Because fake account creation underpins so many different threats, and because it is hard to detect, the scale of the practice is difficult to assess. Some trends, however, are beyond doubt. LexisNexis Risk Solutions, a research firm, has been compiling statistics on human-initiated attacks on their Digital Identity Network for a few years now, and these numbers show a worrying rise in the practice of fake account creation. The most recent report, for instance, showed a 13% increase in fraudulent account creation in the first six months of 2019, as compared to the last six months of 2018. Tellingly, this report also showed that account creation fraud was the only ‘use case’ that saw growth over the study period, with all the other types of attack the firm detected slowly decreasing. The largest-scale fraudulent account attacks over the last year are also a good indication of the varied ways in which these attacks can be performed, and the scale at which they can now be deployed.
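One common detection signal is signup velocity: a single source creating accounts far faster than a human plausibly could. The sketch below uses an invented threshold and window, and a raw per-IP rate is only one of many signals production systems combine (device fingerprints, email reputation, behavioural cues), so treat it as a minimal illustration.

```python
from collections import defaultdict, deque

class SignupVelocityMonitor:
    """Flag sources that create accounts suspiciously fast.

    max_signups and window_seconds are illustrative defaults,
    not values from any real fraud product.
    """

    def __init__(self, max_signups: int = 3, window_seconds: float = 60.0):
        self.max_signups = max_signups
        self.window = window_seconds
        self.events: dict[str, deque] = defaultdict(deque)

    def record(self, source_ip: str, timestamp: float) -> bool:
        """Record a signup; return True if the source now looks fraudulent."""
        q = self.events[source_ip]
        q.append(timestamp)
        # Drop events that have aged out of the sliding window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_signups

monitor = SignupVelocityMonitor()
flags = [monitor.record("203.0.113.7", t) for t in (0, 5, 10, 15)]
```

Here the fourth signup within the window trips the flag, while an unrelated source remains clean — the per-source sliding window is what keeps busy legitimate traffic from drowning out the bot.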



Quote for the day:


"Education makes a people difficult to drive, but easy to lead; impossible to enslave, but easy to govern." -- Lorn Brougham