Unauthorized mining has been going on for several years, but only since 2017 has it become a major issue; the biggest mining botnets have pulled in millions of dollars. If this kind of software infects your machines, you might not notice. It isn't technically malware, except that it shouldn't be on your computer: people use the same software for legitimate mining, so security tools often fail to detect it. But it costs you in several ways. Unauthorized mining slows your computers down. Because a machine performing active work draws more power than an idle one, it raises your electric bill. If the miner has bugs, it can make your computers crash more often. It can also open security holes and load other malicious software onto your machines. Some variants grab a computer's processing cycles for mining without even breaking its security: a controversial Web application called Coinhive runs mining code in visitors' browsers. When the site owner discloses this up front and uses it instead of ads to fund the site, it's generally considered acceptable.
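The electricity cost mentioned above is easy to ballpark. A minimal sketch in Python, where the wattage delta, duty cycle, and price per kWh are illustrative assumptions rather than measured figures:

```python
# Rough estimate of the electricity cost of unauthorized mining.
# All figures below are illustrative assumptions, not measured values.

def added_electricity_cost(extra_watts, hours_per_day, days, price_per_kwh):
    """Extra cost of a CPU pinned by a miner versus sitting idle."""
    extra_kwh = extra_watts / 1000 * hours_per_day * days
    return extra_kwh * price_per_kwh

# Assume mining keeps a desktop drawing ~80 W above idle,
# 24 h/day for a 30-day month, at $0.13 per kWh.
cost = added_electricity_cost(80, 24, 30, 0.13)
print(f"${cost:.2f} per machine per month")  # -> $7.49 per machine per month
```

A few dollars per machine sounds small, but across hundreds of infected machines in a fleet the bill adds up quickly, on top of the slowdown and crash risk.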
Many new skills are required for a successful digital transformation, including analytics expertise and the ability to evolve IT architecture continuously to modernize IT. Insurers are struggling to attract talent because these and other necessary skills are new not only in insurance but across industries; demand for talent in areas like big data is expected to exceed available supply by a factor of four. The resulting shortage means leaders must build an environment that attracts talent, promotes personal growth, and offers a desirable, interconnected workplace with real flexibility. Few insurers will become leading tech companies, as they will struggle to attract tech talent at a scale on a par with digital natives. But digital leaders in the insurance sector, having prioritized technological literacy and staffed up areas like engineering, design, and agile delivery, better understand the investments and trade-offs the digital age requires.
NIH (not invented here) is a surprisingly common feature of the innovation landscape, and there are many famous examples. Not least, Kodak’s rejection of both Edwin Land’s idea for the Polaroid process and Chester Carlson’s xerography underlines how easy it is to put up defenses against ideas originating from outside. NIH is a theme my colleague Oana-Maria Pop has written a great blog post about, but its persistence makes it worth another look. Elting E. Morison gives a wonderful example in his detailed study “Gunfire at Sea,” which explores the tortuous journey continuous-aim gunnery took in finding its way onto the decks of U.S. warships. In the late 19th century, naval gunnery was not very accurate: a U.S. Bureau of Ordnance study of one thousand shells fired during an exercise around the time of the Spanish-American War suggested that less than 3 percent hit the target. That’s a problem. A long way away in the South China Sea, Admiral Percy Scott of the British Navy was working on the solution; his squadron’s gunnery practice was yielding similarly poor results.
The catalyst for this has been the rise of software-defined WANs (SD-WANs). Their multipath capabilities and network optimization make it possible to use broadband for business connectivity, and organizations of all sizes have been jumping aboard this trend. One challenge that remains, though, is that not all connections or types of broadband are created equal. A network manager at a business with branches located nationwide recently told me that his company's preferred broadband type has been cable, but he has been surprised by how much the resulting bandwidth varies by location and time of day. For example, in one large metro area he is purchasing 50 Mbps of bandwidth, more than adequate for most branches. During the mornings he often sees throughput of 100 Mbps or more. Later in the day, however, when kids get home from school and the Xboxes and Netflix streams come online, he has seen his bandwidth drop to as low as 8 Mbps.
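That kind of time-of-day swing is exactly what an SD-WAN's multipath logic has to cope with. A hypothetical sketch of one possible selection rule (the path names, sample figures, and policy are illustrative, not any vendor's API): prefer the path whose worst recent throughput is highest, so a link that swings between 100 Mbps and 8 Mbps loses to a steadier but slower one.

```python
# Hypothetical SD-WAN path selection by recent measured throughput.
# Path names and sample numbers are illustrative, not from a real device.

def best_path(measurements):
    """measurements: {path_name: [recent throughput samples in Mbps]}.

    Pick the path with the highest *minimum* recent throughput, so a
    link that collapses during peak hours loses to a steadier one.
    """
    return max(measurements, key=lambda path: min(measurements[path]))

samples = {
    "cable": [100, 52, 8],   # fast mornings, congested evenings
    "dsl":   [20, 19, 18],   # slower but steady
}
print(best_path(samples))  # -> dsl
```

Real SD-WAN products weigh latency, loss, jitter, and per-application policy as well; this worst-case rule just illustrates why raw peak bandwidth is a poor basis for choosing a path.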
In advertising, AI brings together many different signals drawn from a user's internet habits. In other words, it gathers information on the personal habits of potential customers so that paid advertising can be aimed at the right audience. There are ethical implications: many feel this crosses a line in gathering personal information. There is no doubt, however, that it has improved the targeting of paid advertising. AI does more than break down the demographics; it takes every search you make and every page you visit and turns them into a profile of your habits on the internet. Because it considers not only your actions but also your vulnerabilities, paid advertising on platforms like Google AdWords and Facebook PPC becomes more potent in its effects. Over the past few years, AI has had a major impact on the collection, interpretation, and distribution of the information marketers use to buy advertising in the right places.
“Contrary to the common belief, I think there is more opportunity for collaboration with fintech than disruption,” he added. “Particularly on the institutional side.” Sultan said Citi is already “very engaged” with the fintech space, scanning “thousands” of startups every year, and has taken an equity position in “about 30” so far. It has established four “innovation centers” in Singapore, Dublin, Tel Aviv and London to act as its feelers on the fintech scene. Its investments are focused in four key areas, according to Sultan: client experience, scalability, operating-model agility, and innovation. “Pretty much every one of [the fintech startups it’s invested in] have an operating relationship with the businesses,” he continued. “So we are using their technology and integrating into our solutions. And helping them commercialize, appreciate our equity, as well as delivering a better solution to the client. So I think the philosophical change is you cannot get to the market fast enough on a proprietary basis.”
We are seeing three big drivers. The first is the rapidly decreasing cost and size of cameras; the number deployed will grow dramatically. Five years ago we were putting in half a dozen cameras to provide an overview of what was going on in a warehouse or factory floor. Today we are putting four or five times as many on the ends of forklift trucks and inside industrial machinery. Wherever an operator doesn't have exactly the view they need, we will be placing cameras, and this will just keep growing. The second big driver is increasing intelligence in the cameras themselves: more and more video analytics and storage are moving to the edge, inside the camera. Customers keep finding new opportunities for cameras to provide not only images but also actionable data. For us this is a great opportunity, and we are pushing forward with specialized analytics for industrial applications.
“I think the blockchain is going to do really interesting things around the transfer of value,” writer and academic Rachel Botsman noted at Davos. Even though traditionally much of the chatter in the cryptocurrency communities has centred on disrupting the financial industry, Botsman doubts this technology will altogether result in currency exchanges bypassing the banks. “I don’t understand where that hype is coming from,” Botsman told the Forum's annual meeting. Indeed, many banks, though hardly endorsing the exchange of cryptocurrencies, are certainly waking up to the possibilities of the blockchain, which could eventually allow them to automate many systems that currently require a large in-office staff. As Brian Behlendorf, executive director of Hyperledger, explained to me at the Forum's annual meeting this year: “[Banks] can do a lot of what they do today (...) just faster and cheaper and with greater guarantees that the parties that they’re working with will make good on their commitments.”
The human brain may not be the most efficient form of intelligence; it needs a lot of biological backup machinery to make up for cells that die all the time, and its ability to store data is less reliable than a computer's. Someday, many years from now, technology will probably exist that can reconstruct the brain while cutting some corners for efficiency. But it's unlikely to replicate every nuance of perception, memory, emotion, and intuition. We often talk about today's artificial intelligence -- based on algorithms that essentially use the brute force of computers to crunch problems such as image recognition -- as if it'll soon replace humans at complex creative and communicative tasks. That kind of AI, however, never will. Progress along the same lines can produce smarter digital assistants than today's Siri or Alexa, but a human equipped with a computer will still run circles around them, because of the sheer, currently irreproducible complexity of the human brain.
In 2017, Congress passed the $500 million Modernizing Government Technology Act as part of a defense funding bill. Although the act promises to help shore up lax cybersecurity efforts, it is also aimed at replacing legacy systems with more modern ones, purportedly to cut operating expenses. If properly funded, the act will bring extra attention to cybersecurity, but replacing legacy systems also increases the burden of implementing proper security on the new platforms. The act will likely benefit users, as modern systems should outperform legacy ones. However, there's no guarantee that security efforts will be enough to prevent potentially large and damaging cyberattacks. Proper cybersecurity can be expensive, and the slow pace of government means new technology is slow to be adopted. Attackers, on the other hand, operate relatively cheaply, and even the smallest security hole can eventually lead to a massive attack.
Quote for the day:
"Having more data does not always give you the power to make better decisions." -- Jeffrey Fry