Daily Tech Digest - October 27, 2017

The coming revolution is about an AI understanding the human brain — our preferences, our choices, our desires. That will require a Herculean effort. For one thing, my preferences change. Today I’m thinking about biking apparel, tomorrow I’m thinking about going to the beach. An AI will have to adapt, respond, adjust, and customize a thousand times per day. It will need to work like the human brain, constantly making micro-adjustments based on changing variables. A true AI is one that serves us and knows us; we no longer have to know or serve it. We speak and it hears us. We don’t need to learn its parameters; it will learn our parameters. We’re not there yet, of course. Most of us are still tethered to a smartphone all day. By 2030 or so, bots will become adaptive assistants that learn about our behaviors and fit smoothly into our daily routine. We’ll stop being enamored of tech.


The push toward comprehensive endpoint security suites

In a recent research project, ESG asked 385 security professionals the following question, “As new endpoint security requirements arise and your organization considers new endpoint security controls, which of the following choices do you think would be most attractive to your organization?”  The results were quite interesting, as 44 percent of respondents said they would choose a comprehensive endpoint security suite from a “next-generation” vendor, 43 percent said they would choose a comprehensive endpoint security suite from a single established vendor, 8 percent said they would choose an assortment of endpoint security technologies from different vendors, and 3 percent said they would choose an assortment of endpoint security technologies from vendors that establish technical partnerships for integration.


Science may have cured biased AI

Scientists at Columbia and Lehigh Universities have effectively created a method for error-correcting deep learning networks. With the tool, they’ve been able to reverse-engineer complex AI, thus providing a work-around for the mysterious ‘black box’ problem. Deep learning AI systems often make decisions inside a black box – meaning humans can’t readily understand why a neural-network chose one solution over another. This exists because machines can perform millions of tests in short amounts of time, come up with a solution, and move on to performing millions more tests to come up with a better solution. The researchers created DeepXplore, software that exposes flaws in a neural-network by tricking it into making mistakes. Co-developer Suman Jana of Columbia University told EurekAlert:
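DeepXplore guides its test generation with a metric called neuron coverage: the fraction of a network's neurons that a set of probe inputs has activated. A toy, pure-Python sketch of that metric (the two-neuron "network" and its weights are invented purely for illustration):

```python
import random

def tiny_net(x):
    """A stand-in 'network': two hand-wired neurons with ReLU activations."""
    h1 = max(0.0, 0.8 * x - 1.0)   # neuron 1 fires only when x > 1.25
    h2 = max(0.0, -0.5 * x + 2.0)  # neuron 2 fires only when x < 4.0
    return h1 + h2, (h1 > 0, h2 > 0)

def neuron_coverage(inputs):
    """Fraction of neurons activated by at least one probe input."""
    fired = [False, False]
    for x in inputs:
        _, (f1, f2) = tiny_net(x)
        fired[0] |= f1
        fired[1] |= f2
    return sum(fired) / len(fired)

random.seed(0)
probes = [random.uniform(0.0, 5.0) for _ in range(20)]
print(f"coverage: {neuron_coverage(probes):.0%}")
```

DeepXplore searches for inputs that raise this coverage while making differently trained networks disagree, which is what surfaces the hidden flaws.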


Opening a Bitcoin wallet is just one contingency plan firms can make to prepare for cyber breaches in which client data is stolen, according to John Sweeney, president of IT and cyber security advisors LogicForce. This can be a useful "last resort" when the data is not backed up and cannot be restored unless a ransom is paid. "The firms doing this are smarter," said Sweeney, and are looking to take "conscientious" proactive, rather than reactive, steps. Sweeney stressed he did not generally advocate paying ransoms, but said it "makes sense" for firms to have a Bitcoin wallet to hand. "I certainly don't see it as a bad move," he said. Data breaches at law firms are a growing concern: confidential information, often sent in unencrypted emails, risks being stolen and ransomed back to firms, used for fraud or sold to third parties to be used in crimes such as insider trading.


In actual fact, banks are now competing against every firm in the world that delivers a powerful, positive and engaged digital experience for their customers. If we take customer-centric innovators like Amazon, Netflix, Google and Facebook, and examine what sets them apart from the competition, we see it’s their ability to experiment, scale and deliver new features and functionality almost on a constant basis. And how do they manage this? They leverage the full capabilities and flexibilities that cloud technologies can offer. It is this shift that is responsible for the banking world now embracing digital transformation. Once the realm of retail banking, digital transformation is now entering the uncharted territory of front, middle and back office operations of commercial, investment, business and private banks.


The #1 IOT Challenge: Use Case Identification, Validation and Prioritization

So while we have an amazing compilation of technologies, sensors, gateways, connected devices and such for capturing data, understanding ahead of time what you are going to do with that data – and why – is important because it frames what technologies, architectures, data, analytics and applications the organization is going to need in order to “monetize” IOT. So before you jump into the IOT pond, let’s make sure that there are no logs, boulders or sea monsters waiting for you. Let’s start our IOT journey by first creating an “IOT Business Strategy.” ... There is a bounty of business use cases from which the business can choose in order to monetize their IOT efforts. However, this bounty of use cases is both a gift and a curse, because the best way to ensure that you don’t successfully complete any use case is to try to do them all.


Will Machine Learning Make You a Better Manager?


“If you are a credit card processor and you have everyone’s transactions, you could predict whether a particular customer is going to run themselves into debt and default in the future.” Machine learning is even being used to learn more about machines, says Teodorescu, who points out that manufacturers are increasingly using algorithms for preventive maintenance. “You can predict when things are going to break down based on prior performance,” Teodorescu says. “That could preempt costly assembly line shutdowns later.” In all of these ways, it’s clear that while machines may not be taking over the world any time soon, machine learning certainly is. “It will become less and less a mysterious thing and more of a regular topic taught in schools in 20 years,” says Teodorescu. “It will be something everyone learns.”


Building Reactive Systems Using Akka’s Actor Model & DDD


The actor model is designed to be message-driven and non-blocking, with throughput as part of the natural equation. It gives developers an easy way to program against multiple cores without the cognitive overload typical in concurrency. Let’s see how that works. Actors consist of senders and receivers; simple message-driven objects designed for asynchronicity. Let's revise the ticket counter scenario described above, replacing a thread based implementation with actors. An actor must of course run on a thread. However, actors only use threads when they have something to do. In our counter scenario, the requestors are represented as customer actors. The ticket count is now maintained with an actor, and it holds the current state of the counter. Both the customer and tickets actors do not hold threads when they are idle or have nothing to do, that is, have no messages to process.
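Akka actors run on the JVM and share a dispatcher's thread pool; the rough shape of the ticket-counter scenario can still be sketched in Python. One caveat, noted in the comments: this sketch dedicates a thread per actor, whereas Akka only borrows a thread while an actor has messages to process. All names are illustrative:

```python
import queue
import threading

class TicketsActor:
    """Holds the counter state; processes messages one at a time from its mailbox.

    Note: unlike Akka, this sketch parks a dedicated thread per actor. Akka
    multiplexes many actors over a shared dispatcher pool instead.
    """
    def __init__(self, count):
        self.count = count
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            reply_to = self.mailbox.get()  # each message carries the sender's mailbox
            if reply_to is None:
                break
            granted = self.count > 0
            if granted:
                self.count -= 1            # state is touched by one thread only
            reply_to.put(granted)

    def tell(self, reply_to):
        self.mailbox.put(reply_to)         # non-blocking: senders never wait on a lock

tickets = TicketsActor(count=2)
replies = queue.Queue()
for _ in range(3):                         # three customer actors each request a ticket
    tickets.tell(replies)
results = [replies.get(timeout=1) for _ in range(3)]
print(results)
```

Because all state changes flow through one mailbox, no locks are needed and senders never block, which is the core of the model's concurrency win.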


Microsoft's open source sonar tool helps developers find security flaws in their websites

Beyond open sourcing the code, Microsoft donated the project to the JS Foundation over the summer to make it more accessible to all. Microsoft intended for sonar to "avoid reinventing the wheel," Molleda wrote, instead tapping and integrating existing tools and services that help developers build for the web. With that being the case, sonar integrates with aXe Core, AMP validator, snyk.io, SSL Labs, and Cloudinary. The tool could make a real difference for developers in terms of producing higher quality websites: A recent Northeastern University analysis of over 133,000 websites found that 37% had at least one JavaScript library with a known vulnerability. As ZDNet noted, Snyk also ran a scan of the top 5,000 URLs earlier this year, and found that more than 76% were running a JavaScript library with at least one vulnerability as well.


Sony’s big bet on 3D sensors that can see the world

The new 3-D detectors are in a category called time-of-flight sensors, which scatter infrared light pulses to measure the time it takes for them to bounce back. The basic technology has been around for a while and forms the basis for the Xbox’s motion-based Kinect, as well as laser-based rangefinders on autonomous vehicles and in military planes. Sony’s big innovation over existing TOF sensors is that they’re smaller and calculate depth at greater distances. Used with regular image sensors, they effectively give machines the ability to see like humans. “Instead of making images for the eyes of human beings, we’re creating them for the eyes of machines,” Yoshihara said. “Whether it’s AR in smartphones or sensors in self-driving cars, computers will have a way of understanding their environment.” The most immediate impact from TOF sensors, which will be fabricated at Sony’s factories in Kyushu, will probably be seen in augmented-reality gadgets.
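The arithmetic behind a time-of-flight reading is simple: distance is the speed of light times half the round-trip time of the pulse. A minimal sketch (the 33 ns round trip is just an illustrative figure):

```python
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds):
    """Light travels out and back, so distance is half the round trip."""
    return C * round_trip_seconds / 2

# a pulse returning after ~33.4 nanoseconds implies an object ~5 m away
print(f"{tof_distance(33.356e-9):.2f} m")
```

The hard engineering is in timing at sub-nanosecond precision per pixel, which is where Sony's sensor work comes in; the depth math itself stays this simple.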



Quote for the day:


"Education's purpose is to replace an empty mind with an open one." -- Malcolm Forbes


Daily Tech Digest - October 26, 2017

You might soon be able — if you're so inclined — to join a bona fide church worshiping an artificially intelligent god.  Former Google and Uber engineer Anthony Levandowski, according to a recent Backchannel profile, filed paperwork with the state of California in 2015 to establish Way of the Future, a nonprofit religious corporation dedicated to worshiping AI. The church's mission, according to paperwork obtained by Backchannel, is "to develop and promote the realization of a Godhead based on artificial intelligence and through understanding and worship of the Godhead contribute to the betterment of society." ... Levandowski's pitch for an AI church comes amid apocalyptic warnings from tech and science luminaries like Elon Musk and Stephen Hawking about the dangers of artificial intelligence.


India Warily Eyes AI

The IT industry may employ only a few million of India’s 1.3 billion people—but it has been a beacon for young men and women with aspirations. It motivated families to send their children to university, placed graduates in gleaming campuses, conferred independent urban lifestyles upon them, and provided stable incomes and access to the world outside India. Over the last 30 years, moreover, it has been the only industry in India to begin from seed and bloom to such success. India is otherwise struggling to create jobs: 12 million Indians enter the workforce every year, but only 135,000 jobs in the formal economy’s eight biggest sectors—including IT—were created in 2015. A dramatic contraction of the IT industry—a dimming of the beacon—would jolt the country’s economy and polity deeply.


Use of IoT in corporate networks is soaring

Even the omnipresent issue of IoT security seems to be less frightening to companies than before – just 7% of those with bigger (10,000+ devices) deployments said that security was their top concern, although most still acknowledged its importance. Earlier editions of the study cited security as a top concern among up to 29% of companies. Roughly two-thirds of all respondents said that their IoT deployments were “mission-critical” and admitted that a security breach would be catastrophic. Even though the study found growth largely across the board, some verticals saw particularly strong uptake – retail, transportation and energy all grew at better than 17% year-on-year, while other sectors – including those like healthcare and automotive, where IoT has been popular for somewhat longer – continued to grow at a respectable 9% and 12%, respectively.


Agility, comradery drive CA Technologies' strategy turnaround

"There is nothing stronger than pitching something that shows a customer you are in the same boat as they are. It demonstrates you have a deeper appreciation for what they are going through," said Ayman Sayed, ... Moreover, CA Technologies' strategy wraps its portfolio of agile, DevOps and security products around its software development cycle, with a blueprint in hopes to offer a better overall software development environment. CA will also offer technical support through the planning, building, testing and deployment stages to get customers more familiar with some of these newer technologies. One IT professional with a technology services company who worked with CA as part of a digital transformation project said CA's training and technical services helped speed his company's transition to improve its overall operations' performance and agility.


So You Want to Be a Data Scientist? – It’s Complicated

Anyone who is considering a career in data science needs to understand, first, the myriad things such a career involves, the type of education and training required, and exactly what the job market holds. And because the field is growing so fast, students and mid-career professionals both have an opportunity to move into data science careers, if they get the right education and training. There is no single definition of data science, as it varies with industry, specific business, and what the purpose of the data scientist’s role is. And different roles require different skill sets; therefore the educational and training path is not uniform. Data scientists can come from many fields – math, statistics, computer science, and even engineering. But the role the scientist is to play is now generally broken down into two large categories


Edge Analytics – What, Why, Who, When, Where, How

Descriptive analytics focuses on what happened, diagnostic analytics relays why it happened, predictive analytics previews what is likely to happen and prescriptive analytics conveys options on what you should do about it. But you’ll be missing out on an exciting area called Edge Analytics if you relied solely on this type of classification. Let’s look at the scenario of an offshore oil rig which has hundreds of sensors collecting data but miles away from any decent data center to process and analyze this data. What if the sensors had access to decentralized process systems that could perform data analytics and possibly shut off a faulty valve right then and there based on the diagnosis and prediction? Wouldn’t that be more efficient than sending all that sensor data back to central data centers miles away and relaying back the same information much later? Yes, that’s where edge analytics comes in.
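A minimal sketch of the valve scenario: an invented rule that runs at the sensor gateway, extrapolates the local pressure trend one step ahead, and acts without a round trip to a distant data center (thresholds and readings are made up):

```python
def edge_decision(pressure_readings, limit=300.0):
    """Cheap local rule for an edge gateway: predict the next reading from the
    most recent trend and shut the valve if it would exceed the safe limit."""
    trend = pressure_readings[-1] - pressure_readings[-2]
    predicted = pressure_readings[-1] + trend
    return "SHUT_VALVE" if predicted > limit else "OK"

print(edge_decision([270.0, 285.0, 299.0]))  # rising fast: act now, locally
print(edge_decision([240.0, 241.0, 242.0]))  # stable: nothing to do
```

The point is latency and bandwidth: only the decisions (or summaries) need to travel back to the central data center, not every raw sensor sample.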


How CIOs can Help Improve Enterprise Agility

We can cite the benefits of the ability to react more quickly to change, the delivery of business value on an incremental basis, and the improved satisfaction customers have with our products due to their day-to-day involvement. We have also seen greater morale and increased employee engagement among our development team members. But a company cannot become agile simply by expanding the use of sprints, stand-up meetings, and burndown charts into every department. ... The extent of your company’s agility is determined by the degree to which you can change course, and the speed at which you can achieve this change. Marc Benioff, CEO of Salesforce.com, declared at Davos that “speed is the new currency of business.” But if your momentum prevents you from quickly changing direction, you may find yourself moving swiftly to the wrong destination. The increasing unpredictability of our world demands more and more agility.


Bad Rabbit: Ten things you need to know about

Russian cybersecurity company Group-IB confirmed at least three media organisations in the country have been hit by file-encrypting malware, while at the same time Russian news agency Interfax said its systems have been affected by a "hacker attack" -- and were seemingly knocked offline by the incident. Other organisations in the region including Odessa International Airport and the Kiev Metro also made statements about falling victim to a cyber-attack, while CERT-UA, the Computer Emergency Response Team of Ukraine, also posted that the "possible start of a new wave of cyberattacks to Ukraine's information resources" had occurred, as reports of Bad Rabbit infections started to come in. At the time of writing, it's thought there are almost 200 infected targets, indicating that this isn't an attack on the scale of WannaCry or Petya -- but it's still causing problems for infected organisations.


Doubling Up on AV Fails to Protect 40% of Users from Malware Attacks

Nearly 40% of users who had multiple, traditional antivirus solutions loaded on their endpoints faced a malware attack during the first half of the year, a Malwarebytes report revealed today. The Mapping AV Detection Failures report, which scanned nearly 10 million endpoints, found a number of malware attacks occurred despite having two or more traditional, or signature-based, antivirus solutions installed. "The takeaway for enterprises is [that] the most basic threats have not been caught by the AV they have deployed," says Marcin Kleczynski, Malwarebytes CEO. "Yet, they continue to use these and grow desensitized." He adds CISOs and other IT security leaders may be adopting a common assumption that no one ever gets fired for using antivirus software from the industry leaders, especially when analysts rate them high on the effectiveness scale in comparative reports.
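Signature-based detection, reduced to its essence, is a lookup against a database of known-bad fingerprints — which is why stacking two signature engines still misses a sample neither has seen before. A deliberately simplified sketch (real signatures also use byte patterns and heuristics, not just hashes):

```python
import hashlib

# A toy 'signature database': hashes of known-bad payloads.
KNOWN_BAD = {hashlib.sha256(b"evil-payload-v1").hexdigest()}

def scan(payload: bytes) -> str:
    """Flag a payload only if its fingerprint is already in the database."""
    digest = hashlib.sha256(payload).hexdigest()
    return "blocked" if digest in KNOWN_BAD else "allowed"

print(scan(b"evil-payload-v1"))   # known sample: caught
print(scan(b"evil-payload-v2"))   # trivially modified: sails past the signature
```

Running two such engines side by side only helps if their databases differ; a genuinely novel sample evades both, which is the gap behavior-based tools aim to close.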


The impact of threat hunting on your security operations

In general, threat hunting can be most efficiently implemented by organizations that already have a solid, mature information security operations center (SOC) and computer incident response team (CIRT). While the latter two are holding the fort (so to speak), threat hunters are free to cast a wider net. Threat hunting starts with the assumption that an incident has happened, but it’s not based on already received alerts. It can be based on findings from previous hunts, or information from outside the organization. Effective hunts depend more on the knowledge, skills and instincts of human analysts than on tools. It is generally acknowledged that, while senior SOC analysts and incident responders can have the right foundation for threat hunting, in order to be good threat hunters they also have to be able to think creatively and see the big picture. Still, there can be no doubt that good tools can help threat hunters channel their capabilities more efficiently.



Quote for the day:


"More people would learn from their mistakes if they weren't so busy denying them." -- Harold J. Smith


Daily Tech Digest - October 24, 2017

The Mastercard blockchain is a permissioned blockchain, which will allow participants to maintain the distributed ledger without sacrificing scalability or performance, Sota explains in the video. ... "Our blockchain technology can be used for clearing in near real-time card payment transactions eliminating consolidation and improving settlement," he said. According to Mastercard, its technology boasts four key differentiators to others in the space, spanning privacy, flexibility, scalability, and the reach of the company's settlement network. Mastercard said its blockchain provides privacy by ensuring that transaction details are shared only amongst the participants of a transaction while maintaining a fully auditable and valid ledger of transactions, but still allowing partners to use the blockchain APIs alongside other Mastercard APIs


IT, OT, IoT: Does Hitachi Have a Dictionary for This Alphabet Soup?

At one end of the spectrum, and most notably in this “industrial reinvented as software” class, is GE. GE, better known for building gas plants, jet engines and wind turbines, is reinventing itself as a software company. Under former CEO Jeff Immelt, and current head of all things digital, Bill Ruh, the company is investing hundreds of millions of dollars to build capability in the software space. GE is applying its Predix software offering to its own business units but, more importantly, is attempting to become the software provider of choice for a host of third-party industrial organizations. At the other end of the spectrum lie the traditional technology vendors who, despite not having significant industrial experience themselves, have long histories of delivering technologies to industrial operations.


Architecture Patterns to Consider When Designing an Enterprise Data Lake

Virtually every enterprise-level organization requires encryption for stored data, if not universally, at least for most classifications of data other than that which is publicly available. All leading cloud providers support encryption on their primary objects store technologies (such as AWS S3) either by default or as an option. Likewise, the technologies used for other storage layers such as derivative data stores for consumption typically offer encryption as well. Encryption key management is also an important consideration, with requirements typically dictated by the enterprise’s overall security controls. Options include keys created and managed by the cloud provider, customer-generated keys managed by the cloud-provider, and keys fully created and managed by the customer on-premises.


Why Tech Giants See Singapore As The Next AI Hub


Singapore-based Marvelstone on Monday dovetailed the announcement by the Chinese conglomerate – owner of the South China Morning Post – by revealing it was setting up an AI hub of its own in the city state, which would incubate 100 startups every year. It said its hub would be “the world’s biggest” when it opens next year. ... The government also showed it is serious about the country’s AI prospects when it announced the development of a dedicated data science consortium, and pledged Sg$150 million to industry research. In the Lattice80 complex, located in Singapore’s central business district, Ko said he was confident the government would follow through with its pledge to foster the industry. “Firstly, it’s about diversity … other Asian cities like Tokyo are also trying to be AI hubs, but they are more homogenous. Singapore’s advantage is that it is welcoming to all, and there is strong government support,” he said.


The prevalence of AI-powered IoT devices inspires mixed emotions

The easy path forward would be to continue developing connected devices without taking people’s fears into consideration. However, this is both unethical and unadvisable from a practical standpoint. Unsecured devices put multiple parties at risk, from the person using the product to the company pulling data from it. A better approach to the situation lies in analyzing the strengths, weaknesses, opportunities, and threats AI- and IoT-enabled devices offer. This will require addressing such pain points as IoT standards, privacy measures, and security. It could also involve education, job training, and general change management. But whether we’re looking at something as mundane as faster streaming or as grand as smart cities, the internet of things — when bolstered by artificial intelligence — has potential to impact every aspect of our lives.


Three Things Data Scientists Can Do To Help Themselves And Their Organizations

In the brave new world of business analytics fueled by big data, there has been significant discussion about the evolving roles of C-suite executives, including the CEO, CTO, and CIO. That discussion is now expanding to include the CMO plus the new roles of CDO and CDS. I do not have an MBA and I usually don’t undertake risky behavior, such as telling a CEO how to run her or his business. However, it is entirely appropriate for the CMO, CDO, and CDS to step up to the challenges of leading and directing the analytics, big data, and data science efforts of their organization, respectively. It is also appropriate for these execs to stand firm against corporate cultures and naysayers that resist big data analytics projects with these types of remarks: a) “Let’s wait and see how it develops elsewhere”; b) “We have always done big data”; or c) “What’s the ROI? Show me the numbers.”


The cryptoeconomics of scaling blockchains


A key shortcoming of the current generation of blockchain technologies is their limits when it comes to performance and scalability. For instance, the entire Bitcoin network can only handle seven transactions per second, compared with over 2,000 transactions per second on the VISA network and millions of transactions per second handled by any top tier consumer application. That has made it impossible for the current generation of blockchain networks to handle big data applications. Is the poor performance of blockchains an engineering problem? It is not, at least not entirely. The problem is actually inherent to the incentive-driven design of blockchains, known as cryptoeconomics. ... Blockchain is useful because it allows untrusted and non-cooperating parties to work together and maintain a system. Let’s look at the example of the Bitcoin network.
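The seven-transactions-per-second figure falls out of simple arithmetic on commonly cited, approximate Bitcoin parameters (block size, average transaction size, and block interval are all assumptions here):

```python
# Back-of-the-envelope for Bitcoin's ~7 tx/s ceiling:
# ~1 MB blocks, ~250-byte average transactions, one block every ~10 minutes.
block_size_bytes = 1_000_000
avg_tx_bytes = 250
block_interval_s = 600

tps = block_size_bytes / avg_tx_bytes / block_interval_s
print(f"~{tps:.1f} transactions per second")
```

Raising any of these knobs (bigger blocks, faster blocks) has cryptoeconomic side effects on miner incentives and decentralization, which is why the limit is not a pure engineering problem.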


Stuck between Design Thinking and Lean Startup? Take a hybrid approach


There are now so many different kinds of innovation: design innovation, business model innovation, digital innovation. And so many ways to organize for innovation: innovation labs, innovation centers, corporate accelerator programs. More significantly, there has been a growth of two schools of thought in corporate innovation: Design Thinking and Lean Startup. Suddenly corporate innovators feel the need to be trained in both. But many consultancies practice or train in only one. And wherever corporate innovators sit, there is growing pressure to be more entrepreneurial. More agile. To increase speed to market. To be more like that startup accelerator your boss visited. ... The best way to tackle this would be to learn about these new approaches, test them on real innovation projects, and then adapt them so that they’re really practical and work in corporations.


What Are The Security Threats For The Cloud

What Are The Security Threats For The Cloud
Surprisingly, although cloud security is so important given the data breaches we have seen around the globe, over 40% of IT managers have no plans to purchase ‘security-as-a-service’ solutions. This raises the question of how well such companies are prepared for a future where the cloud becomes more and more important and criminals target cloud solutions on a wider scale. The security of the data in your cloud is vital for companies. Being hacked can have serious consequences for a company as well as on a personal level, as seen with the Target CEO, who was fired after a data breach. Once your cloud is hacked, your company has a serious issue, depending on the severity of the hack. Therefore it is wise to be aware of the security issues when dealing with the cloud. This infographic might help to achieve that.


Tech Giants Are Paying Huge Salaries for Scarce A.I. Talent


At the top end are executives with experience managing A.I. projects. In a court filing this year, Google revealed that one of the leaders of its self-driving-car division, Anthony Levandowski, a longtime employee who started with Google in 2007, took home over $120 million in incentives before joining Uber last year through the acquisition of a start-up he had co-founded that drew the two companies into a court fight over intellectual property. Salaries are spiraling so fast that some joke the tech industry needs a National Football League-style salary cap on A.I. specialists. “That would make things easier,” said Christopher Fernandez, one of Microsoft’s hiring managers. “A lot easier.” There are a few catalysts for the huge salaries. The auto industry is competing with Silicon Valley for the same experts who can help build self-driving cars. Most of all, there is a shortage of talent, and the big companies are trying to land as much of it as they can. Solving tough A.I. problems is not like building the flavor-of-the-month smartphone app.



Quote for the day:


"Nothing so conclusively proves a man's ability to lead others as what he does from day to day to lead himself." -- Thomas J. Watson


Daily Tech Digest - October 23, 2017

Companies are starting to cast the net farther afield, taking on graduates from a far wider range of disciplines. Virtusa often looks for people with a background in the arts, says Gabrault, because alongside their analytical skills they are creative and can play a key role in user experience, and make sure a product is actually something people want to interact with. Teamwork is also important. IoT is not about beavering away on solo projects, but involves interaction with other teams, end users and customers. “Candidates need to show that they can empathise with the client,” adds Owen. Helping students become “work-ready” is one of the driving forces behind Fast Track, a programme run by the Future of British Manufacturing. It matches students from some of the UK’s leading universities with companies, to help them develop their next big innovation or connected product.


Demystifying The Dark Science Of Data Analytics

Deeper analytics knowledge can also help IT leaders understand why the approach often seems so mysterious. "Data science, in its best form, is an extremely creative endeavor," Johnston says. "There is not necessarily a need for managers to understand the internals of every analysis, just as owners of a software project need not understand the underlying technological internals." ... Unlike IT, where solutions are often obvious and widely adopted by enterprises worldwide, analytics processes are frequently unique and individualized. "Choosing the best analytical method is sometimes straightforward, sometimes art," Magestro says. "For example, looking for cause-effect relationships in data usually means some kind of regression, and looking for similar characteristics in large customer datasets likely involves clustering algorithms."
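For the regression case mentioned above, the simplest possible "cause-effect" model is an ordinary least-squares line, which fits in a few lines of Python (the ad-spend numbers are invented for illustration):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, the simplest cause-effect model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# hypothetical ad spend (x) vs. sales (y)
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(f"sales ≈ {a:.1f} + {b:.1f} * spend")
```

The "art" Magestro describes starts after this step: deciding which variables belong in the model, whether the relationship is actually linear, and whether the correlation reflects a cause at all.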


Select Your Agile Approach That Fits Your Context


By definition, the team finishes the work at the end of that time. The PO decides if any unfinished work moves to the next iteration or farther down the product roadmap. If your team uses iterations as in Scrum, the iteration starts with the ranked backlog and ends with the demo and retrospective. If your team uses flow, you can demo and retrospect at any time. To be fair, iteration-based agile approaches don’t prevent you from demoing or retrospecting at any time. ... Teams might have trouble finishing stories in a timebox or iteration. There can be any number of reasons for their trouble. Here are three common problems I’ve seen: the stories are too large; the people are multitasking on several stories or, worse, projects; and the team is not working as a team to finish stories. If the team can’t finish because of multitasking, a cadence might make that even worse. However, visualizing their work might make a difference.



APIs Need to Be Released, Too!

Would it come as a surprise to hear that at the core of each and every one of these priorities are APIs and DevOps? So, just what is an API? API stands for Application Programming Interface and it’s a highly common software development term – an initialism you’re bound to have come across. In some form or another, development has always relied on interfaces. Without going too deep, APIs are primarily concerned with enabling communications between ‘private’ and ‘public’ interfaces. Private interfaces are used internally between individual developers and development teams. These aren’t accessible to third parties and can be changed as often as required. This is in stark contrast to public interfaces, which are exposed to third parties – be they internal or outside the company – and shouldn’t change often as other services using these interfaces may break or stop functioning.
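The private/public split shows up even at module level. In Python, for instance, a leading underscore conventionally marks an interface as private; a hypothetical payments module (all names invented) might look like:

```python
def charge(amount_cents: int) -> str:
    """Public interface: third parties depend on this signature, so it must
    stay stable across releases."""
    return _format_receipt(_process(amount_cents))

def _process(amount_cents: int) -> int:
    """Private helper (leading underscore): free to change at any time."""
    return amount_cents

def _format_receipt(total: int) -> str:
    """Also private: callers should never import this directly."""
    return f"charged {total} cents"

print(charge(250))
```

Releasing an API, then, is mostly about drawing this line deliberately: everything behind it can churn with each sprint, while everything in front of it needs versioning and deprecation discipline.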


Quantum physics boosts artificial intelligence methods


A popular computing technique for classifying data is the neural network method, known for its efficiency in extracting obscure patterns within a data set. The patterns identified by neural networks are difficult to interpret, as the classification process does not reveal how they were discovered. Techniques that lead to better interpretability are often more error-prone and less efficient. “Some people in high-energy physics are getting ahead of themselves about neural nets, but neural nets aren’t easily interpretable to a physicist,” said USC’s physics graduate student Joshua Job, co-author of the paper and guest student at Caltech. The new quantum program is “a simple machine learning model that achieves a result comparable to more complicated models without losing robustness or interpretability,” Job said.


How Close Are You Really?


The network of links between individuals—their social network—has long fascinated social scientists. These networks are neither random nor entirely ordered. Instead, they occupy a middle ground in which people are strongly linked to a few individuals they know well, with weaker links to a larger group of friends and coworkers plus extremely weak links to a wide range of casual acquaintances.  Social scientists measure the strength of these links using a variety of indicators, such as how often a person calls another, whether that call is reciprocated, the time the two people spend speaking, and so on. But these indicators are often difficult and time-consuming to measure. So network theorists would dearly love to have some way of measuring the strength of ties from the structure of the network itself.
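That last idea, inferring tie strength from structure alone, has a simple and well-known instantiation: neighborhood overlap, the fraction of friends two connected people share. A pure-Python sketch over an invented toy graph:

```python
# Neighborhood overlap as a structural proxy for tie strength:
# strongly tied people tend to share many mutual friends.

def overlap(graph, a, b):
    """Jaccard overlap of a's and b's neighborhoods, excluding a and b themselves."""
    na = graph[a] - {b}
    nb = graph[b] - {a}
    union = na | nb
    return len(na & nb) / len(union) if union else 0.0

# Toy, symmetric friendship graph (invented for illustration).
friends = {
    "alice": {"bob", "carol", "dave", "eve"},
    "bob":   {"alice", "carol", "dave"},
    "carol": {"alice", "bob"},
    "dave":  {"alice", "bob"},
    "eve":   {"alice"},                      # a casual acquaintance
}

print(overlap(friends, "alice", "bob"))      # strong tie: many shared friends
print(overlap(friends, "alice", "eve"))      # weak tie: no shared friends
```

The strong alice–bob tie scores well above zero while the alice–eve acquaintance scores exactly zero, recovered from the network structure alone, with no call logs needed.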


 The Future of Enigma and Data


In practice, to build a data marketplace, the Enigma protocol needs to implement the infrastructure for a decentralized database, with storage and computational abilities that far exceed those that blockchains offer. While all blockchains are, in a manner of speaking, protocols for decentralized computing and data storage, their poor scalability and lack of privacy features limit potential use-cases. We need a second-layer network that can handle more data, faster, and can provide better privacy features — and that’s where the Enigma protocol comes in. Our protocol is based on the ideas presented in the 2015 Enigma whitepaper, as well as in our subsequent work (paper, thesis). It aspires to complement a blockchain (of any kind) with an off-chain data network (essentially — a single, always-on decentralized database), in much the same way that payment networks (e.g., Raiden) offer better financial transactions scalability.



Could Your Reactive Cyber Security Approach Put You Out of Business?

One scenario could involve your organisation becoming the victim of ransomware, where an attacker hijacks your data and demands a ransom for its return. If you don’t pay up, your operations come to a screeching halt and your revenue plummets overnight. Another would be having sensitive customer or employee information fall into the wrong hands. This can lead to everything from identity theft to corporate espionage. Even basic information, like email addresses, phone numbers and billing addresses, can be of significant value to cyber criminals and open a can of worms. You also have to consider the level of disruption that comes along with an attack. Not only does downtime cost your business serious money, it can tarnish your brand reputation, and many customers may end up turning to competitors.


Digital brains are as error-prone as humans

Imagine a future where you are regularly stopped and searched by the police, based simply on bad information fed into a computer. That is the fear of one authority on the subject, who is concerned that human biases and errors are being programmed into machine learning systems.
The algorithms that make up these neural networks can unintentionally boost these biases, giving them undue importance in their decision making. Writing in the WSJ, Professor Crawford said: 'These systems “learn” from social data that reflects human history, with all its biases and prejudices intact.  'Algorithms can unintentionally boost those biases, as many computer scientists have shown. 'It’s a minor issue when it comes to targeted Instagram advertising but a far more serious one if AI is deciding who gets a job, what political news you read or who gets out of jail. 'Only by developing a deeper understanding of AI systems as they act in the world can we ensure that this new infrastructure never turns toxic.' Research has already demonstrated that AI systems trained using such data can be flawed.


The Role of Data in the Financial Sector


What makes the financial sector even more interesting from a big data standpoint is the constant stream of new regulations and reporting standards that bring new data sources and more complex metrics into financial systems. ... The ForEx markets, as mentioned earlier, trade 24 hours per day, from morning in Sydney to evening in New York, except for a small window during the weekend. Additionally, algorithmic trading has been used in the financial markets for a long time in one form or another. The NYSE introduced its Designated Order Turnaround (DOT) system in the early 1970s for routing orders to trading desks, where the orders were executed manually. Now, algorithmic trading systems break very large orders into smaller pieces that are executed automatically based on time, price, and volume, optimized for market parameters.
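The order-slicing step itself is easy to sketch. Assuming only the time dimension (real systems also condition on price and volume, as the excerpt notes), a TWAP-style slicer splits a parent order into equal child orders to be executed at regular intervals:

```python
# Minimal TWAP-style slicer: break a large parent order into equal
# child orders, spreading any remainder. Quantities are illustrative.

def twap_slices(total_qty, n_slices):
    """Split total_qty into n_slices child orders that sum exactly to total_qty."""
    base, rem = divmod(total_qty, n_slices)
    # The first `rem` slices get one extra unit so nothing is lost to rounding.
    return [base + (1 if i < rem else 0) for i in range(n_slices)]

print(twap_slices(100_000, 8))   # eight equal child orders of 12,500
```

Each child order would then be submitted on a timer; price- and volume-aware variants (VWAP, implementation shortfall) adjust the slice sizes dynamically instead of keeping them equal.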





Quote for the day:

"Defragmenting data silos is key for accelerating research."  -- Joerg Kurt Wegner


Daily Tech Digest - October 21, 2017

HPE bets on hybrid IT to combat public cloud

In a note covering the meeting, financial analyst Berenberg wrote that HPE’s strategy remains focused on three key areas. The first is simplifying hybrid IT using its datacentre technologies, systems software, and private and public cloud partnerships. The second area of focus is to support the so-called intelligent edge. This encompasses its offerings from Aruba in campus and branch networking, and the industrial internet of things (IoT) with products such as Edgeline and the Universal IoT software platform, said Berenberg. The third area of focus is HPE’s advisory, professional and operational services, which Berenberg said will include consumption-based pricing. The analyst said HPE’s decision to stop selling custom-designed commodity servers to the tier-one service providers, while continuing to sell them higher-margin products, was a risky business strategy.


GDPR Requirements for US Companies


An organization may decide not to do business with EU citizens to avoid having to comply with GDPR, but even that decision must be implemented correctly. If you maintain a website that uses cookies, and it can be accessed by EU citizens, GDPR applies. GDPR also applies to organizations of all sizes. It doesn’t matter if you are a small one-person practice or a large organization with thousands of employees. If you collect or process data on EU citizens, GDPR compliance is not optional. GDPR replaces the 1995 EU Data Protection Directive, which placed responsibility only on the data controller, not processors of data. If you processed data for another company (the controller), it would be that company that had to comply with past regulations. GDPR applies to both processors and controllers – both parties are now responsible for protecting the privacy rights of EU citizens.


Data: Lifeblood of the Internet of Things

As IoT matures and we gain more confidence in the technology, we will increasingly use it on more critical applications – like self-driving cars – where errors could lead to serious injury or disruption. This requires us to ensure critical and hypercritical data is prioritised – and in turn it will drive an enormous shift in how systems capture, manage, store, secure, and process information. Analytics, for instance, will increasingly need to happen in real-time and superior analytics will become a competitive advantage. IDC estimates that by 2025, about 20 percent of all data will be critical – meaning necessary for the continuity of daily life – and nearly 10 percent of that will be hypercritical, or directly impacting the health and wellbeing of users. Not all data is equally important, but the amount of hypercritical data generated by IoT is accelerating dramatically.


GetStream.io: Why We Switched from Python to Go


Go is extremely fast. The performance is similar to that of Java or C++. For our use case, Go is typically 30 times faster than Python. ... For many applications, the programming language is simply the glue between the app and the database. The performance of the language itself usually doesn’t matter much. Stream, however, is an API provider powering the feed infrastructure for 500 companies and more than 200 million end users. We’ve been optimizing Cassandra, PostgreSQL, Redis, etc. for years, but eventually, you reach the limits of the language you’re using. Python is a great language but its performance is pretty sluggish for use cases such as serialization / deserialization, ranking and aggregation. We frequently ran into performance issues where Cassandra would take 1ms to retrieve the data and Python would spend the next 10ms turning it into objects.
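The object-construction cost described here is easy to reproduce. In this sketch the `Activity` class and the row shape are invented stand-ins for database driver rows; absolute timings vary by machine, but building one Python object per row is typically far slower than handling the raw tuples:

```python
# Rough illustration of where the "next 10ms" goes: turning driver rows
# into rich Python objects allocates an object (and its dict) per row.
import timeit

class Activity:
    def __init__(self, actor, verb, object_id):
        self.actor = actor
        self.verb = verb
        self.object_id = object_id

# Fake "rows from Cassandra" (shape invented for illustration).
rows = [("user:%d" % i, "like", "post:%d" % i) for i in range(10_000)]

def to_objects(rows):
    return [Activity(*row) for row in rows]

# Object construction dominates; a raw list copy of the tuples is near-free.
print("construct objects:", timeit.timeit(lambda: to_objects(rows), number=10))
print("copy raw tuples:  ", timeit.timeit(lambda: list(rows), number=10))
```

This per-object overhead is exactly the kind of fixed cost that a compiled language like Go avoids, which is why it shows up so prominently in serialization-heavy API workloads.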


Survey shows most workers misunderstand cybersecurity

The report concluded by advocating employee education programs in order to create a culture of cybersecurity. However, keeping users informed about the current threat landscape isn't the difficult part of establishing appropriate cybersecurity. That award goes to keeping them focused on their behavior so they can understand how to make the right choices. As a system administrator I'm a big proponent of automation wherever possible and this includes patching/securing systems and devices. Proper automation can apply updates and restart systems automatically without establishing a reliance on the end users, run in the background in the form of anti-spam filters and antimalware software, or enforce policies intended to protect devices such as by mandating passwords or utilizing encryption.


How AI can help you stay ahead of cybersecurity threats

Barclays Africa is beginning to use AI and machine learning to both detect cybersecurity threats and respond to them. “There are powerful tools available, but one must know how to incorporate them into the broader cybersecurity strategy,” says Kirsten Davies, group CSO at Barclays Africa. ... AI and machine learning also let her deploy her people for the most valuable human-led tasks. “There is an enormous shortage of the critical skills that we need globally,” she says. “We've been aware of that coming for quite some time, and boy, is it ever upon us right now. We cannot continue to do things in a manual way.” ... San Jose-based engineering services company Cadence Design Systems, Inc., continually monitors threats to defend its intellectual property. Between 250 and 500 gigabits of security-related data flows in daily from more than 30,000 endpoint devices and 8,200 users.


Data Science, IT And Your Digital Transformation

As data science becomes more commonplace in enterprise environments, businesses are gaining a better understanding of how to deliver the experiences customers crave. Even so, when my company, DataScience.com, commissioned Forrester to survey more than 200 businesses last year, it found that only 22% were leveraging big data well enough to get ahead of their competition. Standing in the way of these companies on the road to a digital-first approach are challenges inherent to large-scale data management, governance and access. If handled incorrectly, data management and infrastructure problems can cripple a digital transformation; however, a great data management strategy will clear the way for data science success. Imagine a scenario in which your data science team predicts, with a high level of accuracy, how much money your customers are likely to spend with your business in the next three months and then delivers that information to you in the dashboard of a tool you already use.


Apache Kafka and the four challenges of production machine learning systems

The nature of machine learning and its core difference from traditional analytics is that it allows the automation of decision-making. Traditional reporting and analytics is generally an input for a human who would ultimately make and carry out the resulting decision manually. Machine learning is aimed at automatically producing an optimal decision. The good news is that by taking the human out of the loop, many more decisions can be made: instead of making one global decision (that is often all humans have time for), automated decision-making allows making decisions continuously and dynamically in a personalized way as part of every customer experience on things that are far too messy for traditional, manually specified business rules. The challenge this presents, though, is that it generally demands directly integrating sophisticated prediction algorithms into production software systems.
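The pattern can be sketched without any streaming infrastructure: a model scores every incoming event and a decision is emitted per event, with no human in the loop. The in-memory event list and the threshold model below are stand-ins for real Kafka topics and a trained model:

```python
# Sketch of automated, per-event decision-making. The "model" and the
# event schema are invented placeholders for a real trained model and
# a real event stream.

def score(event):
    # Stand-in for a trained model: predicts engagement from two features.
    return 0.7 * event["recency"] + 0.3 * event["affinity"]

def decide(event, threshold=0.5):
    """One decision per event, continuously, instead of one global rule."""
    return {"user": event["user"], "show": score(event) >= threshold}

events = [
    {"user": "a", "recency": 0.9, "affinity": 0.8},
    {"user": "b", "recency": 0.1, "affinity": 0.2},
]
decisions = [decide(e) for e in events]   # no human step between score and action
print(decisions)
```

In production, `events` would be consumed from a topic and `decisions` produced back to one, which is precisely the integration challenge the excerpt describes: the prediction code must live inside the production data path, not in an offline report.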


Machine Learning And The Future of Finance

Some brokerage and banking functions can be streamlined as investment advisers are cued to sales opportunities by software that considers customers’ past trades and investment preferences. Credit verification already uses a high degree of automation and could be made even more accurate by considering more complex patterns of the borrower and adjusting the weights to these factors based on changes seen in customer behaviour. All these software tools are used within the world of human activities that have consequences for individuals and whose rules are set by humans. For the immediate future, humans will need to give oversight to computer activities and continue to handle the exceptional situations. This is especially true since the reasons a program makes a decision are very different from the heuristics used by humans.


5 Pitfalls Of Self-Service BI

To get value out of BI tools, business units need to feed them data. In general, this means business units stand up and manage their own data marts — subsets of data warehouses that contain data specific to a business line. Because individual business units are typically responsible for all the hardware, software, and data that comprise their data marts in a self-service BI environment, those business units will inevitably create their own definitions and metrics. That's not such a big problem if that business unit is the only user of the data, but it becomes a large problem when trying to compare reports from different business units. "You've gone from a central model where there was tight control of the business metrics, and you've put that in the hands of the masses and it creates conflicting definitions," says Mariani. Mariani notes that during his tenure at Yahoo, the company's business units had myriad definitions of ad impressions and visits.
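The conflicting-definitions problem fits in a few lines: two business units computing "impressions" over the same events, one counting everything and one excluding bots, arrive at two different truths. The event data here is invented:

```python
# Same raw data, two unit-local metric definitions, two different answers.
events = [
    {"page": "home", "bot": False},
    {"page": "home", "bot": True},
    {"page": "ads",  "bot": False},
]

# Business unit A counts every event as an impression.
impressions_a = len(events)

# Business unit B excludes bot traffic from its definition.
impressions_b = sum(1 for e in events if not e["bot"])

print(impressions_a, impressions_b)   # 3 2 — reports that cannot be compared
```

Neither number is wrong by its own definition; the problem only appears when the two units' reports land in the same executive deck, which is Mariani's point about losing central control of metrics.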



Quote for the day:


"When human judgment and big data intersect there are some funny things that happen." -- Nate Silver


Daily Tech Digest - October 16, 2017

Blockchain can fix the sorry state of the real estate industry
A number of blockchain startups are working on tokenizing real estate ownership to overcome these challenges and open real estate investment to more people. An example is BitProperty, a platform that enables property owners to register their property on the blockchain and issue tokens, digital currencies that represent a share of their property. When a person wants to invest in the property, they can purchase any number of its corresponding tokens on BitProperty. Contractors and construction companies can use BitProperty to raise funds for their projects by launching initial coin offerings (ICOs). Anyone who wants to invest in the project can purchase the project’s tokens. In return, they’ll have a proportional share of the value and revenue of the finished project in the future.
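The economics of that proportional share can be sketched off-chain: a ledger maps holders to token counts, and revenue is paid out pro rata. A real system would enforce this with on-chain contracts; all numbers and names here are illustrative:

```python
# Toy model of tokenized ownership: holders receive revenue in
# proportion to the tokens they hold.

def distribute(ledger, revenue):
    """Split revenue pro rata across token holders."""
    total = sum(ledger.values())
    return {holder: revenue * qty / total for holder, qty in ledger.items()}

ledger = {"alice": 600, "bob": 300, "carol": 100}   # 1,000 tokens issued
payouts = distribute(ledger, 50_000.0)              # e.g. a quarter's rental income
print(payouts)
```

Holding 600 of 1,000 tokens entitles alice to 60% of the revenue, which is the entire investment proposition in one line of arithmetic; the blockchain's role is making the ledger itself tamper-evident and transferable.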


For a time, Linux OS computers were expected to become the dominant third player in the PC market, Huang said. For example, many thought Linux would be the ideal OS for netbooks, which peaked in popularity in the late 2000s/early 2010s. But Microsoft captured that market by “bottom-ending” Windows to run on the low-powered portables. There’s still talk that Linux could emerge as the dominant OS for thin clients. But IDC’s data doesn’t support that belief. Linux comprised a 3% share of global PC shipments in 2013, but it’s held steady at a mere 1% since 2015 and is expected to stay at 1% through 2021. By comparison, Chrome OS has risen from 1% of the market in 2013 to its current position of 5.5% in 2017, and IDC expects it to reach 8% by 2021.



Cybersecurity: into the data breach
The vulnerabilities stakeholders face include cybersecurity, data privacy, data breaches, and payments fraud. The utmost vigilance is required to protect organisations against cyber attacks, and all stakeholders, including regulators, must be more proactive about cybersecurity, taking ownership of the issue to prevent attacks. In the new payments ecosystem, third-party developers can directly interact with a partner bank’s customers, raising questions about data privacy and security. In an increasingly networked ecosystem, identifying the source of an attack will be a challenge. Verizon’s 2017 Data Breach Investigations Report found that security incidents and data breaches affect large and small financial organisations almost equally. However, the security of larger banks is harder to compromise because they invest more in cybersecurity solutions. Smaller banks, which do not have the same access to resources, are more prone to cyber attacks.


A soon-to-be published study shows how the traditional corporate human resources operation actually hampers cybersecurity hiring against a backdrop of the industry's well-documented talent gap. The Jane Bond Project report, commissioned by security talent recruiting firm CyberSN, found that in addition to the lack of available talent for those positions, respondents say their HR generalists are not equipped to recruit and hire cybersecurity talent, and that flawed salary data complicates their ability to issue the best job offers. More than 80% of the 83 cybersecurity positions studied in the report ended up with compensation offers higher than the salary caps stated in the original job descriptions. Half of the 52 organizations participating in the study say they had to up the compensation offers to seal the deal. The positions in the study include security engineers, product sales engineers, incident response analysts, SOC analysts, and product security experts.



Obviously, no company is going to replace all of its hardware overnight, as doing so would entail considerable expense, along with implementation and architecture challenges that, until resolved, could impact company operations. In addition, there would be plenty of non-technical issues, like employees who know device X and network OS Y like the palm of their hands and aren't looking forward to the time it might take to learn new technology and processes. When a company decides to transform to a software-defined networking infrastructure, it may not get support from its existing network hardware vendor, which might be enjoying hefty margins on network hardware sales and won't be thrilled to push a technology that makes its expensive boxes replaceable with cheap, vendor-agnostic white boxes.


According to Badman, Extreme's Automated Campus initiative shows great promise due, in part, to 802.1aq shortest path bridging, which supplants routing protocols such as Border Gateway Protocol (BGP), MPLS and Open Shortest Path First (OSPF), thereby reducing complexity. The new network fabric also includes hypersegmentation to contain security breaches, APIs to increase interoperability, and user and device policies that drive automated network changes in conjunction with analytics and changes on the edge. Badman said he views Avaya as one of the leaders of software-defined networking fabrics, adding that Extreme has succeeded in integrating Avaya fabrics since it acquired the vendor. "I'm of the opinion that some vendors are trying to figure out how to proceed with network-wide fabric methods, while painting beta-grade efforts up with glitz and catchy slogans. This just isn't the case for Extreme," he wrote.


Will the Internet of Things rewrite the rules on cyber security?
Despite having capable teams of programmers and rigorous testing procedures, many companies – be they retailers, manufacturers, or service providers – still have a hard time seeing the potential vulnerabilities in their own systems. “There are a lot of companies who think ‘this will never happen’ and then they come back to us six months later saying ‘it happened’,” says Kupev. The challenge, he explains, is being able to look at things from a different point of view. “Often a client’s view of things can be quite narrow because they’re used to looking at things from the same perspective,” he adds. “Our job is to help them look at matters from a different angle and uncover vulnerabilities they would have otherwise missed.” To illustrate his point, Kupev tells the story of an engine maker that invested heavily in ensuring a device’s “regular” communications systems are secure.


The ability to decrypt packets can be used to decrypt TCP SYN packets. This allows an adversary to obtain the TCP sequence numbers of a connection, and hijack TCP connections. As a result, even though WPA2 is used, the adversary can now perform one of the most common attacks against open Wi-Fi networks: injecting malicious data into unencrypted HTTP connections. For example, an attacker can abuse this to inject ransomware or malware into websites that the victim is visiting. If the victim uses either the WPA-TKIP or GCMP encryption protocol, instead of AES-CCMP, the impact is especially catastrophic. Against these encryption protocols, nonce reuse enables an adversary to not only decrypt, but also to forge and inject packets. Moreover, because GCMP uses the same authentication key in both communication directions, and this key can be recovered if nonces are reused, it is especially affected.
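Why nonce reuse is so damaging for a stream-cipher mode can be shown directly: the same key and nonce produce the same keystream, so XORing two ciphertexts cancels the keystream and leaks the XOR of the plaintexts, with no key required. The hash-based keystream below is a toy stand-in for the AES-derived keystream that CCMP/GCMP actually use:

```python
# Demonstration of the nonce-reuse failure mode behind KRACK.
import hashlib

def keystream(key, nonce, length):
    # Toy PRF standing in for an AES-based keystream generator.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

key, nonce = b"secret-key", b"nonce-0"
p1, p2 = b"attack at dawn", b"attack at dusk"

c1 = xor(p1, keystream(key, nonce, len(p1)))   # nonce reused across
c2 = xor(p2, keystream(key, nonce, len(p2)))   # two encryptions!

# The keystream cancels out: c1 XOR c2 equals p1 XOR p2.
print(xor(c1, c2) == xor(p1, p2))   # True
```

From `p1 XOR p2` and any known plaintext fragment, an attacker recovers the other plaintext; this is the decryption half of the attack, before the forgery problems specific to TKIP and GCMP even come into play.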


Metered DDoS pricing used to be more common, said Theresa Abbamondi, director of product management for Arbor Cloud and Services at Arbor Networks, Inc. That created a risk for customers, she said. Arbor has priced based on clean traffic since it launched its service four years ago, making it one of the first vendors to do so. "Most of the purpose-built anti-DDoS vendors quickly moved to this type of clean traffic pricing model, and it became the standard in the high end of the market," she said. "Among vendors like Cloudflare, who sell DDoS as an add-on service to a customer base more interested in the vendor’s core offerings, it’s still common today to see vendors limiting the total bandwidth of traffic they will scrub, blackholing traffic that exceeds that threshold, or hitting the customer with exorbitant, hidden fees," said Abbamondi.


New derived credential technology eliminates the need for a physical card by placing verified identity credentials directly and securely onto the mobile device, much as mobile-pay systems do away with the need to make payments using a plastic credit card. This technology offers the added benefits of making identity verification more convenient, and preventing unauthorized logins. But derived credentials and authentication tools such as biometrics offer only a one-time, “snapshot” form of user verification. Once the user has passed the initial test and gained access, the device and everything on it become fully available for viewing and use. Behavioral analytics promises to change this paradigm.  ... tools designed to capture how a device is used can provide the equivalent of a continuously-authenticating security “video,” to detect interlopers, transaction by transaction.
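A minimal version of that continuous check: compare each new behavioral sample (here, intervals between keystrokes) against the user's baseline and flag large deviations. The baseline data and the z-score threshold are invented for illustration:

```python
# Sketch of continuous behavioral authentication via a simple
# z-score anomaly check on typing rhythm.
import statistics

baseline = [120, 130, 125, 118, 127, 122]   # ms between keystrokes (invented)

def is_anomalous(sample_ms, history, z_threshold=3.0):
    """Flag a sample that deviates from the user's baseline by > z_threshold sigmas."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(sample_ms - mean) / stdev > z_threshold

print(is_anomalous(124, baseline))   # normal rhythm
print(is_anomalous(310, baseline))   # possible interloper at the keyboard
```

Run on every transaction rather than once at login, this is the "security video" idea in miniature: the session stays authenticated only while the behavior keeps matching the owner's profile.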



Quote for the day:



"Anger is the feeling that makes your mouth work faster than your mind." -- Evan Esar


Daily Tech Digest - October 15, 2017

Data governance stumble

Data governance involves data quality, ownership and security, metadata, and analytics processes. In most organizations, the word “governance” tends to throw off staff, who can become confused with what data governance entails in the organization and what their specific role is. In order to clear up the role of data governance in the business, it should be defined more in terms of data quality and how higher quality data can advance the efficiency of the business. High data quality should be the fundamental aim for any data governance campaign, and it should be the key area of focus. In fact, research by Gartner showed that poor data quality cost organizations an average of $8 million a year.
Challenges to the proper implementation of data governance

Many businesses fall prey to spending too much time on defining a data governance model, to the point that it hinders their organization from becoming data driven.
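Putting quality first can start with something as small as measuring it. A sketch of two standard data-quality metrics, completeness and validity, over invented records:

```python
# Two basic data-quality metrics: completeness (field is populated)
# and validity (field passes a business rule). Records are invented.

records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None,            "age": 29},
    {"id": 3, "email": "c@example.com", "age": -4},   # invalid age
]

def completeness(records, field):
    """Fraction of records where the field is populated."""
    return sum(1 for r in records if r[field] is not None) / len(records)

def validity(records, field, rule):
    """Fraction of records where the field is populated and passes the rule."""
    return sum(1 for r in records if r[field] is not None and rule(r[field])) / len(records)

print(round(completeness(records, "email"), 2))              # 0.67
print(round(validity(records, "age", lambda a: a >= 0), 2))  # 0.67
```

Tracking a handful of such numbers per dataset gives a governance effort concrete targets, which is a far easier sell to staff than an abstract governance model.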


How to shape your customer experience with big data


It’s clear that the volume of data is increasing, and shows no signs of slowing down. As such, brands will have to move beyond using data just to create internal reports to satisfy internal stakeholders; they need to start leveraging cross-departmental data to deliver insights and better shape the customer experience. This comes on the back of companies undergoing digital transformation in Asia Pacific, where the first wave of change was migration to the cloud and expanding digital capabilities. With the right tools, the next wave will see marketers using both public and owned data to drive measurable outcomes and enhance customer experience through personalisation. It all sounds great, but where can companies start? Here are some big brands that have successfully managed to embrace data analytics to fuel efficiency and growth.


EA - Why You Should Think About The Enterprise Continuum!

We were really thinking hard about whether TOGAF should stay in the Information Technology (IT) space versus the “enterprise” space, with most of us thinking we needed to go beyond IT architecture. When we agreed to proceed in the enterprise direction, many things emerged as important, including thinking about services – not just IT services, but business-oriented services. We thought a lot about building blocks. As a matter of fact, if you got a hold of an old version of TOGAF, you would see interesting treatment of building blocks and services, and how they would be used in the Architecture Development Method (ADM). Of course, the subject of building blocks generated a need to distinguish between architecture building blocks and solution building blocks, and their relationship. Additional discussion uncovered the observation that there were common problems across enterprises addressed by different architectures and solutions.


Innovation, Tradition, And Striking the Balance


My son turns eleven today. We are all set to celebrate as we always do – our kids love the traditions that come with birthdays, Christmas, Thanksgiving, college football, and too many other events to mention. The house is decorated exactly the same for every birthday. I’m told they love it that way. There will be a special dinner, as always. All this tradition and consistency got me thinking. My children certainly love new things and surprises: new adventures, trips to unknown places, crazy experiences. And still, for a handful of personal milestones, they seem to want – to need – something familiar and dependable. Certainly, that is to be expected. New experiences bring excitement, anticipation of something unknown, and the possibility of “total awesomeness” (which, I have to imagine, is what the kids are saying nowadays). Those traditions, the patterns sought out by their own brains, bring them a sense of stability, safety, and comfort.


Why Marketing Needs AI

The largest costs in marketing are human-related, from people to make content at scale to running advertising programs. These costs scale upwards at a disproportionate rate to impact delivered; adding more marketers scales at best linearly, because humans only have 24 hours in a day and do any one task relatively slowly. Compare that with the capabilities of machine learning and artificial intelligence. If I have an analysis problem to solve and sufficient cloud computing infrastructure, instead of having one computer work on the problem, I simply “hire” thousands of temporary computers to instantly complete the job. Once done, those computers move onto other tasks. I could never hire thousands of people in a second and lay them off seconds later – but I can with machines. If all the tasks in marketing were ideally suited for the ways humans work, this solution wouldn’t be much of a solution at all.
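The "hire a thousand computers, then let them go" idea has a direct miniature in code: a worker pool is created for one job and torn down immediately afterwards. Threads stand in for machines here, and the per-campaign analysis is a placeholder:

```python
# Elastic compute in miniature: workers exist only for the duration of
# the job. The workload and worker count are illustrative.
from concurrent.futures import ThreadPoolExecutor

def score_campaign(campaign_id):
    # Stand-in for an expensive per-campaign analysis.
    return campaign_id, campaign_id * 2

with ThreadPoolExecutor(max_workers=8) as pool:   # "hired" for this job only
    results = dict(pool.map(score_campaign, range(100)))
# Leaving the `with` block "lays off" the workers instantly.

print(results[42])   # 84
```

Swapping the thread pool for a cloud batch service scales the same pattern to thousands of machines, which is the asymmetry with human labor the paragraph describes.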


Fintech: Too large to ignore, too complex to regulate


Technology-neutral regulation refers to a specific regulatory process under which rules and regulations prevent service providers from preferring one type of technology over another in offering their services, although some experts, such as Professor Matthias Lehmann of the University of Bonn, find the definition of technological neutrality to be ambiguous. While innovation used to be regarded more positively before the 2008 financial crisis, according to Arner, Patrick Armstrong, Senior Risk Analysis Officer on the Innovation and Products Team at the European Securities and Markets Authority (ESMA), pointed out that "regulations are there as a response to the market failure from 10 years ago".  He argued that, in dealing with fintech, regulators act differently, very much depending on the technology involved and the risk that it carries.


Fintech Malaysia Report 2017

Malaysia’s regulators have in recent years taken an open but cautious approach towards regulating fintech. Since the appointment of Tan Sri Muhammad Ibrahim as the new Governor of the Central Bank of Malaysia in 2016, we’ve seen several key reforms and regulations introduced, most notably the announcement of Malaysia’s fintech regulatory sandbox. The Fintech Sandbox is open to all fintech companies, including those without a presence in Malaysia; the prerequisite is that the company must have a genuinely innovative solution that fills a gap in the market. Companies are not required to work with a bank, but Bank Negara Malaysia encourages it. Once approved into the sandbox, fintech companies have a 12-month testing period.


Rational Agents for Artificial Intelligence


The path you take will depend on the goals of your AI and how well you understand the complexity and feasibility of various approaches. In this article we will discuss the approach that is considered more feasible and general for scientific development: the study of the design of rational/intelligent agents. ... There are four types of agents in general, varying in the level of intelligence or the complexity of the tasks they are able to perform. All types can improve their performance and generate better actions over time; these can be generalized as learning agents. ... As agents get more complex, so does their internal structure, including the way in which they store internal state. By its nature, a simple reflex agent does not need to store a state, but the other types do.
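The reflex/model-based distinction is concrete in code: a simple reflex agent maps the current percept straight to an action, while a model-based agent also keeps internal state (here, which locations it has already visited). The vacuum-world setup is the classic textbook illustration, with invented details:

```python
# Two agent designs: stateless reflex vs. stateful model-based.

class SimpleReflexAgent:
    def act(self, percept):
        # No stored state: the decision depends only on the current percept.
        return "suck" if percept == "dirty" else "move"

class ModelBasedAgent:
    def __init__(self):
        self.visited = set()          # internal state the reflex agent lacks

    def act(self, location, percept):
        self.visited.add(location)
        if percept == "dirty":
            return "suck"
        # It can decide it is finished, which requires remembering the past.
        return "done" if len(self.visited) >= 2 else "move"

reflex = SimpleReflexAgent()
model = ModelBasedAgent()
print(reflex.act("dirty"))       # suck
print(model.act("A", "clean"))   # move
print(model.act("B", "clean"))   # done — it remembers where it has been
```

The reflex agent can never conclude "done" because it has no memory of past percepts; adding that memory is exactly the step up in internal structure the paragraph describes.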


Can a blockchain tech revolutionize corporate deposits?

“What’s been happening over the last five years in the banking industry is banks have been reviewing customers and are looking more closely at the profitability of each client,” Aidoo said. “As a result, banks may turn away less profitable clients.” Yet Aidoo believes there is a solution: the Utility Settlement Coin, which relies on blockchain technology. ... The Utility Settlement Coin is a smart contract that is held at a central bank as collateralized cash. It lets banks accept deposits from corporations and turn some of them into settlement coins, which are really balances held at a central bank. Say there are five casinos in a small area and a gambler buys chips with U.S. dollars at a cashier window at one of them. If the casinos were all using Utility Settlement Coin, the gambler could go to any of the casinos with the same chips and they would honor them, letting the gambler exchange them for cash.


6 Industries That Could Be Forever Changed by Blockchain


The future is about to change with blockchain. A blockchain is essentially a continuously growing list of records, called blocks, which are linked and secured using cryptography. The first work on a cryptographically secured chain of blocks was described in 1991 by Stuart Haber and W. Scott Stornetta. While blockchain is still fairly new to most consumers, experts are beginning to understand that banking and payments aren't the only industries that could be affected by blockchain technology. Other industries could also be affected by this new phenomenon in the future. With every paradigm shift, there are winners and losers, and just as the internet disrupted the way we communicate, blockchain will disrupt a number of industries. The world's crypto-currency market is worth more than 100 billion dollars. Startups are already using blockchain to push transparency and trustworthiness within the digital information ecosystem.



Quote for the day:


"You grow up the day you have your first real laugh at yourself." -- Ethel Barrymore