Daily Tech Digest - October 24, 2017

The Mastercard blockchain is a permissioned blockchain, which will allow participants to maintain the distributed ledger without sacrificing scalability or performance, Sota explains in the video. ... "Our blockchain technology can be used for clearing in near real-time card payment transactions, eliminating consolidation and improving settlement," he said. According to Mastercard, its technology boasts four key differentiators from others in the space, spanning privacy, flexibility, scalability, and the reach of the company's settlement network. Mastercard said its blockchain provides privacy by ensuring that transaction details are shared only amongst the participants of a transaction while maintaining a fully auditable and valid ledger of transactions, but still allowing partners to use the blockchain APIs alongside other Mastercard APIs.


IT, OT, IoT: Does Hitachi Have a Dictionary for This Alphabet Soup?

At one end of the spectrum, and most notably in this “industrial reinvented as software” class, is GE. GE, better known for building gas plants, jet engines and wind turbines, is reinventing itself as a software company. Under former CEO Jeff Immelt, and current head of all things digital, Bill Ruh, the company is investing hundreds of millions of dollars to build capability in the software space. GE is applying its Predix software offering to its own business units and, more importantly, is attempting to become the software provider of choice for a host of third-party industrial organizations. At the other end of the spectrum lie the traditional technology vendors who, despite not having significant industrial experience themselves, have long histories of delivering technologies to industrial operations.


Architecture Patterns to Consider When Designing an Enterprise Data Lake

Virtually every enterprise-level organization requires encryption for stored data, if not universally, at least for most classifications of data other than that which is publicly available. All leading cloud providers support encryption on their primary object store technologies (such as AWS S3) either by default or as an option. Likewise, the technologies used for other storage layers such as derivative data stores for consumption typically offer encryption as well. Encryption key management is also an important consideration, with requirements typically dictated by the enterprise’s overall security controls. Options include keys created and managed by the cloud provider, customer-generated keys managed by the cloud provider, and keys fully created and managed by the customer on-premises.
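The three key-management options above can be sketched as bucket-level default-encryption settings. A minimal illustration, assuming AWS S3 as the object store; the dict shape follows the S3 PutBucketEncryption API, and the function name and KMS key ARN are placeholders:

```python
# Illustrative sketch of the three key-management options, expressed as
# AWS S3 default-encryption configurations. The KMS key ARN is hypothetical.

def default_encryption_config(option: str) -> dict:
    """Return a bucket default-encryption rule for a given key-management option."""
    if option == "provider-managed":
        # Keys created and managed by the cloud provider (SSE-S3).
        sse = {"SSEAlgorithm": "AES256"}
    elif option == "customer-kms":
        # Customer-generated keys managed by the cloud provider (SSE-KMS).
        sse = {
            "SSEAlgorithm": "aws:kms",
            "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE",
        }
    elif option == "customer-onprem":
        # Keys held on-premises: data is encrypted client-side before upload,
        # so no server-side encryption rule applies.
        return {}
    else:
        raise ValueError(f"unknown option: {option}")
    return {"Rules": [{"ApplyServerSideEncryptionByDefault": sse}]}

print(default_encryption_config("provider-managed"))
```

The returned dict could be passed to a call such as boto3's `put_bucket_encryption`; the point is simply that each key-management choice reduces to a different rule shape.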


Why Tech Giants See Singapore As The Next AI Hub


Singapore-based Marvelstone on Monday dovetailed the announcement by the Chinese conglomerate – owner of the South China Morning Post – by revealing it was setting up an AI hub of its own in the city state, which would incubate 100 startups every year. It said its hub would be “the world’s biggest” when it opens next year. ... The government also showed it is serious about the country’s AI prospects when it announced the development of a dedicated data science consortium, and pledged Sg$150 million to industry research. In the Lattice80 complex, located in Singapore’s central business district, Ko said he was confident the government would follow through with its pledge to foster the industry. “Firstly, it’s about diversity … other Asian cities like Tokyo are also trying to be AI hubs, but they are more homogenous. Singapore’s advantage is that it is welcoming to all, and there is strong government support,” he said.


The prevalence of AI-powered IoT devices inspires mixed emotions

The easy path forward would be to continue developing connected devices without taking people’s fears into consideration. However, this is both unethical and inadvisable from a practical standpoint. Unsecured devices put multiple parties at risk, from the person using the product to the company pulling data from it. A better approach to the situation lies in analyzing the strengths, weaknesses, opportunities, and threats AI- and IoT-enabled devices offer. This will require addressing such pain points as IoT standards, privacy measures, and security. It could also involve education, job training, and general change management. But whether we’re looking at something as mundane as faster streaming or as grand as smart cities, the internet of things — when bolstered by artificial intelligence — has potential to impact every aspect of our lives.


Three Things Data Scientists Can Do To Help Themselves And Their Organizations

In the brave new world of business analytics fueled by big data, there has been significant discussion about the evolving roles of C-suite executives, including the CEO, CTO, and CIO. That discussion is now expanding to include the CMO plus the new roles of CDO and CDS. I do not have an MBA and I usually don’t undertake risky behavior, such as telling a CEO how to run her or his business. However, it is entirely appropriate for the CMO, CDO, and CDS to step up to the challenges of leading and directing the analytics, big data, and data science efforts of their organization, respectively. It is also appropriate for these execs to stand firm against corporate cultures and naysayers that resist big data analytics projects with these types of remarks: a) “Let’s wait and see how it develops elsewhere”; b) “We have always done big data”; or c) “What’s the ROI? Show me the numbers.”


The cryptoeconomics of scaling blockchains


A key shortcoming of the current generation of blockchain technologies is their limits when it comes to performance and scalability. For instance, the entire Bitcoin network can only handle seven transactions per second, compared with over 2,000 transactions per second on the VISA network and millions of transactions per second handled by any top-tier consumer application. That has made it impossible for the current generation of blockchain networks to handle big data applications. Is the poor performance of blockchains an engineering problem? It is not, at least not entirely. The problem is actually inherent to the incentive-driven design of blockchains, known as cryptoeconomics. Incentives in Bitcoin consensus: Blockchain is useful because it allows untrusted and non-cooperating parties to work together and maintain a system. Let’s look at the example of the Bitcoin network.
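The seven-transactions-per-second figure follows directly from Bitcoin's block parameters. A rough back-of-the-envelope sketch, using commonly cited 2017-era values (the byte sizes are typical figures, not protocol constants):

```python
# Back-of-the-envelope check of the ~7 tx/s ceiling cited above.
BLOCK_SIZE_BYTES = 1_000_000   # 1 MB block size limit (pre-SegWit era)
AVG_TX_BYTES = 250             # commonly cited average transaction size
BLOCK_INTERVAL_S = 600         # ~10 minutes between blocks

tx_per_block = BLOCK_SIZE_BYTES / AVG_TX_BYTES     # ~4,000 transactions per block
tx_per_second = tx_per_block / BLOCK_INTERVAL_S    # ~6.7 tx/s
print(round(tx_per_second, 1))
```

The throughput ceiling is thus structural: it comes from the block size and block interval, not from how fast any individual node's hardware is.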


Stuck between Design Thinking and Lean Startup? Take a hybrid approach


There are now so many different kinds of innovation: design innovation, business model innovation, digital innovation. And so many ways to organize for innovation: innovation labs, innovation centers, corporate accelerator programs. More significantly, two schools of thought have grown up in corporate innovation: Design Thinking and Lean Startup. Suddenly corporate innovators feel the need to be trained in both. But many consultancies practice or train in only one. And wherever corporate innovators sit, there is growing pressure to be more entrepreneurial. More agile. To increase speed to market. To be more like that startup accelerator your boss visited. ... The best way to tackle this would be to learn about these new approaches, test them on real innovation projects, and then adapt them so that they’re really practical and work in corporations.


What Are The Security Threats For The Cloud

Surprisingly, although cloud security is so important given the data breaches we have seen around the globe, over 40% of IT managers have no plans to purchase ‘security-as-a-service’ solutions. This raises the question of how well such companies are prepared for a future where the cloud becomes more and more important and criminals target cloud solutions on a wider scale. The security of the data in your cloud is vital for companies. Being hacked can have serious consequences for a company as well as on a personal level, as the Target CEO, who was fired after a data breach, discovered. Once your cloud is hacked, your company has a serious issue, depending on the severity of the hack. Therefore it is wise to be aware of the security issues when dealing with the cloud. This infographic might help to achieve that.


Tech Giants Are Paying Huge Salaries for Scarce A.I. Talent


At the top end are executives with experience managing A.I. projects. In a court filing this year, Google revealed that one of the leaders of its self-driving-car division, Anthony Levandowski, a longtime employee who started with Google in 2007, took home over $120 million in incentives before joining Uber last year through the acquisition of a start-up he had co-founded, a deal that drew the two companies into a court fight over intellectual property. Salaries are spiraling so fast that some joke the tech industry needs a National Football League-style salary cap on A.I. specialists. “That would make things easier,” said Christopher Fernandez, one of Microsoft’s hiring managers. “A lot easier.” There are a few catalysts for the huge salaries. The auto industry is competing with Silicon Valley for the same experts who can help build self-driving cars. Most of all, there is a shortage of talent, and the big companies are trying to land as much of it as they can. Solving tough A.I. problems is not like building the flavor-of-the-month smartphone app.



Quote for the day:


"Nothing so conclusively proves a man's ability to lead others as what he does from day to day to lead himself." -- Thomas J. Watson


Daily Tech Digest - October 23, 2017

Companies are starting to cast the net farther afield, taking on graduates from a far wider range of disciplines. Virtusa often looks for people with a background in the arts, says Gabrault, because alongside their analytical skills they are creative and can play a key role in user experience, and make sure a product is actually something people want to interact with. Teamwork is also important. IoT is not about beavering away on solo projects, but involves interaction with other teams, end users and customers. “Candidates need to show that they can empathise with the client,” adds Owen. Helping students become “work-ready” is one of the driving forces behind Fast Track, a programme run by the Future of British Manufacturing. It matches students from some of the UK’s leading universities with companies, to help them develop their next big innovation or connected product.


Demystifying The Dark Science Of Data Analytics

Deeper analytics knowledge can also help IT leaders understand why the approach often seems so mysterious. "Data science, in its best form, is an extremely creative endeavor," Johnston says. "There is not necessarily a need for managers to understand the internals of every analysis, just as owners of a software project need not understand the underlying technological internals." ... Unlike IT, where solutions are often obvious and widely adopted by enterprises worldwide, analytics processes are frequently unique and individualized. "Choosing the best analytical method is sometimes straightforward, sometimes art," Magestro says. "For example, looking for cause-effect relationships in data usually means some kind of regression, and looking for similar characteristics in large customer datasets likely involves clustering algorithms."





Select Your Agile Approach That Fits Your Context


By definition, the team finishes the work at the end of that time. The PO decides if any unfinished work moves to the next iteration or farther down the product roadmap. If your team uses iterations as in Scrum, the iteration starts with the ranked backlog and ends with the demo and retrospective. If your team uses flow, you can demo and retrospect at any time. To be fair, iteration-based agile approaches don’t prevent you from demoing or retrospecting at any time. ... Teams might have trouble finishing stories in a timebox or iteration. There can be any number of reasons for their trouble. Here are three common problems I’ve seen: the stories are too large; the people are multitasking on several stories or worse, projects; and the team is not working as a team to finish stories. If the team can’t finish because of multitasking, a cadence might make that even worse. However, visualizing their work might make a difference.



APIs Need to Be Released, Too!

Would it come as a surprise to hear that at the core of each and every one of these priorities are APIs and DevOps? So, just what is an API? API stands for Application Programming Interface and it’s a highly common software development term – an initialism you’re bound to have come across. In some form or another, development has always relied on interfaces. Without going too deep, APIs are primarily concerned with enabling communications between ‘private’ and ‘public’ interfaces. Private interfaces are used internally between individual developers and development teams. These aren’t accessible to third parties and can be changed as often as required. This is in stark contrast to public interfaces, which are exposed to third parties – be they internal or outside the company – and shouldn’t change often as other services using these interfaces may break or stop functioning.


Quantum physics boosts artificial intelligence methods


A popular computing technique for classifying data is the neural network method, known for its efficiency in extracting obscure patterns within a data set. The patterns identified by neural networks are difficult to interpret, as the classification process does not reveal how they were discovered. Techniques that lead to better interpretability are often more error-prone and less efficient. “Some people in high-energy physics are getting ahead of themselves about neural nets, but neural nets aren’t easily interpretable to a physicist,” said USC’s physics graduate student Joshua Job, co-author of the paper and guest student at Caltech. The new quantum program is “a simple machine learning model that achieves a result comparable to more complicated models without losing robustness or interpretability,” Job said.


How Close Are You Really?


The network of links between individuals—their social network—has long fascinated social scientists. These networks are neither random nor entirely ordered. Instead, they occupy a middle ground in which people are strongly linked to a few individuals they know well, with weaker links to a larger group of friends and coworkers plus extremely weak links to a wide range of casual acquaintances.  Social scientists measure the strength of these links using a variety of indicators, such as how often a person calls another, whether that call is reciprocated, the time the two people spend speaking, and so on. But these indicators are often difficult and time-consuming to measure. So network theorists would dearly love to have some way of measuring the strength of ties from the structure of the network itself.
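One common structural proxy for tie strength, illustrating the kind of measure the paragraph above is after, is neighborhood overlap: the fraction of two people's other contacts that they share. This is a generic sketch of that idea, not the specific measure used in any particular study.

```python
# Neighborhood overlap as a structural proxy for tie strength:
# strong ties tend to share many mutual contacts, weak ties few.

def neighborhood_overlap(graph: dict, u: str, v: str) -> float:
    """Jaccard overlap of u's and v's neighbors, excluding u and v themselves."""
    common = (graph[u] & graph[v]) - {u, v}
    union = (graph[u] | graph[v]) - {u, v}
    return len(common) / len(union) if union else 0.0

# Tiny example network as an adjacency dict of sets.
graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c", "e"},
    "c": {"a", "b"},
    "d": {"a"},
    "e": {"b"},
}
print(round(neighborhood_overlap(graph, "a", "b"), 3))  # 0.333
```

The appeal is exactly what the excerpt describes: the score is computed from the network structure alone, with no need to survey call frequency or conversation time.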


The Future of Enigma and Data


In practice, to build a data marketplace, the Enigma protocol needs to implement the infrastructure for a decentralized database, with storage and computational abilities that far exceed those that blockchains offer. While all blockchains are, in a manner of speaking, protocols for decentralized computing and data storage, their poor scalability and lack of privacy features limit potential use-cases. We need a second-layer network that can handle more data, faster, and can provide better privacy features — and that’s where the Enigma protocol comes in. Our protocol is based on the ideas presented in the 2015 Enigma whitepaper, as well as in our subsequent work (paper, thesis). It aspires to complement a blockchain (of any kind) with an off-chain data network (essentially — a single, always-on decentralized database), in much the same way that payment networks (e.g., Raiden) offer better financial transactions scalability.



Could Your Reactive Cyber Security Approach Put You Out of Business?

One scenario could involve your organisation becoming the victim of ransomware where an attacker hijacks your data and demands compensation for it. Without paying up, your operations come to a screeching halt, and your revenue plummets overnight. Another would be having sensitive customer or employee information fall into the wrong hands. This can lead to everything from identity theft to corporate espionage. Even basic information, like email addresses, phone numbers and billing addresses can be of significant value to cyber criminals and open a can of worms. You also have to consider the level of disruption that comes along with an attack. Not only does downtime cost your business serious money, it can tarnish your brand reputation, and many customers may end up turning to competitors.


Digital brains are as error-prone as humans

Imagine a future where you are regularly stopped and searched by the police, based simply on bad information fed into a computer. That is the fear of one authority on the subject, who is concerned that human biases and errors are being programmed into machine learning systems.
The algorithms that make up these neural networks can unintentionally boost these biases, giving them undue importance in their decision making. Writing in the WSJ, Professor Crawford said: 'These systems “learn” from social data that reflects human history, with all its biases and prejudices intact.  'Algorithms can unintentionally boost those biases, as many computer scientists have shown. 'It’s a minor issue when it comes to targeted Instagram advertising but a far more serious one if AI is deciding who gets a job, what political news you read or who gets out of jail. 'Only by developing a deeper understanding of AI systems as they act in the world can we ensure that this new infrastructure never turns toxic.' Research has already demonstrated that AI systems trained using such data can be flawed.


The Role of Data in the Financial Sector


What makes the financial sector even more interesting from a big data standpoint is the constant stream of new regulations and reporting standards that bring new data sources and more complex metrics into financial systems. ... The ForEx markets, as mentioned earlier, trade 24 hours per day, from morning in Sydney to evening in New York, except for a small window during the weekend. Additionally, algorithmic trading has been used in the financial markets for a long time in one form or another. The NYSE introduced its Designated Order Turnaround (DOT) system in the early 1970s for routing orders to trading desks, where the orders were executed manually. Now, algorithmic trading systems break very large orders into smaller pieces that are executed automatically based on time, price, and volume, optimized for market parameters.
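The order-splitting described above can be sketched simply. A minimal time-sliced ("TWAP"-style) illustration: a large parent order is broken into equal child orders spread across a trading window. Purely illustrative; real execution systems also adapt slice sizes to price and traded volume, as the excerpt notes.

```python
# Time-sliced order execution: break a large parent order into equal
# child orders over a trading window, one per interval.

def slice_order(total_shares: int, window_minutes: int, interval_minutes: int) -> list:
    """Return the child-order sizes for each interval in the window."""
    n_slices = window_minutes // interval_minutes
    base = total_shares // n_slices
    remainder = total_shares - base * n_slices
    # Spread any remainder across the earliest slices so sizes stay near-equal.
    return [base + (1 if i < remainder else 0) for i in range(n_slices)]

child_orders = slice_order(total_shares=100_000, window_minutes=60, interval_minutes=5)
print(len(child_orders), sum(child_orders))  # 12 100000
```

Each child order is small enough to execute without moving the market the way the full 100,000-share parent order would.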





Quote for the day:

"Defragmenting data silos is key for accelerating research."  -- Joerg Kurt Wegner


Daily Tech Digest - October 21, 2017

HPE bets on hybrid IT to combat public cloud

In a note covering the meeting, analysts at Berenberg wrote that HPE’s strategy remains focused on three key areas. The first is simplifying hybrid IT using its datacentre technologies, systems software, and private and public cloud partnerships. The second area of focus is to support the so-called intelligent edge. This encompasses its offerings from Aruba in campus and branch networking, and the industrial internet of things (IoT) with products such as Edgeline and the Universal IoT software platform, said Berenberg. The third area of focus is HPE’s advisory, professional and operational services, which Berenberg said will include consumption-based pricing. The analysts said HPE’s decision to stop selling custom-designed commodity servers to the tier-one service providers, while continuing to sell them higher-margin products, was a risky business strategy.


GDPR Requirements for US Companies


An organization may decide not to do business with EU citizens to avoid having to comply with GDPR, but even that decision must be implemented correctly. If you maintain a website that uses cookies, and it can be accessed by EU citizens, GDPR applies. GDPR also applies to organizations of all sizes. It doesn’t matter if you are a small one-person practice or a large organization with thousands of employees. If you collect or process data on EU citizens, GDPR compliance is not optional. GDPR replaces the EU Data Protection Directive of 1995, which placed responsibility only on the data controller, not processors of data. If you processed data for another company (the controller) it would be that company that had to comply with past regulations. GDPR applies to both processors and controllers – both parties are now responsible for protecting the privacy rights of EU citizens.


Data: Lifeblood of the Internet of Things

As IoT matures and we gain more confidence in the technology, we will increasingly use it on more critical applications – like self-driving cars – where errors could lead to serious injury or disruption. This requires us to ensure critical and hypercritical data is prioritised – and in turn it will drive an enormous shift in how systems capture, manage, store, secure, and process information. Analytics, for instance, will increasingly need to happen in real-time and superior analytics will become a competitive advantage. IDC estimates that by 2025, about 20 percent of all data will be critical – meaning necessary for the continuity of daily life – and nearly 10 percent of that will be hypercritical, or directly impacting the health and wellbeing of users. Not all data is equally important, but the amount of hypercritical data generated by IoT is accelerating dramatically.


GetStream.io: Why We Switched from Python to Go


Go is extremely fast. The performance is similar to that of Java or C++. For our use case, Go is typically 30 times faster than Python. ... For many applications, the programming language is simply the glue between the app and the database. The performance of the language itself usually doesn’t matter much. Stream, however, is an API provider powering the feed infrastructure for 500 companies and more than 200 million end users. We’ve been optimizing Cassandra, PostgreSQL, Redis, etc. for years, but eventually, you reach the limits of the language you’re using. Python is a great language but its performance is pretty sluggish for use cases such as serialization / deserialization, ranking and aggregation. We frequently ran into performance issues where Cassandra would take 1ms to retrieve the data and Python would spend the next 10ms turning it into objects.
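The serialization cost described above is easy to observe directly. A small sketch of the kind of workload in question, deserializing rows into Python objects; the row shape is made up for illustration, and absolute timings will vary by machine.

```python
# Measure the cost of turning serialized data back into Python objects,
# the workload the excerpt identifies as a Python bottleneck.
import json
import time

rows = [{"user": i, "score": i * 0.5} for i in range(100_000)]
payload = json.dumps(rows)  # serialized form, as it might arrive from a datastore

start = time.perf_counter()
decoded = json.loads(payload)  # deserialization: pure CPU work in the interpreter
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"decoded {len(decoded)} rows in {elapsed_ms:.1f} ms")
```

When the database round-trip itself takes a millisecond, time spent in this kind of object construction dominates the request, which is the imbalance the Stream team describes.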


Survey shows most workers misunderstand cybersecurity

The report concluded by advocating employee education programs in order to create a culture of cybersecurity. However, keeping users informed about the current threat landscape isn't the difficult part of establishing appropriate cybersecurity. That award goes to keeping them focused on their behavior so they can understand how to make the right choices. As a system administrator I'm a big proponent of automation wherever possible and this includes patching/securing systems and devices. Proper automation can apply updates and restart systems automatically without establishing a reliance on the end users, run in the background in the form of anti-spam filters and antimalware software, or enforce policies intended to protect devices such as by mandating passwords or utilizing encryption.


How AI can help you stay ahead of cybersecurity threats

artificial intelligence face on top of computer grid
Barclays Africa is beginning to use AI and machine learning to both detect cybersecurity threats and respond to them. “There are powerful tools available, but one must know how to incorporate them into the broader cybersecurity strategy,” says Kirsten Davies, group CSO at Barclays Africa. ... AI and machine learning also lets her deploy her people for the most valuable human-led tasks. “There is an enormous shortage of the critical skills that we need globally,” she says. “We've been aware of that coming for quite some time, and boy, is it ever upon us right now. We cannot continue to do things in a manual way.” ... San Jose-based engineering services company Cadence Design Systems, Inc., continually monitors threats to defend its intellectual property. Between 250 and 500 gigabits of security-related data flows in daily from more than 30,000 endpoint devices and 8,200 users.


Data Science, IT And Your Digital Transformation

As data science becomes more commonplace in enterprise environments, businesses are gaining a better understanding of how to deliver the experiences customers crave. Even so, when my company, DataScience.com, commissioned Forrester to survey more than 200 businesses last year, it found that only 22% were leveraging big data well enough to get ahead of their competition. Standing in the way of these companies on the road to a digital-first approach are challenges inherent to large-scale data management, governance and access. If handled incorrectly, data management and infrastructure problems can cripple a digital transformation; however, a great data management strategy will clear the way for data science success. Imagine a scenario in which your data science team predicts, with a high level of accuracy, how much money your customers are likely to spend with your business in the next three months and then delivers that information to you in the dashboard of a tool you already use.


Apache Kafka and the four challenges of production machine learning systems

The nature of machine learning and its core difference from traditional analytics is that it allows the automation of decision-making. Traditional reporting and analytics is generally an input for a human who would ultimately make and carry out the resulting decision manually. Machine learning is aimed at automatically producing an optimal decision. The good news is that by taking the human out of the loop, many more decisions can be made: instead of making one global decision (that is often all humans have time for), automated decision-making allows making decisions continuously and dynamically in a personalized way as part of every customer experience on things that are far too messy for traditional, manually specified business rules. The challenge this presents, though, is that it generally demands directly integrating sophisticated prediction algorithms into production software systems.
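The pattern described above, a prediction integrated directly into the production path so every event becomes an automated decision, can be sketched in a few lines. A plain generator stands in for a Kafka consumer here, and all names (event fields, thresholds) are made up for illustration.

```python
# Automated decision-making on a stream of events: score each event with a
# model and act on the result, with no human in the loop.

def event_stream():
    """Stand-in for a Kafka consumer yielding customer events."""
    yield {"customer": "c1", "basket_value": 120.0}
    yield {"customer": "c2", "basket_value": 15.0}

def score(event: dict) -> float:
    """Stand-in for a trained model's predict(); here a trivial rule."""
    return min(event["basket_value"] / 100.0, 1.0)

def decide(event: dict) -> str:
    """Turn a score into a concrete, automated action."""
    return "offer_discount" if score(event) > 0.5 else "no_action"

decisions = [(e["customer"], decide(e)) for e in event_stream()]
print(decisions)  # [('c1', 'offer_discount'), ('c2', 'no_action')]
```

The engineering challenge the excerpt points to lives in the stand-ins: in production, `event_stream` is a real consumer that must keep up with load, and `score` is a genuine model whose latency, versioning, and failure modes now sit directly in the request path.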


Machine Learning And The Future of Finance

Some brokerage and banking functions can be streamlined as investment advisers are cued to sales opportunities by software that considers customers’ past trades and investment preferences. Credit verification already uses a high degree of automation and could be made even more accurate by considering more complex patterns of the borrower and adjusting the weights to these factors based on changes seen in customer behaviour. All these software tools are used within the world of human activities that have consequences for individuals and whose rules are set by humans. For the immediate future, humans will need to give oversight to computer activities and continue to handle the exceptional situations. This is especially true since the reasons a program makes a decision are very different from the heuristics used by humans.


5 Pitfalls Of Self-Service BI

To get value out of BI tools, business units need to feed them data. In general, this means business units stand up and manage their own data marts — subsets of data warehouses that contain data specific to a business line. Because individual business units are typically responsible for all the hardware, software, and data that comprise their data marts in a self-service BI environment, those business units will inevitably create their own definitions and metrics. That's not such a big problem if that business unit is the only user of the data, but it becomes a large problem when trying to compare reports from different business units. "You've gone from a central model where there was tight control of the business metrics, and you've put that in the hands of the masses and it creates conflicting definitions," says Mariani. Mariani notes that during his tenure at Yahoo, the company's business units had myriad definitions of ad impressions and visits.



Quote for the day:


"When human judgment and big data intersect there are some funny things that happen." -- Nate Silver


Daily Tech Digest - October 16, 2017

Blockchain can fix the sorry state of the real estate industry
A number of blockchain startups are working on tokenizing real estate ownership to overcome these challenges and open real estate investment to more people. An example is BitProperty, a platform that enables property owners to register their property on the blockchain and issue tokens, digital currencies that represent a share of their property. When a person wants to invest in the property, they can purchase any number of its corresponding tokens on BitProperty. Contractors and construction companies can use BitProperty to raise funds for their projects by launching initial coin offerings (ICO). Anyone who wants to invest in the project can purchase the project’s tokens. In return, they’ll have a proportional share of the value and revenue of the finished project in the future.


For a time, Linux OS computers were expected to become the dominant third player in the PC market, Huang said. For example, many thought Linux would be the ideal OS for netbooks, which peaked in popularity in the late 2000s/early 2010s. But Microsoft captured that market by “bottom-ending” Windows to run on the low-powered portables. There’s still talk that Linux could emerge as the dominant OS for thin clients. But IDC’s data doesn’t support that belief. Linux comprised a 3% share of global PC shipments in 2013, but it’s held steady at a mere 1% since 2015 and is expected to stay at 1% through 2021. By comparison, Chrome OS has risen from 1% of the market in 2013 to its current position of 5.5% in 2017, and IDC expects it to reach 8% by 2021.



Cybersecurity: into the data breach
The vulnerabilities stakeholders face include cybersecurity, data privacy, data breaches, and payments fraud. The utmost vigilance is required to protect organisations against cyberattacks and all stakeholders, including regulators, must be more proactive regarding cybersecurity, with ownership of the issue taken to prevent attacks. In the new payments ecosystem, third-party developers can directly interact with a partner bank’s customers, raising questions about data privacy and security. In an increasingly networked ecosystem, identifying the source of attack will be a challenge. Verizon’s 2017 Data Breach Investigations Report found that security incidents and data breaches affect both large and small financial organisations almost equally. However, the security of larger banks is difficult to compromise as they invest more in cybersecurity solutions. Smaller banks, which do not have the same access to resources, are more prone to cyberattacks.


A soon-to-be-published study shows how the traditional corporate human resources operation actually hampers cybersecurity hiring against a backdrop of the industry's well-documented talent gap. The Jane Bond Project report, commissioned by security talent recruiting firm CyberSN, found that in addition to the lack of available talent for those positions, respondents say their HR generalists are not equipped to recruit and hire cybersecurity talent, and that flawed salary data complicates their ability to issue the best job offers. More than 80% of the 83 cybersecurity positions studied in the report ended up with compensation offers higher than the salary caps stated in the original job descriptions. Half of the 52 organizations participating in the study say they had to up the compensation offers to seal the deal. The positions in the study include security engineers, product sales engineers, incident response analysts, SOC analysts, and product security experts.



Obviously, no company is going to replace all of its hardware overnight; doing so would entail considerable expense, along with implementation and architecture challenges that, until resolved, could disrupt company operations. There would also be plenty of non-technical issues: employees who know device X and network OS Y like the palms of their hands may not look forward to the time it takes to learn new technology and processes. And when a company decides to transform to a software-defined networking infrastructure, it may not get support from its existing network hardware vendor, which may be enjoying hefty margins on hardware sales and be less than thrilled to push a technology that makes its expensive boxes replaceable by cheap, vendor-agnostic white boxes.


According to Badman, Extreme's Automated Campus initiative shows great promise due, in part, to 802.1aq shortest path bridging, which supplants routing protocols such as Border Gateway Protocol (BGP), MPLS and Open Shortest Path First (OSPF), thereby reducing complexity. The new network fabric also includes hypersegmentation to contain security breaches, APIs to increase interoperability, and user and device policies that drive automated network changes in conjunction with analytics and changes on the edge. Badman said he views Avaya as one of the leaders of software-defined networking fabrics, adding that Extreme has succeeded in integrating Avaya fabrics since it acquired the vendor. "I'm of the opinion that some vendors are trying to figure out how to proceed with network-wide fabric methods, while painting beta-grade efforts up with glitz and catchy slogans. This just isn't the case for Extreme," he wrote.


Will the Internet of Things rewrite the rules on cyber security?
Despite having capable teams of programmers and rigorous testing procedures, many companies – be they retailers, manufacturers, or service providers – still have a hard time seeing the potential vulnerabilities in their own systems. “There are a lot of companies who think ‘this will never happen’ and then they come back to us six months later saying ‘it happened’,” says Kupev. The challenge, he explains, is being able to look at things from a different point of view. “Often a client’s view of things can be quite narrow because they’re used to looking at things from the same perspective,” he adds. “Our job is to help them look at matters from a different angle and uncover vulnerabilities they would have otherwise missed.” To illustrate his point, Kupev tells the story of an engine maker that invested heavily in ensuring a device’s “regular” communications systems are secure.


The ability to decrypt packets can be used to decrypt TCP SYN packets, allowing an adversary to obtain the TCP sequence numbers of a connection and hijack it. As a result, even though WPA2 is used, the adversary can perform one of the most common attacks against open Wi-Fi networks: injecting malicious data into unencrypted HTTP connections. For example, an attacker can abuse this to inject ransomware or malware into websites that the victim is visiting. If the victim uses the WPA-TKIP or GCMP encryption protocol instead of AES-CCMP, the impact is especially catastrophic. Against these encryption protocols, nonce reuse enables an adversary not only to decrypt packets, but also to forge and inject them. Moreover, because GCMP uses the same authentication key in both communication directions, and this key can be recovered if nonces are reused, it is especially affected.
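To see why nonce reuse is so damaging, consider a stream-cipher-style construction in the abstract. The sketch below is purely illustrative (it is not the actual KRACK attack, and the plaintexts and keystream are made up): when the same keystream encrypts two packets, XORing the two ciphertexts cancels the keystream entirely, so knowing one plaintext reveals the other.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# The same keystream mistakenly reused for two packets (the core of
# a nonce-reuse flaw in a CTR-style mode).
keystream = os.urandom(24)

p1 = b"GET /index.html HTTP/1.1"  # plaintext the attacker can guess
p2 = b"password=hunter2!secret9"  # plaintext the attacker wants

c1 = xor(p1, keystream)
c2 = xor(p2, keystream)

# c1 XOR c2 == p1 XOR p2: the keystream cancels out completely, so
# XORing in the known plaintext recovers the unknown one.
recovered = xor(xor(c1, c2), p1)
print(recovered)  # -> b'password=hunter2!secret9'
```

No key recovery is needed at any point; the reuse alone is enough, which is why protocol-level countermeasures (never repeating a nonce) are non-negotiable.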


Metered DDoS pricing used to be more common, said Theresa Abbamondi, director of product management for Arbor Cloud and Services at Arbor Networks, Inc. That created a risk for customers, she said. Arbor has priced based on clean traffic since it launched its service four years ago, one of the first vendors to do so. "Most of the purpose-built anti-DDoS vendors quickly moved to this type of clean traffic pricing model, and it became the standard in the high end of the market," she said. "Among vendors like Cloudflare, who sell DDoS as an add-on service to a customer base more interested in the vendor’s core offerings, it’s still common today to see vendors limiting the total bandwidth of traffic they will scrub, blackholing traffic that exceeds that threshold, or hitting the customer with exorbitant, hidden fees," said Abbamondi.


New derived credential technology eliminates the need for a physical card by placing verified identity credentials directly and securely onto the mobile device, much as mobile-pay systems do away with the need to make payments using a plastic credit card. This technology offers the added benefits of making identity verification more convenient, and preventing unauthorized logins. But derived credentials and authentication tools such as biometrics offer only a one-time, “snapshot” form of user verification. Once the user has passed the initial test and gained access, the device and everything on it become fully available for viewing and use. Behavioral analytics promises to change this paradigm.  ... tools designed to capture how a device is used can provide the equivalent of a continuously-authenticating security “video,” to detect interlopers, transaction by transaction.



Quote for the day:



"Anger is the feeling that makes your mouth work faster than your mind." -- Evan Esar


Daily Tech Digest - October 15, 2017

Data governance stumble

Data governance involves data quality, ownership and security, metadata, and analytics processes. In most organizations, the word “governance” tends to throw off staff, who can become confused with what data governance entails in the organization and what their specific role is. In order to clear up the role of data governance in the business, it should be defined more in terms of data quality and how higher quality data can advance the efficiency of the business. High data quality should be the fundamental aim for any data governance campaign, and it should be the key area of focus. In fact, research by Gartner showed that poor data quality cost organizations an average of $8 million a year.
Challenges to the proper implementation of data governance

Many businesses fall prey to spending too much time on defining a data governance model, such that they end up hindering their organization from becoming data driven.


How to shape your customer experience with big data


It’s clear that the volume of data is increasing, and it shows no signs of slowing down. As such, brands will have to move beyond using data merely to create reports for internal stakeholders; they need to start leveraging cross-departmental data to deliver insights and better shape the customer experience. This comes on the back of companies undergoing digital transformation in Asia Pacific, where the first wave of change was migration to the cloud and the expansion of digital capabilities. With the right tools, the next wave will see marketers using both public and owned data to drive measurable outcomes and enhance customer experience through personalisation. It all sounds great, but where can companies start? Here are some big brands that have successfully embraced data analytics to fuel efficiency and growth.


EA - Why You Should Think About The Enterprise Continuum!

We were really thinking hard about whether TOGAF should stay in the Information Technology (IT) space versus the “enterprise” space, with most of us thinking we needed to go beyond IT architecture. When we agreed to proceed in the enterprise direction, many things emerged as important, including thinking about services – not just IT services, but business-oriented services. We thought a lot about building blocks. As a matter of fact, if you got hold of an old version of TOGAF, you would see interesting treatment of building blocks and services, and how they would be used in the Architecture Development Method (ADM). Of course, the subject of building blocks generated a need to distinguish between architecture building blocks and solution building blocks, and their relationship. Additional discussion uncovered the observation that there were common problems across enterprises addressed by different architectures and solutions.


Innovation, Tradition, And Striking the Balance


My son turns eleven today. We are all set to celebrate as we always do – our kids love the traditions that come with birthdays, Christmas, Thanksgiving, college football, and too many other events to mention. The house is decorated exactly the same for every birthday. I’m told they love it that way. There will be a special dinner, as always. All this tradition and consistency got me thinking. My children certainly love new things and surprises: new adventures, trips to unknown places, crazy experiences. And still, for a handful of personal milestones, they seem to want- to need- something familiar and dependable. Certainly, that is to be expected. New experiences bring excitement, anticipation of something unknown, and the possibility of “total awesomeness” (which, I have to imagine, is what the kids are saying nowadays.) Those traditions, the patterns sought out by their own brains, bring them a sense of stability, safety, and comfort.


Why Marketing Needs AI

The largest costs in marketing are human-related, from people who create content at scale to those running advertising programs. These costs scale upwards at a disproportionate rate to the impact delivered; adding more marketers scales at best linearly, because humans only have 24 hours in a day and do any one task relatively slowly. Compare that with the capabilities of machine learning and artificial intelligence. If I have an analysis problem to solve and sufficient cloud computing infrastructure, instead of having one computer work on the problem, I simply “hire” thousands of temporary computers to instantly complete the job. Once done, those computers move on to other tasks. I could never hire thousands of people in a second and lay them off seconds later – but I can with machines. If all the tasks in marketing were ideally suited for the ways humans work, this solution wouldn’t be much of a solution at all.


Fintech: Too large to ignore, too complex to regulate


Technology-neutral regulation refers to a specific regulatory process under which rules and regulations prevent service providers from preferring one type of technology over another in offering their services, although some experts, such as Professor Matthias Lehmann of the University of Bonn, find the definition of technological neutrality to be ambiguous. While innovation used to be regarded more positively before the 2008 financial crisis, according to Arner, Patrick Armstrong, Senior Risk Analysis Officer on the Innovation and Products Team at the European Securities and Markets Authority (ESMA), pointed out that "regulations are there as a response to the market failure from 10 years ago".  He argued that, in dealing with fintech, regulators act differently, very much depending on the technology involved and the risk that it carries.


Fintech Malaysia Report 2017

Malaysia’s regulators have in recent years taken an open but cautious approach to regulating fintech. Since the appointment of Tan Sri Muhammad Ibrahim as the new Governor of the Central Bank of Malaysia in 2016, we’ve seen several key reforms and regulations introduced, most notably the announcement of Malaysia’s fintech regulatory sandbox. The Fintech Sandbox is open to all fintech companies, including those without a presence in Malaysia; however, the prerequisite is that the company must have a genuinely innovative solution that fills a gap in the market. They are not required to work with a bank, although Bank Negara Malaysia encourages it. Once approved into the sandbox, a fintech company has a 12-month testing period.


Rational Agents for Artificial Intelligence


The path you take will depend on the goals of your AI and how well you understand the complexity and feasibility of various approaches. In this article we will discuss the approach that is considered more feasible and general for scientific development, i.e. the study of the design of rational/intelligent agents. ... There are four types of agents in general, varying in the level of intelligence or the complexity of the tasks they are able to perform. All the types can improve their performance and generate better actions over time. These can be generalized as learning agents. ... As agents get more complex, so does their internal structure, and the way in which they store internal state changes. By its nature, a simple reflex agent does not need to store a state, but other types do.
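The difference between a stateless and a stateful agent can be made concrete with the classic two-square vacuum world. The sketch below is illustrative (the vacuum environment, percept format, and action names are assumptions, not from a specific library): the simple reflex agent maps the current percept directly to an action, while the model-based agent keeps internal state about what it has already cleaned.

```python
def simple_reflex_vacuum(percept):
    """No internal state: the action depends only on the current percept."""
    location, is_dirty = percept
    if is_dirty:
        return "Suck"
    return "Right" if location == "A" else "Left"

class ModelBasedVacuum:
    """Keeps a model of the world: which squares have been cleaned."""
    def __init__(self):
        self.cleaned = set()  # internal state built up over time

    def act(self, percept):
        location, is_dirty = percept
        if is_dirty:
            self.cleaned.add(location)
            return "Suck"
        self.cleaned.add(location)  # a clean square is also "known clean"
        if self.cleaned >= {"A", "B"}:
            return "NoOp"  # the model says everything is clean: stop
        return "Right" if location == "A" else "Left"

print(simple_reflex_vacuum(("A", True)))  # -> Suck

agent = ModelBasedVacuum()
print(agent.act(("A", True)))   # -> Suck
print(agent.act(("B", True)))   # -> Suck
print(agent.act(("B", False)))  # -> NoOp (model knows both squares are clean)
```

The reflex agent would keep shuttling between squares forever once they are clean; the model-based agent can recognise that the goal is achieved, which is exactly the benefit of storing internal state.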


Can a blockchain tech revolutionize corporate deposits?

“What’s been happening over the last five years in the banking industry is banks have been reviewing customers and are looking more closely at the profitability of each client,” Aidoo said. “As a result, banks may turn away less profitable clients.” Yet Aidoo believes there is a solution: the Utility Settlement Coin, which relies on blockchain technology. ... The Utility Settlement Coin is a smart contract that is held at a central bank as collateralized cash. It lets banks accept deposits from corporations and turn some of them into settlement coins, which are really balances held at a central bank. Say there are five casinos in a small area and a gambler buys chips with U.S. dollars at a cashier window at one of them. If the casinos were all using Utility Settlement Coin, the gambler could go to any of the casinos with the same chips and they would honor them, letting the gambler exchange them for cash.


6 Industries That Could Be Forever Changed by Blockchain


The future is about to change with blockchain. A blockchain is essentially a continuously growing list of records, called blocks, which are linked and secured using cryptography. The first work on a cryptographically secured chain of blocks was described in 1991 by Stuart Haber and W. Scott Stornetta. While blockchain is still fairly new to most consumers, experts are beginning to understand that banking and payments aren't the only industries that could be affected by blockchain technology. Other industries could also be affected by this new phenomenon in the future. With every paradigm shift, there are winners and losers, and just as the internet disrupted the way we communicate, blockchain will disrupt a number of industries. The world's crypto-currency market is worth more than 100 billion dollars. Startups are already using blockchain to push transparency and trustworthiness within the digital information ecosystem.



Quote for the day:


"You grow up the day you have your first real laugh at yourself." -- Ethel Barrymore


Daily Tech Digest - October 14, 2017

Alibaba Aims to “Master the Laws” of AI and Put Virtual Helpers Everywhere

Jack Ma, Alibaba’s executive chairman, announced his decision to establish the Alibaba DAMO Academy (DAMO stands for Discovery, Adventure, Momentum, and Outlook) on the first day of the company’s 2017 Computing Conference, which opened on Wednesday. Ma said the academy will do research aimed at “solving problems” related to the Internet of things, fintech, quantum computing, and AI. It will open seven research labs in China, the U.S., Russia, Israel, and Singapore.  Chinese tech companies are increasingly looking to invest in cutting-edge research, especially artificial intelligence. Alibaba’s future has never been so closely intertwined with original research. The company already has more than 25,000 engineers working on applying AI advances to consumer products and cloud computing services.


Reframing growth strategy in a digital economy


Traditional strategic planning is important as a means of understanding the world of today. However, if you aspire to a strategy that will enable your company to achieve disproportionate growth and create competitive advantage, you need to push beyond pure analysis. What if the razor industry, dominated by giants Gillette and Schick, had looked beyond known competitors to anticipate the value in a direct-to-consumer subscription service? Would the e-commerce razor delivery company Dollar Shave Club have had such a meteoric rise? And would powerhouse Unilever, which acquired the startup in 2016, have expanded as meaningfully into the shaving business? Successful digital strategy requires a blend of deductive analysis and the type of inductive reasoning that powers the creative leaps that anticipate and open fundamentally new markets.


How Python rose to the top of the data science world

There are many interesting libraries being developed for Python. As a data scientist or machine learning practitioner, I’d be tempted to highlight the well-maintained tools from Python's core scientific stack. For example, NumPy and SciPy are efficient libraries for working with data arrays and scientific computing. When it comes to serious data wrangling, I use the versatile Pandas package. Pandas is an open source library that provides fast and simplified data manipulation and data analysis tools for the Python programming language. It focuses on providing realistic and high-end data analysis in Python. I’d also recommend Matplotlib for data visualization, and Seaborn for additional plotting capabilities and more specialized plots. And Scikit-learn is a great tool for general machine learning, which provides efficient tools for data mining and analysis.
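A tiny example gives a taste of how these pieces fit together (the dataset here is made up for illustration): NumPy supplies the array, and Pandas supplies the labelled data structures plus the groupby/aggregation machinery that makes wrangling concise.

```python
import numpy as np
import pandas as pd

# A small made-up dataset: temperature readings per city.
temps = np.array([3.0, 5.0, 11.0, 13.0, 12.0])
df = pd.DataFrame({
    "city": ["Oslo", "Oslo", "Paris", "Paris", "Paris"],
    "temp_c": temps,
})

# One line of Pandas replaces an explicit loop-and-accumulate:
means = df.groupby("city")["temp_c"].mean()
print(means["Oslo"])   # -> 4.0
print(means["Paris"])  # -> 12.0
```

From here, a call like `df.plot()` (backed by Matplotlib) or a Seaborn plotting function would visualise the same frame with a line or two more, and a Scikit-learn estimator could be fit directly on the DataFrame's columns.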



The elastic edge of the new age network

Corporate boundaries now extend far beyond the four walls of the enterprise, and are becoming more fluid and elastic every day. They now incorporate mobile workers, vehicles, pop-up and temporary networks, kiosks, cameras and sensors, to name just a few. Organisations need a WAN with an Elastic Edge – one that effortlessly expands, contracts and stretches to connect any new endpoint, wherever that may be. The traditional branch-centric WAN was never designed to cope with these demands. Highly complex and inflexible, a traditional WAN often inhibits business agility. Increasingly, organisations are looking to a Network-as-a-Service (NaaS) model, where connections can be spun up and down, whenever and wherever they are needed. In this model, organisations can move away from a traditional build-your-own, fixed location network, to a model that can be designed to individual requirements and billed on a ‘pay for what you use’ basis.


5 Misconceptions about Shadow IT

While you might think traditional firewalls are inspecting traffic thoroughly, legacy security tools don't offer the kind of visibility and control organizations need when sending data back and forth to the cloud. For example, cloud resources are accessible by any user, anywhere; an on-premises firewall cannot register that exchange of information. What's needed are advanced security tools, like those available in a Next-Generation Security Platform. These have the ability to inspect every data packet coming into and out of your virtual environment, and to apply security and access policies consistently across physical and cloud resources. Products like App-ID from Palo Alto Networks identify all traffic and apps, so the IT team enjoys end-to-end visibility. In the Tech Pro Research survey, only 47% said they use next-generation firewalls for SaaS access.



How Will AI in FinTech Benefit Consumers?


Areas of the financial industry depend on sets of complex rules, and humans have to manually review huge volumes of data. Human error in this process is inevitable. In some cases, the false positive rate of risk detection – the rate at which legitimate activity is mistakenly flagged as fraud – is as high as 60%. As financial services become more complex, and more people gain access to these services, the amount of data banks are dealing with is increasing, and the issue of human error will only become more critical. This is why smart systems can be so useful for consumers. They can increase the accuracy with which fraud is detected, and ensure that cases of fraud that are currently being missed are captured. In the credit card space, for example, MasterCard traces card usage and endpoint access.


API-Driven Development With OpenAPI And Swagger

Microservices and public APIs grew from the roots of service-oriented architecture (SOA) and software-as-a-service (SaaS). Although SOA has been a trend for many years, widespread adoption has been hamstrung by SOA's complexity and overhead. The industry has settled on RESTful APIs as the de facto standard, providing just enough structure and convention with more real-world flexibility. With REST as the backdrop, we can create formal API definitions that retain human readability, and developers can create tooling around those definitions. In general, REST is a convention for mapping resources to HTTP paths and their associated actions. You've likely seen these as HTTP GET and POST methods. What's key is to use HTTP itself as the standard, and layer conventional mappings on top of that for predictability.
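The resource-to-path convention can be sketched in a few lines. The toy router below is illustrative only (it is not a real web framework, and the `/users` resource is a made-up example): collection paths answer GET (list) and POST (create), item paths answer GET (fetch one), and anything else is rejected.

```python
# In-memory "database" for the toy /users resource.
users = {1: {"name": "Ada"}}

def handle(method, path, body=None):
    """Map (HTTP method, path) pairs to actions, REST-style."""
    if path == "/users" and method == "GET":
        return 200, list(users.values())       # list the collection
    if path == "/users" and method == "POST":
        uid = max(users, default=0) + 1
        users[uid] = body
        return 201, {"id": uid, **body}        # create a new resource
    if path.startswith("/users/") and method == "GET":
        uid = int(path.rsplit("/", 1)[1])
        return (200, users[uid]) if uid in users else (404, None)
    return 405, None                           # method not allowed

print(handle("GET", "/users"))                      # -> (200, [{'name': 'Ada'}])
print(handle("POST", "/users", {"name": "Grace"}))  # -> (201, {'id': 2, 'name': 'Grace'})
```

An OpenAPI document captures exactly this mapping declaratively (paths, methods, request/response schemas), which is what lets Swagger-style tooling generate clients, servers, and documentation from one source of truth.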


At the intersection of technology and private equity


Private equity firms are attracted to cloud-based SaaS delivery models, which offer recurring revenue streams via continuing renewals of an existing customer base. Despite the lack of hard assets to lend against, the predictability of subscription-based revenue models offer private equity firms a reason to invest, to hold those investments for longer periods of time and provide a cushion from inflated valuations. All of which is not to say that there’s not the threat of fracture to this new private equity-tech partnership. Naysayers argue that funds are paying an egregiously unhealthy EBITDA multiple, and, in doing so, are creating a new bubble. In addition, for the private equity-tech relationship to work in the long-term, funds must be willing to assert unusual control over their investment targets, continue to invest in operationally sound playbooks and must help their portfolio companies nail the subscription model, which requires higher upfront costs.


Using Machine Learning To Multiply Your Digital Marketing ROI

Finding the right marketing messaging for all customer segments is a highly challenging and highly rewarding endeavor. Machine learning pilot programs are providing leading banks with a competitive edge while helping marketers better understand what types of marketing messages are preferred by various audiences. The latest digital advertising platforms and content management tools are making machine learning accessible and affordable to non-technical marketing teams in a variety of industries. With the adoption of these advanced marketing technologies, we are moving into a world in which the marketing asset most visible to digital marketers, the homepage marquee banner, will deliver more personalized messaging, better marketing results, and an overall better experience to all prospects and customers.


IIA research identifies barriers to business intelligence and analytics adoption

While data preparation capabilities are critical to BI and advanced analytics adoption, the “softer” areas of culture, leadership and skills are also key. Even organizations that are strong in these areas say that they see significant challenges with things like innovation, creativity and leadership. As the report reveals, weak adopters of both BI and advanced analytics should see opportunities for executive support. Without a strong vision and buy-in at the executive level, resulting initiatives will naturally fail or underperform. To improve adoption, leaders need to openly demonstrate and quantify the value or metrics of success from these initiatives. As organizations pursue the path from BI to an advanced analytics continuum, the skills and competencies required from data scientists are broader than those of BI/reporting staff or business analysts.




Quote for the day:


"You don't have to know how you are going to accomplish your goals, you just have to know that you will." -- Mike Basevic


Daily Tech Digest - October 13, 2017

Digital banking priority: Make it personal

“We’re seeing a pretty big demand from banks for acquiring new data sources, investing in data and analytics tools and data-related services,” said Nilesh Vaidya, senior vice president at the technology and consulting firm Capgemini. “They’re trying to know their customers better, better identify customer segments and offer more customized products.” Vaidya said that the industry is still in the early stages of this evolution and that much of it is being done by top-tier institutions with larger IT budgets and resources — and it will take time before customers get an Amazon or Netflix-type experience from banks. “There’s a long way to go, but it is something that’s being driven not only [by technology people] but marketing departments and others that want improvements in how they target customers,” he said.


15 Essential Project Management Tools

Top-level project managers are in high demand, thanks to the high-level leadership, knowledge and capabilities they bring to bear on vital business projects. But having the right set of tools is also essential to project management success. Project management tools and templates not only increase team productivity and effectiveness but also prepare the organization for changes brought about by high-impact projects. To perform at their best, project managers need to make the most of tools aimed at business intelligence and analytics, business requirements, change management and project management, as well as a wide array of forms and templates. Here we have compiled the ultimate project manager’s toolkit to help you plan, execute, monitor and successfully polish off your next high-impact project.


Here's Google's biggest secret to not failing at security

Yes, security tends to be viewed as a mundane and necessary evil, but in our world where everything connected to the internet can be hacked, it's suddenly sexy to be able to deliver real security. To better understand BeyondCorp and its implications, I sat down with Sam Srinivas, product management director in Google's Cloud Security and Privacy team. Srinivas came to Google from Juniper Networks where he was chief technologist in the Security Business Unit. He is also president of the security industry's FIDO Alliance, which is working on open standards for strong authentication. ... The fundamental idea is that access control should be identity and application-centric, not network-centric. The current model that depends on a remote access VPN connection to access applications give an all-or-none type of access that doesn't fit with the way organizations work today.


Hacking Is Inevitable So It's Time To Assume Our Data Will Be Stolen


As Telang sees it, a determined hacker is probably going to succeed, yet there’s far too little focus on limiting the damage. Credit freezes could be automatic, and wherever possible data could be aggregated to protect individual identities and private information. The types of fraud-protection services that Equifax sells to customers could be made available to victims as a default. Government intervention may be necessary, as consumers are vulnerable to the credit raters’ mistakes but have little choice but to accept their role in finance. Consumers aren’t really customers for Equifax—the company makes money from banks and credit card companies that buy data from it. US senator Elizabeth Warren has said she wants to see the consumer credit rating industry—which is more lightly regulated than banks and credit card companies


Big Data: Out of the Server Room and Into the World

Sensing a lucrative business emerging, most of the major technology companies have rushed in to create and refine new big data tools to satisfy business needs. Microsoft's Azure platform, for example, now offers a cloud-based service that aims to unify big data tools and applications for their customers. It includes tools to discover and classify data from a wide variety of data collection systems. This approach creates a data catalog, which is independent of data storage location and provides searchable, centralized access to all available business data. The end-user can then utilize the data they find in their own business application, as well as contribute new information to the set. Microsoft's hardly alone in the space, having already been joined by industry heavyweights including Oracle, I.B.M., Amazon, and SAP.


Cybersecurity Strategy, Risk Management and List Making

Frameworks are becoming the strategic tools of choice to assess risk, prioritize threats, secure investment and communicate progress for the most pressing security initiatives. They provide assessment mechanisms that enable organizations to determine their current cybersecurity capabilities, set individual goals for a target state, and establish a cybersecurity strategy for improving and maintaining security programs. Frameworks help you understand the maturity of your security activities and can adapt over time to meet the maturity level of the threats you face and the security capabilities you employ. There are various security frameworks that address different types of needs, but one of the most popular is the National Institute of Standards and Technology's (NIST) Framework for Improving Critical Infrastructure Cybersecurity.


Awareness training is key to reducing security risk

This also needs to be part of a broader top-down effort starting with senior management. Awareness training should be incorporated across all organizations and not just limited to governance, threat detection and incident response plans. The campaign should involve more than serving up a dry set of rules, divorced from the broader business reality. If done the right way, employees will come away with a keen understanding of how their cyber behavior can impact the overall business. According to the Global Cyber Security Capacity Centre, this hinges on the organization’s ability to influence attitudes as well as intentions. Unlike training, where employees are quizzed on their knowledge of instructions, the focus of awareness training should be on changing behavior. In terms of making this happen, organizations should make clear to everyone on staff that cybersecurity adherence isn’t optional any longer. It’s strategic.


SAML Explained: What It Is, What It's Used For

In order for SSO to work, a user must be able to authenticate once and receive authorization, based on his or her confirmed identity, to access multiple other computers. This can also work the other way: a single computer may provide services to users authorized on multiple other computers. The SAML standard defines how all these computers communicate with each other securely. ... A SAML assertion is the XML document by which all the information we've been discussing is transmitted from one computer to another. Once an identity provider has determined that you are who you say you are and have the right to access the content or services you're interested in, it sends a SAML assertion to the server that can actually provide those services to you. A SAML assertion may be encrypted for increased security.
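To make the assertion concrete, the sketch below builds a heavily simplified, assertion-shaped XML document. This is illustrative only: a real SAML 2.0 assertion uses the full SAML namespaces and schema, carries attribute and authentication statements, and is normally digitally signed and often encrypted, all of which this toy omits. The issuer, subject, and timestamps are made-up values.

```python
import xml.etree.ElementTree as ET

# Toy, non-conformant sketch of the three things an assertion conveys:
# who issued it, who it is about, and when it is valid.
assertion = ET.Element("Assertion")
ET.SubElement(assertion, "Issuer").text = "https://idp.example.com"

subject = ET.SubElement(assertion, "Subject")
ET.SubElement(subject, "NameID").text = "alice@example.com"

ET.SubElement(
    assertion, "Conditions",
    NotBefore="2017-10-13T12:00:00Z",     # validity window: short-lived
    NotOnOrAfter="2017-10-13T12:05:00Z",  # assertions limit replay risk
)

xml_doc = ET.tostring(assertion, encoding="unicode")
print(xml_doc)
```

The service provider's job on receipt is essentially the inverse: verify the signature, check the issuer is a trusted identity provider, check the validity window, and only then trust the subject's identity.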


What to do when SQL servers can't keep up with data demands

If anything is likely to change SQL server performance in the next few years, it will be the introduction of 5G connectivity and cloud-based systems. First, the launch of 5G will enhance remote system connection, reducing the front-line communication delays inherent in server-side operations. Cloud storage, on the other hand, will be a boon to scalability. As with cloud-based SaaS, cloud storage is regularly updated, eliminating network upgrade delays and preventing slowdowns caused by insufficient storage within the system. Additionally, many companies prefer to operate via the cloud for security and stability reasons. Even with replication throughout, onsite physical operating systems tend to be much more prone to damage or failure than cloud storage. With more companies moving to cloud-based storage systems, choosing the proper protocols will be more important than ever.


The Java Evolution of Eclipse Collections

Eclipse Collections is a drop-in replacement for the Java Collections framework. It has JDK-compatible List, Set and Map implementations with a rich API, as well as additional types not found in the JDK such as Bags, Multimaps and BiMaps. Eclipse Collections also has a full complement of primitive containers. It was developed internally at Goldman Sachs for 10 years before being open sourced in 2012 as GS Collections. In 2015, it was migrated to the Eclipse Foundation, and since then, all active development for the framework has been done under the Eclipse Collections name and repository. ... Optional is one of the most popular new features of Java 8. From the Javadoc: "A container object which may or may not contain a non-null value. If a value is present, isPresent() will return true and get() will return the value".



Quote for the day:


"The task of leadership is not to put greatness into humanity, but to elicit it, for the greatness is already there." -- John Buchan