Daily Tech Digest - November 13, 2018

Colmena, an Architecture for Highly-Scalable Web Services


Cells are self-contained services that follow the hexagonal architecture. Each cell: Has a clear purpose and responsibility; Has some internal domain that represents, validates and transforms the cell’s state; Relies on a series of interfaces to receive input from (and send output to) external services and technologies; Exposes a public API (a contract stating what it can do). You can think of cells as very small microservices. In fact, we encourage you to try to make your cells as small as possible. In our experience, granulating your domain around entities and relationships helps you understand, test and maintain the codebase in the long run. In Colmena, changes to the domain are represented as a sequence of events. This sequence of events is append-only, as events are immutable (they are facts that have already taken place). In event sourcing, this sequence is called a “Source of truth”, and it provides: An audit log of all the actions that have modified the domain; The ability for other components (in the same or different cells) to listen to certain events and react to them.
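To make the event-sourcing idea concrete, here is a minimal Java sketch (hypothetical names, not Colmena's actual API) of an append-only event log that doubles as an audit trail and lets other components subscribe and react:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// A minimal event-sourced "cell": state changes are recorded as immutable,
// append-only events, and other components can subscribe and react to them.
// (Illustrative sketch only; requires Java 16+ for records.)
final class EventLog {
    // An immutable fact that has already taken place.
    record Event(String type, String payload, Instant at) {}

    private final List<Event> events = new ArrayList<>();      // the source of truth
    private final List<Consumer<Event>> listeners = new ArrayList<>();

    void subscribe(Consumer<Event> listener) { listeners.add(listener); }

    void append(String type, String payload) {
        Event e = new Event(type, payload, Instant.now());
        events.add(e);                        // append-only: events are never mutated
        listeners.forEach(l -> l.accept(e));  // let other components react
    }

    List<Event> auditLog() { return List.copyOf(events); } // full history of changes
}
```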


How Millennials Should View the World of Data Science


So to summarize, here is what I feel MBA students (and business leaders) need to understand about the growing capabilities and power of Data Science: Data Science is a team sport that equally includes data engineers (who gather and prepare and enrich the data for advanced analytics), data scientists (who build analytic models that codify cause and effect and measure “goodness of fit”), and business stakeholders; Embrace the “Thinking Like A Data Scientist” approach in order to determine what problems to target with data science and how to apply the resulting customer, product and operational insights to derive and drive business value; Understand how to collaborate with the data science team around the Hypothesis Development Canvas that cements the relationship between the organization’s business strategy and specific AI and Machine Learning efforts; and Gain a high-level understanding of “what” advanced analytic capabilities, such as deep learning, machine learning and reinforcement learning, can do in uncovering customer, product and operational insights buried in the organization’s data.


Internet Explorer scripting engine becomes North Korean APT's favorite target in 2018

Microsoft became well aware of this component's security flaws many years ago. That's why, in July 2017, Microsoft announced that it was disabling the automatic execution of VBScript code in the latest IE version that was included with the Windows 10 Fall Creators Update, released in the fall of last year. That change meant that hackers couldn't use VBScript code to attack users via Internet Explorer in Windows 10. Microsoft also promised patches to disable VBScript code execution in IE versions on older Windows releases. That change stopped many cybercrime operations, but DarkHotel seems to have adapted to Microsoft's recent VBScript deprecation announcement. According to reports, DarkHotel apparently opted to use VBScript exploits embedded inside Office documents and did not target Internet Explorer users via the browser directly.


AMD continues server push, introduces Zen 2 architecture
As part of the news conference, AMD acknowledged that Zen 4 is “in design,” meaning still on paper. Given Zen 3 is due in 2020, don’t figure on seeing Zen 4 until 2022 or so. Beyond that, the company said only that it would offer higher performance and performance per watt when compared to prior generations. It’s been a good few weeks for AMD and Epyc. Last week, Oracle announced it would offer bare-metal instances on Epyc, and today Amazon Web Services (AWS) announced that Amazon Elastic Compute Cloud (EC2) will use Epyc CPUs as well, so customers can get access today to instances running on the AMD processors. Intel noted that it, too, has an extensive relationship with AWS. So, now AMD has deals with all of the major server vendors (HPE, Dell, Lenovo, Cisco) and almost all of the major cloud vendors. It had previously announced deals with Microsoft Azure and China’s Baidu and Tencent.


A foundational strategy pattern for analysis: MECE

MECE, pronounced "mee-see," is a tool created by the leading business strategy firm McKinsey. It stands for "mutually exclusive, collectively exhaustive," and dictates the relation of the content, but not the format, of your lists. Because of the vital importance of lists, this is one of the most useful tools you can have in your toolbox. The single most important thing you can do to improve your chances of making a winning technology strategy is to become quite good at making lists. Lists are the raw material of strategy and technology architecture. They are the building blocks, the lifeblood. They are the foundation of your strategy work. And they are everywhere. Therefore, if they are weak, your strategy will crumble. You can be a strong technologist, have a good idea, and care about it passionately. But if you aren’t practically perfect at list-making, your strategy will flounder and your efforts will fail. That’s because everything you do as you create your technology strategy starts its life as a list, and then blossoms into something else.


Many firms need more evidence of full benefits of artificial intelligence

Much of executives’ enthusiasm is justified. AI is already being deployed in a range of arenas, from digital assistants and self-driving cars to predictive analytics software providing early detection of diseases or recommending consumer goods based on shopping habits. A recent Gartner study finds that AI will generate $1.2 trillion in business value in 2018—a striking 70 percent increase over last year. According to Gartner, the number could swell to close to $4 trillion by 2022. This dramatic growth is likely reinforcing the perception among executives that such technologies can transform their respective industries. When looking at the external environment, encompassing economic, political, social, and other external developments that affect business, one-third of executives flagged positive technological disruption in their industry as a top opportunity.


Cylance researchers discover powerful new nation-state APT

The malware didn't just evade antivirus detection; it also let itself be discovered by different antivirus vendors on preprogrammed dates, likely as a distraction tactic. "What we've got here in this case is a threat actor who has figured out how to determine what antivirus is running on your system and deliberately trigger it in an attempt to distract you," Josh Lemos, vice president of research and intelligence at Cylance, says. "That should be concerning organizations outside of Pakistan." Kill switches in malware have been seen before, such as in Stuxnet, but Cylance researchers say they've rarely seen a campaign that deliberately surrenders itself to investigators in this manner. "The White Company...wanted the alarm to sound," their report concluded. "This diversion was likely to draw the target's (or investigator's) attention, time and resources to a different part of the network. Meanwhile, the White Company was free to move into another area of the network and create new problems."


Firms lack responsible exec for cyber security

According to the report, although more people see the need for regular boardroom discussions about security, their organisations are failing to raise it sufficiently at the C-suite level. While 80% of all survey respondents agree that preventing a security attack should be a regular boardroom agenda item (up from 73% a year ago), only 61% say that it already is, which represents an increase of just 5% on last year. The report also suggests this lack of cohesion at the top of the organisation means that many are struggling to secure their most important digital assets. Fewer than half (48%) of respondents globally – 53% in the UK – say they have fully secured all of their critical data. But with the General Data Protection Regulation (GDPR) now fully in effect, this is no longer optional but mandatory, the report notes. However, companies are beginning to take control of their data as cloud computing best practices mature, with 27% reporting that the majority of their organisation’s data is currently stored on premise, and 25% in datacentres.


Avoiding Business Stasis by Modernizing Ops, Architecture & More


Fear is inevitable during any modernization growth spurt. For instance, the operations team may fear that an increase in automation will lead to the loss of human expertise. Re-architecting the software may be perceived by developers as a threat to well-defined traditional team scopes and organizations. For the business owner, a poorly executed modernization takes away resources and doesn’t lead to improved agility. The concern many folks voice when they don’t know how to run or create a platform is that they don’t know what their place will be in the new organization. But what has started to become clear to those participating in our modernization effort is that their skills are being expanded — not replaced. And that enables them to take on new roles in the organization. One of the fundamental things that’s happening at StubHub is a complete change in the way we think about new ideas. The change in our stack allows us to work in any language, and because we fully expect to move beyond Java and get into Go, Ruby and Node.js, we can innovate and rethink our future in more ways than ever before.


C language update puts backward compatibility first

C is the foundation for many popular software projects such as the Linux kernel and it remains a widely used language, currently second in the Tiobe index. Its simplicity makes it a common choice for software applications that run at or close to bare metal, but developers must take extra care in C, versus higher-level languages like Python, to ensure that memory is managed correctly—easily the most common problem found in C programs. Previous revisions to the C standard added features to help with memory management—including the “Annex K” bounds-checking feature. However, one of the proposals on the table for C2x is to deprecate or remove the Annex K APIs, because their in-the-field implementations are largely incomplete, non-conformant, and non-portable. Alternative proposals include replacing these APIs with third-party bounds-checking systems like Valgrind or the Intel Pointer Checker, introducing refinements to the memory model, or adding new ways to perform bounds checking for memory objects.



Quote for the day:


"Leadership has a harder job to do than just choose sides. It must bring sides together." -- Jesse Jackson


Daily Tech Digest - November 12, 2018


Financial institutions that over time fail to utilise technology to engage effectively with increasing regulation neglect the changing environment around them. Attempting to meet the obligations set forth by regulators with manual processes makes an organisation prone to human errors and slippage in flows between key functions and departments. In effect, regtech becomes the magic ingredient that enables scalability for financial institutions in an environment of increasing regulatory requirements. At Saxo Bank, we are deploying new technologies such as machine learning and artificial intelligence to our regulatory framework, e.g. to enhance financial crime detection procedures and automatically scan through thousands of transactions. Through machine learning, the algorithm is constantly improving and finding new patterns that would be difficult (or time-consuming) to find manually. An important factor for any financial institution with regards to regtech is to collaborate with external partners and vendors. Saxo Bank’s regtech framework is built on the foundations of several external data vendors and partners whose systems and knowledge we leverage in our own offering.


Building an artificial general intelligence begins by asking 'what is intelligence?'

People often make seemingly irrational choices. When offered an early registration discount for a conference, only 67 percent of the graduate students took advantage of the offer. When told that there would be a penalty for late registration, 93 percent of the students took the offer even though the costs and the cost differences were identical in the two situations ($50 discount or $50 penalty). We can think about decisions like these as being somehow abnormal, but they are very common and, more importantly, demonstrate just how people use heuristics to achieve their intelligence. When intelligence has been studied by psychologists, the focus has generally been on identifying individual differences. Intelligence testing started with Alfred Binet and Theodore Simon’s efforts to identify French school children who might require special help. Their focus was on those factors that would allow a child to do well in school.


DevOps and Databases


When working with whole-schema source control, you usually don’t write your migration scripts directly. The deployment tools figure out what changes are needed for you by comparing the current state of the database with the idealized version in source control. This allows you to rapidly make changes to the database and see the results. When using this type of tool, I rarely alter the database directly and instead allow the tooling to do most of the work. Occasionally the tooling isn’t enough, even with pre- and post-deployment scripts. In those cases, the generated migration script will have to be hand-modified by a database developer or DBA, which can break your continuous deployment scheme. This usually happens when there are major changes to a table’s structure, as the generated migration script can be inefficient in these cases. Another advantage of whole-schema source control is that it supports code analysis. For example, if you alter the name of a column but forget to change it in a view, SSDT will return a compile error.


Diligent Engine: A Modern Cross-Platform Low-Level Graphics Library


The next-generation APIs, Direct3D12 by Microsoft and Vulkan by Khronos, are relatively new and have only started getting widespread adoption and support from hardware vendors, while Direct3D11 and OpenGL are still considered industry standard. New APIs can provide substantial performance and functional improvements, but may not be supported by older platforms. An application targeting a wide range of platforms has to support Direct3D11 and OpenGL. New APIs will not give any advantage when used with old paradigms. It is totally possible to add Direct3D12 support to an existing renderer by implementing the Direct3D11 interface through Direct3D12, but this will give zero benefits. Instead, new approaches and rendering architectures that leverage the flexibility provided by the next-generation APIs are expected to be developed. There exist at least four APIs (Direct3D11, Direct3D12, OpenGL/GLES, and Vulkan, plus Apple's Metal for iOS and macOS platforms) that a cross-platform 3D application may need to support.


The Amazing Ways Google And Grammarly Use AI To Improve Our Writing


Just like with other machine learning algorithms, Grammarly's artificial intelligence system was originally provided with a lot of high-quality training data to teach the algorithm by showing it examples of what proper grammar looks like. This text corpus—a huge compilation that human researchers organized and labeled so the AI could understand it—showed not only proper uses of punctuation, grammar and spelling, but also incorrect applications, so the machine could learn the difference. In addition, Grammarly’s system uses natural language processing to analyze every nuance of language, down to the character level and all the way up to words and full paragraphs of text. The feedback the system gets when humans ignore a proposed suggestion helps the system get smarter, and gives the human linguists working with the machines input on how to make the system better. The more text it is exposed to, the better it can make appropriate suggestions. That's one of the reasons the company switched from targeting enterprise customers to a consumer service in 2010, so it would have access to a larger data set and a more significant opportunity.


RPA and its expansion into AI: Driving a new era of business and IT alignment

All businesses have some form of data pipeline feeding their supply chains and warehouses. They are designed to try to provide 100% of the data needed on a regular basis. While it’s usually adequate for reporting, it’s not a complete enough data set for analysis and insight generation. There is always a ‘last mile’ of supplementary analysis required to capture a specific piece of insight. This augments the data set with data to support root cause analysis of challenges such as month-end close, for example. RPA can be used to support that last mile of extraction, providing the aggregation and data preparation to support the dynamic needs of reporting, without having to wait for corporate IT to extend the data pipelines. This, in turn, enables us to predict and do things that have historically been difficult for humans. We struggle to predict because we can’t deal with the huge volumes of data. We struggle to narrate large volumes of data that cover a multitude of divisions or departments.


The state of ICS and IIoT security in 2019


Industrial control systems (ICS) are designed to operate and support critical infrastructure. They are used heavily in industrial areas such as energy and utilities, oil and gas, pharmaceutical and chemical production, food and beverage, and manufacturing. Attacks on such systems can cause major damage. The 2015 hack of Ukraine’s power grid caused a blackout that affected over 200,000 people. Whether ransomware, botnets, cryptominers, or something more destructive, malware targeting such systems continues to proliferate. According to Kaspersky Lab, over 40 percent of the ICS computers it monitors were attacked by malicious software at least once during the first half of 2018. ... “The data clearly shows that industrial control systems continue to be soft targets for adversaries,” said the report. “Many sites are exposed to the public internet and trivial to traverse using simple vulnerabilities like plain-text passwords. Lack of even basic protections like automatically updated anti-virus enables attackers to quietly perform reconnaissance before sabotaging physical processes such as assembly lines, mixing tanks, and blast furnaces.”


James Bach on a Career in Testing and Advice for New Players

We need to assess the value of testing, and that assessment is the process of observing people, talking to people, and essentially testing the test process. We need to help our clients understand our own testing and why it is valuable. That’s where the word “legibility” comes in. Legibility means the ability for something to be read. Handwriting is an obvious example of something that we speak of as being legible or illegible. But you can apply the concept of legibility to more than just handwriting. You can apply it to any process or system. A system is legible if you can look at it and tell what is going on with it. After 27 years of marriage, my wife’s moods are highly legible to me. I can tell in a few seconds how she is feeling. Unfortunately, testing is often not as easy to read as handwriting or people. That’s why testers must work to make their testing legible. They do this by using whiteboards or spreadsheets to make helpful displays.


Lazarus 'FASTCash' Bank Hackers Wield AIX Trojan

Symantec says that it's recovered multiple versions of the Fastcash Trojan, each of which appears to have been customized for different transaction processing networks. The samples also tie to legitimate primary account numbers, or PANs: the 14- or 16-digit numerical strings found on bank and credit cards that identify a card issuer and account number. US-CERT said in its alert that after reviewing log files recovered from an institution that had been attacked by Hidden Cobra, "analysts believe that the [hackers'] scripts ... inspected inbound financial request messages for specific [PANs]. The scripts generated fraudulent financial response messages only for the request messages that matched the expected PANs. Most accounts used to initiate the transactions had minimal account activity or zero balances." In other words, malicious code inserted by Hidden Cobra attackers watched for references tied to attacker-controlled accounts, then returned fraudulent information about those accounts in response to queries.
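The matching logic US-CERT describes is simple to picture. The sketch below (illustrative Java with assumed names, not recovered malware code) shows the gist: only requests whose PAN appears on a hard-coded list receive a forged response, while everything else passes through untouched, which is what made the activity hard to spot:

```java
import java.util.Set;

// Illustrative sketch of the described interception logic.
// Names and message format are assumptions, not from the malware.
final class PanFilter {
    private final Set<String> targetPans;

    PanFilter(Set<String> targetPans) {
        this.targetPans = targetPans;
    }

    // Returns a forged response for targeted PANs, or null to indicate
    // the request should be forwarded to the legitimate switch.
    String handle(String requestPan) {
        if (targetPans.contains(requestPan)) {
            return "FORGED-APPROVAL"; // fraudulent response message
        }
        return null;                  // untouched: normal processing
    }
}
```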


5 questions to ask about open data centers

Extreme’s definition of open essentially means no vendor lock-in. Workflow Composer can automate workflows across any vendor, including Arista, Cisco and Juniper. Extreme can integrate with more than 100 vendors that have integration packs on exchange.stackstorm.org. Customers may have to tweak the code some, but they do not have to start with a blank sheet of paper. StackStorm extends beyond networking, too. As a result, engineers who use Workflow Composer can extend the automation capabilities to things like Palo Alto and Check Point firewalls, VMware vSphere, ServiceNow’s service desk and others. You could argue the network is the foundation of a modernized data center as it provides the connectivity fabric between everything. But open data centers incorporate more than just networking. By building Workflow Composer on StackStorm, Extreme can orchestrate and automate workflows from the network to the application — and everything in between.



Quote for the day:


"The person who can drive themself further once the effort gets painful, is the one who will win." -- Roger Bannister


Daily Tech Digest - November 11, 2018

Web applications are the most visible front door to any enterprise and are often designed and built without strong security in mind. Stressing out over hardware vulnerabilities like Spectre or Meltdown is fun and trendy, but while you're digging a moat around your castle someone is prancing across the drawbridge using SQL injection (SQLi) or cross-site scripting (XSS). The OWASP Broken Web Applications Project comes bundled in a virtual machine (VM) that contains a large collection of deliberately broken web applications with tutorials to help students master the various attack vectors. From trivial to more difficult, the project is designed to lead the user to a better understanding of web application security. The OWASP Broken Web Applications Project includes the appropriately named Damn Vulnerable Web Application, deliberately broken for your pentesting enjoyment. For maximum lulz, download OWASP Zed Attack Proxy, configure a local browser to proxy traffic through ZAP, and get ready to attack some damn vulnerable web applications.



Emotional skill is key to success

According to Susan David, emotional agility is about adaptability, facing emotions and moving on from them. It is also the ability to master the challenges life throws at us in an increasingly complex world. She added that while emotional intelligence is not values-focused, emotional agility is. "Women do have some advantages in the domain of emotional agility," she said. "When I go into organisations and look at hotspots or business units that are extremely high functioning, what we find is that the most important predictor of enabling these units is what I call 'individualised considerations'. That means leaders who are able to see the individual as an individual and this has diversity at its core. "These leaders do not stereotype or exclude," she added. "Of course, this doesn't work always in practice and there is a lot of work to be done in this regard in organisations and businesses."


Hybrid Blockchain- The Best Of Both Worlds

The hybrid blockchain is best defined as the blockchain that attempts to use the best parts of both private and public blockchain solutions. In an ideal world, a hybrid blockchain will mean controlled access and freedom at the same time. The hybrid blockchain is distinguishable by the fact that it is not open to everyone, but still offers blockchain features such as integrity, transparency, and security. As usual, the hybrid blockchain is entirely customizable. The members of the hybrid blockchain can decide who can participate in the blockchain or which transactions are made public. This brings the best of both worlds and ensures that a company can work with its stakeholders in the best possible way. We hope that this hybrid blockchain definition gave you a clear view. To get a much better picture, we recommend you check out some hybrid blockchain projects.


How universities should teach blockchain


The core issue is that blockchain is really hard to teach correctly. There’s no established curriculum, few textbooks exist, and the field is rife with misinformation, making it hard to know what is credible. Protocols are evolving at a rapid pace, and it’s tough to tell the difference between a white paper and reality. Having so much attention around blockchain specifically frames it as a miraculous and novel development rather than an outgrowth of decades of computer science research. Matt Blaze, an associate professor at the University of Pennsylvania and a cyber-security researcher, points out that the push for degree programs in blockchain is part of a trend of overspecialization by some engineering schools. The concepts sound good on paper but don’t live up to their promise. Despite the best of intentions, trends change, and students get stuck in narrow career paths. In order to avoid these pitfalls, universities will have to take an approach they’re not used to.


Experience an RDP attack? It’s your fault, not Microsoft’s

If you are compromised because of RDP, the problem is you or your organization. It isn’t a problem with Microsoft or RDP. You don’t need to put a VPN around RDP to protect it. You don’t need to change default network ports or some other black magic. Just use the default security settings or implement the myriad other security defenses you should have already been using. If you’re getting hacked because of RDP, you’re not doing a bunch of things that any good computer security defender should be doing. There are many ransomware programs, like SamSam, and cryptominers, like CrySis, that attempt brute-force guessing attacks against accessible RDP services. So many companies have had their RDP services compromised that the FBI and Department of Homeland Security (DHS) have issued warnings. The warning should be, “Your security sucks!” It isn’t like the malware programs are conducting a zero-day attack against some unpatched vulnerability.


Data as a Driver of Economic Efficiency

The General Data Protection Regulation (GDPR) became enforceable on May 25, 2018. The regulation aims to protect data by ‘design and default,’ whereby firms must handle data according to a set of principles. GDPR mandates opt-in consent for data collection and assigns substantial liability risks and penalties for data flow and data processing violations. GDPR’s enactment is particularly likely to influence technology ventures, given an increasing need for the use of data as a core product input. Specifically, data has become a key factor in technology-driven innovation and production, spanning industry sectors from pharmaceuticals and healthcare to automotive, smart infrastructure, and broader decision making. This report presents economic analyses of the consequences of data regulation and opt-in consent requirements for investment in new technology ventures, for consumer prices, and for economic welfare.


A Two-Minute Guide To Quantum Computing

Most of us aren't clued up on the art of harnessing elementary particles like electrons and photons, so to understand how quantum computing works, meet Scottish startup M Squared. The company’s bread and butter is making some of the most accurate lasers in the world, using pure light and precise wavelengths. Such lasers can be used like a scalpel, one atom wide, to carve out the transistors of a silicon chip.  Typically the chip or brain in your smartphone is a centimeter square. It has a small section in the middle made up of around 300 million transistors, with connections spreading out like fingers to talk to the screen, the camera, the battery and more.  But imagine a chip with no transistors at all, and instead a small chamber that’s controlling the processes and energy levels inside of atoms. This is quantum computing, the next frontier of machines that think not in bytes but in powerful qubits. It sounds cutting-edge, but scientists have been studying the theory of quantum computing for 30 years, and some say the first mainstream applications are just around the corner.


How Do Self-Driving Cars See? (And How Do They See Me?)


We’ll start with radar, which rides behind the car’s sheet metal. It’s a technology that has been going into production cars for 20 years now, and it underpins familiar tech like adaptive cruise control and automatic emergency braking. ... The cameras—sometimes a dozen to a car and often used in stereo setups—are what let robocars see lane lines and road signs. They only see what the sun or your headlights illuminate, though, and they have the same trouble in bad weather that you do. But they’ve got terrific resolution, seeing in enough detail to recognize your arm sticking out to signal that left turn. ... If you spot something spinning, that’ll be the lidar. This gal builds a map of the world around the car by shooting out millions of light pulses every second and measuring how long they take to come back. It doesn’t match the resolution of a camera, but it should bounce enough of those infrared lasers off you to get a general sense of your shape. It works in just about every lighting condition and delivers data in the computer’s native tongue: numbers.



Facial recognition's failings: Coping with uncertainty in the age of machine learning

The shortcomings of publicly available facial-recognition systems were further highlighted in summer this year, when the American Civil Liberties Union (ACLU) tested the AWS Rekognition service. The test found that 28 members of the US Congress were falsely matched with mug shots from publicly available arrest photos. Professor Chris Bishop, director of Microsoft's Research Lab in Cambridge, said that as machine learning technologies were deployed in different real-world locales for the first time it was inevitable there would be complications. "When you apply something in the real world, the statistical distribution of the data probably isn't quite the same as you had in the laboratory," he said. "When you take data in the real world, point a camera down the street and so on, the lighting may be different, the environment may be different, so the performance can degrade for that reason. "When you're applying [these technologies] in the real world all these other things start to matter."


Robots Have a Diversity Problem


It is well-documented that A.I. programs of all stripes inherit the gender and racial biases of their creators on an algorithmic level, turning well-meaning machines into accidental agents of discrimination. But it turns out we also inflict our biases onto robots. A recent study led by Christoph Bartneck, a professor at the Human Interface Technology Lab at the University of Canterbury in New Zealand, found that not only are the majority of home robots designed with white plastic, but that we also have a bias against the ones that are coated in black plastic. The findings were based on a shooter bias test, in which participants were asked to perceive threat level based on a split-second image of various black and white people, with robots thrown into the mix. Black robots that posed no threat were shot more than white ones. “The only thing that would motivate their bias [against the robots] would be that they would have transferred their already existing racial bias to, let’s say, African-Americans, onto the robots,” Bartneck told Medium. “That’s the only plausible explanation.”



Quote for the day:


"Remember this: Anticipation is the ultimate power. Losers react; leaders anticipate." -- Tony Robbins


Daily Tech Digest - November 10, 2018

How the Blockchain Could Break Big Tech’s Hold on A.I.

Unlike Google and Facebook, which store the data they get from users, the marketplaces built on Ocean Protocol will not have the data themselves; they will just be places for people with data to meet, ensuring that no central player can access or exploit the data. “Blockchains are incentive machines — you can get people to do stuff by paying them,” said Trent McConaghy, one of the founders of Ocean Protocol, who has been working in artificial intelligence since the 1990s. The goal, Mr. McConaghy said, is to “decentralize access to data before it’s too late.” Ocean is working with several automakers to collect data from cars to help create the artificial intelligence of autonomous cars. All the automakers are expected to share data so none of them have a monopoly over it. Another start-up, Revel, will pay people to collect the data that companies are looking for, like pictures of taxis or recordings of a particular language. Users can also let their phones and computers be used to process and categorize the images and sounds.


Unit Testing – Abstracting Creation of Simple Values

When writing your unit tests you can use your chosen mocking framework to provide a fake implementation of IDateTimeProvider that provides a static value for _dateTimeProvider.Now. This is a very practical pattern for many situations, especially if the requirements of the provider become more complex. However, there are some notable disadvantages to this approach. Firstly, the actual requirements here are very simple, so it could be considered overkill to create an extra class for the provision of a single DateTime value, especially if you consider that you’ll also need to configure dependency resolution if you’re using an IoC container and instantiate mock objects in your tests. Maybe creating a provider object is more effort than it’s worth. Secondly, as noted with method injection, it is reasonable to suggest that the responsibility of choosing and applying a timestamp should be with the DocumentService itself.
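The article's examples are C#; for comparison, a minimal Java sketch of the same trade-off uses the standard library's java.time.Clock instead of a hand-rolled provider interface, so tests can pin "now" without any mocking framework (the DocumentService name is borrowed from the article; the rest is an assumption):

```java
import java.time.Clock;
import java.time.Instant;

// Inject a Clock rather than a custom IDateTimeProvider-style interface;
// the service keeps responsibility for choosing and applying the timestamp.
final class DocumentService {           // hypothetical, mirroring the article
    private final Clock clock;

    DocumentService(Clock clock) { this.clock = clock; }

    Instant stamp() { return Instant.now(clock); } // timestamping stays in the service
}

// Production: new DocumentService(Clock.systemUTC());
// Test:       new DocumentService(Clock.fixed(
//                 Instant.parse("2018-11-12T00:00:00Z"), java.time.ZoneOffset.UTC));
```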


Linux cryptocurrency miners are installing rootkits to hide themselves

Besides allowing KORKERDS to survive OS reboots, the rootkit component also contained a slightly strange feature. Trend Micro says that KORKERDS' authors modified the rootkit to hide the cryptominer's main process from Linux's native process monitoring tools. "The rootkit hooks the readdir and readdir64 application programming interfaces (APIs) of the libc library," researchers said. "The rootkit will override the normal library file by replacing the normal readdir file with the rootkit's own version of readdir." This malicious version of readdir works by hiding processes named "kworkerds" --which in this case is the cryptominer's process. Linux process monitoring tools will still show 100 percent CPU usage, but admins won't be able to see (and kill) the kworkerds process causing the CPU resource consumption problems.


Why Is Data Science Different than Software Development?

Because the exact variables and metrics (and their potential transformations and enrichments) are not known beforehand, the data science development process must embrace an approach that supports rapid testing, failing, learning, wash and repeat. This attitude is reflected in the “Data Scientist Credo”: Data science is about identifying those variables and metrics that might be better predictors of performance; to codify relationships and patterns buried in the data in order to drive optimized actions and automation. Step 5 of the Data Scientist Development Methodology is where the real data science work begins – where the data scientist uses tools like TensorFlow, Caffe2, H2O, Keras or SparkML to build analytic models – to codify cause-and-effect. This is true science, baby!! The data scientist will explore different analytic techniques and algorithms to try to create the most predictive models.


The New Cross-Platform Standard: Version 2.0


Microsoft has a new approach: Standard Class Library projects. A Standard Class Library consists of those APIs that "are intended to be available on all .NET implementations." The news here is that there is only one Standard and it supports all the .NET platforms -- no more profiles agglomerated into an arbitrary set of interfaces. The catch here is that the Standard may not include something you want ... at least, not yet. With PCLs there was always the possibility that, if you dropped one of the platforms you wanted to support, you might pick up the API you wanted. That's not an option with the Standard, which is monolithic. In some ways it's like setting the version of the .NET Framework you want to support in your project's properties: The lower the version you pick, the less functionality you have. Obviously, then, what matters in the .NET Standard is comprehensiveness. There have been several iterations of the .NET Standard specification, each of which includes more .NET Framework APIs.


Increasing value of personal data a 21st century challenge


“Something had to be done, and if it has achieved nothing else, the EU’s General Data Protection Regulation has focused people’s minds and got company executives and board members to take this issue seriously because now they have to be accountable and declare breaches,” he said. This means data protection in Europe, said Shamah, is no longer just the concern of technical teams in organisations, but also chief executives and shareholders. “In the light of the recent revelations about the misuse of data, everyone needs to consider what kind of digital footprint they want to leave; a permanent one like those left by the first astronauts on the surface of the moon or temporary like those left in the sand on a beach.” The aim, he said, should be for digital footprints that last only for as long as they are needed and are then erased without a trace.


Why employees’ lapses in protecting data can sting organizations


In assessing the facts in the URMC case, it seems like attention focused on the departing/departed nurse practitioner asking for a patient list, which was provided in spreadsheet form. More often, when an employee leaves, there is a clear acknowledgment that the employee is cut off from all of the employer’s patient information because HIPAA does not allow continued access. The seemingly voluntary transmission offers a plausible basis for fining an entity when the ultimate bad act was on the part of the departed employee. As such, the takeaway from the URMC case is to not be overly generous, as misuse of information can come back to haunt the organization. Ensuring the privacy and security of patient information needs to be a paramount concern at all times. While it is impossible to control all the actions of employees, organizations can and must take reasonable and appropriate action to secure information as much as possible.


Why open source isn't just about code

We've seen things like Firefox really succeed where people come together from all over the world to build a product openly, and invite contributions. And we've seen that succeed and really take down a monopoly. And we've seen this work, time and time again, in more than just code, but in businesses, in government, in science. Where people, when they work openly, when they're inviting contributions, they're more innovative, they get better ideas. And they get more buy-in from the community who wants to use them. ... If it's open source you can hear more from the people that are using it. Places like Lego actually use that, if they're thinking about what Lego line to produce next, they have surveys and people can suggest things. And company-creating. You get better innovation when more people, and the right experts, are really working on the products. There's a lot of different advantages. Those are three of them that I can think of now.


Dutch Police Bust 'Cryptophone' Operation

Dutch police say they discovered the cryptophone operation while investigating an alleged money laundering operation. Police didn't just shut down the network. Instead, they seized a server and began monitoring the service. "We had sufficient evidence that these phones were used among criminals. We have succeeded in intercepting encrypted communication messages between these phones, decrypting them and having them live for some time," Dutch police said on Tuesday. "This has not only given us a unique insight into existing criminal networks; we have also been able to intercept drugs, weapons and money." Police say their investigation has already allowed them to bust a drugs lab in Enschede, Netherlands, and make 14 arrests, including a 46-year-old man from Lingewaard who's suspected of running the cryptophone company, as well as his alleged partner, a 52-year-old man from Boxtel.


Data Lake and Modern Data Architecture in Clinical Research and Healthcare

The primary challenge in implementing a data lake architecture in healthcare has to do with making sure the data platform is architected with data security, privacy and protection in mind while enabling real-time data transmission, collection, ingestion and integration at scale. Not to mention, the challenges in dealing with unstructured and binary data in the data lake should not be underestimated. From the data lake architecture perspective, supporting both batch and near-real-time data integration and business intelligence is a real practical challenge. Making integrated data available to all constituents in a self-service manner is another big challenge. ... Our enterprise data lake is a consumer of our MDM platform, which collects all master entities from all of our operational and transactional systems, and masters them in real time using sophisticated matching and merging algorithms, metadata management and semantic matching.



Quote for the day:


"Individual commitment to a group effort - that is what makes a team work, a company work, a society work, a civilization work." -- Vince Lombardi


Daily Tech Digest - November 09, 2018

Cisco Accidentally Released Dirty Cow Exploit Code in Software


“A failure in the final QA validation step of the automated software build system for the Cisco Expressway Series and Cisco TelePresence Video Communication Server (VCS) software inadvertently allowed a set of sample, dormant exploit code used internally by Cisco in validation scripts to be included in shipping software images,” the company said in an advisory. “This includes an exploit for the Dirty CoW vulnerability (CVE-2016-5195). The purpose of this QA validation step is to make sure the Cisco product contains the required fixes for this vulnerability.” Cisco said that it is not aware of “malicious use of the issue” and that the issue does not open the impacted software (Cisco Expressway Series and Cisco TelePresence Video Communication Server image versions X8.9 through X8.11.3) to any sort of attack. “The impacted software images will be removed and will be replaced by fixed images,” the company said. It did not specify when.



The Role of a Manager Has to Change in 5 Key Ways

“First, let’s fire all the managers,” said Gary Hamel almost seven years ago in Harvard Business Review. “Think of the countless hours that team leaders, department heads, and vice presidents devote to supervising the work of others.” Today, we believe that the problem in most organizations isn’t simply that management is inefficient, it’s that the role and purpose of a “manager” haven’t kept pace with what’s needed. For almost 100 years, management has been associated with the five basic functions outlined by management theorist Henri Fayol: planning, organizing, staffing, directing, and controlling. These have become the default dimensions of a manager. But they relate to pursuing a fixed target in a stable landscape. Take away the stability of the landscape, and one needs to start thinking about the fluidity of the target. This is what’s happening today, and managers must move away from the friendly confines of these five tasks.


Cloud, edge, and fog computing: understanding the practical application for each

Fog computing effectively “decentralises” computational and analytical power. It sits between your local equipment and mobile devices — equipment with limited processing power and storage, in other words — and provides a way to sift through streams of information from these and other components in your IoT. You can get a better mental image of fog computing by thinking about driverless automobiles navigating a city block. If the vehicles, their sensors, and their controllers are the “edge layer” for a city’s smart transportation system — we’ll get to edge computing in a moment — then there are likely micro-data centres alongside mesh routers and cell towers that serve as the “fog layer.” Fog computing isn’t quite as decentralised as the edge, but it does further reduce the amount of data transferred across the network or upwards into the cloud layer. It facilitates communication and collaboration between the “nodes” in the edge layer. In the example above, the nodes are the driverless cars.


Don’t make your cloud migration a house of cards

The biggest architectural mistake that I see in the cloud involves coupling. Back in the day, applications were tightly coupled to other applications and data sources. If one thing stopped, the entire system stopped. So if the database went down, all connected applications did as well, including any systems that sent or received data from the database. Years ago, we learned that tight coupling was bad. It killed resiliency, scalability, and the ability to independently use resources such as applications, databases, and queues. Consultants like me gave presentations on it, and books were published on the topic, but IT organizations are still making the same architectural mistakes in 2018 that diminish the value of cloud computing. IT is not fixing the things that need fixing as they move to the cloud. At the core of the issue is money. Enterprises do not allocate enough funding to fix these issues before they move to the cloud. I assume the hope is that the issues won’t be noticed, or that the use of a more modern platform will magically fix the issues despite their poor architectures.
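For illustration, here is a minimal Java sketch of the loose-coupling point (the names are mine, not from the article): the producer talks to a queue rather than calling the consumer directly, so either side can stop, slow down, or restart without halting the whole system:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Loose coupling in miniature: the producer writes to a queue and keeps
// working even if the consumer (say, a database writer) is down or slow.
public class DecoupledPipeline {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(1000);

        Thread producer = new Thread(() -> {
            for (int i = 0; i < 5; i++) {
                queue.offer("order-" + i); // no direct call into the consumer
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    // The consumer drains at its own pace, independently.
                    System.out.println("persisted " + queue.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}
```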


Seeing is believing, the old saw has it, but the truth is that believing is seeing: Human beings seek out information that supports what they want to believe and ignore the rest. Hacking that human tendency gives malicious actors a lot of power. We see this already with disinformation (so-called "fake news") that creates deliberate falsehoods that then spread under the guise of truth. By the time fact checkers start howling in protest, it's too late, and #PizzaGate is a thing. Deepfakes exploit this human tendency using generative adversarial networks (GANs), in which two machine learning (ML) models duke it out. One ML model trains on a data set and then creates video forgeries, while the other attempts to detect the forgeries. The forger creates fakes until the other ML model can't detect the forgery. The larger the set of training data, the easier it is for the forger to create a believable deepfake. This is why videos of former presidents and Hollywood celebrities have been frequently used in this early, first generation of deepfakes — there's a ton of publicly available video footage to train the forger.


The creation of one code base that is easy to maintain and publishes well across multiple OSes is no easy feat, said Jonathan Marston, director of software at Optimus Ride, a self-driving car company in Boston. Tools such as Adobe Air have tried and failed to achieve it, he said. "In the past, that dream has never lived up to the reality," Marston said. The ability to share code across multiple mobile OSes is getting more attainable with tools such as NativeScript and React Native, but the particular idiosyncrasies of each OS make it difficult to achieve complete code sharing, said Jesse Crossen, lead developer of VoiceThread, an education software company in Durham, N.C. For example, developers might want to write one set of code for an iOS visual component and another for an Android visual component, due to different screen sizes and resolutions. "You're always going to have that level of customization per platform or have [an app] that's a little bit generic," Crossen said.


While IoT is generally thought of in terms of consumer products, he pointed out that some IoT systems are widely used in the business context, such as building management systems that control the heating, cooling, door locks and fire alarms. “It is important that businesses think about the IoT devices they have in their environments. The gap between IT and services often creates opportunities for technology to cause problems, and so there are some key questions businesses need to ask suppliers, retailers, hardware manufacturers so you know whether you are buying a good product or one full of security vulnerabilities.” Munro said he was able to buy a controller for a building management system online and was able to find vulnerabilities that could be exploited to discover the password of the embedded server, which would enable an attacker to take complete control of the building management system.


Microsoft: .NET Core Is the Future, So Get Moving


"As we move forward into the future, with .NET Core 3, we're going to see some more workloads that we're going to be working on here, mainly Windows desktop," Massi said. "We're bringing Windows desktop workloads to .NET Core 3, as well as AI and IoT scenarios. "The big deal here is now that if you're a WinForms or WPF developer you can actually utilize the .NET Core runtime." It's still Windows, she said. It's still your Windows application framework for desktop apps, but developers will be able to take advantage of the .NET Core feature set, such as improved performance, side-by-side installs, language features and other innovations being made in the platform itself. "So that's kind of a big deal," Massi said. While .NET Core is about improved performance, self-contained .exe files for desktop application deployment flexibility and more, it also provides UI interop. "It's about, instead of totally rewriting your apps to take advantage of Windows 10 or more modern UI controls, we're making it so that you can use modern UI controls in WinForms and WPF -- that's what UI interop is," Massi said.



10 signs you may not be cut out for a systems analyst position

The ability to say "No" is important in managing all areas of life, but as a systems analyst, someday your job may depend on it. Suppose you're in a meeting with your boss, their boss, and management from the operations side. Someone tries to get you to commit, on the spot, to adding new functionality, and your boss is not interceding for you. Under pressure, many people would say "Yes" just to get out of the meeting. But if you don't know absolutely that you can do the project, within the time and budget required, resist the temptation to get them off your back temporarily. Agreeing to a task that turns out to be unreasonable is just a setup for failure. ... Saying "No" may prevent you from promising the impossible, but it's best to use the word sparingly. To succeed as a systems analyst, you'll need to think of yourself as an in-house consultant. The business needs IT tools to make money, and you have to figure out how to provide those tools. Work with your in-house customers to develop a plan you can say "Yes" to. Figure out what you need—more time, more money, more human or technical resources—and be prepared to back up your requests.


The security skills shortage: A golden opportunity for creative CISOs


The very shallow security skills talent pool has also led to another opportunity, one that serves to up-skill and empower in-house (and even outsourced) development teams. It is a known fact that most of the world’s highest-scale security breaches were made possible due to errors in the software code itself, and with the average breach costing in excess of US$3.6 million, it makes sense to examine the application security budget. It stands to reason that if developers remain untrained, the same mistakes will be made year after year, and the same reactive, expensive after-the-fact fixes will need to be applied. It seems a crazy way to burn through cash, all while an organization’s reputation as a security-conscious company goes down the drain. So, why not change it up and secure software from the start of production? Empowering development teams to write secure code is the golden opportunity for CISOs to seize proactive control over looming security issues, and where there is the chance for fast, easy and measurable improvements – for both security and development teams.



Quote for the day:


"Perhaps the ultimate test of a leader is not what you are able to do in the here and now - but instead what continues to grow long after you're gone" -- Tom Rath


Daily Tech Digest - November 08, 2018

Each private cubicle sits on short legs, enabling small warehouse robots to scuttle around underneath them. Then, the robots can pick up the cubes and move them around the office based on what each person and team needs for the day. For instance, if you have a day of heads-down work, you’d get assigned a private cubicle so you can focus. If you have a day full of meetings, and you don’t need private space, your cube combines with other cubes to create a larger space in which to work with your colleagues. The robots shift the office in real time to make this happen. ... For now, the idea seems farfetched, but Rapt’s design principal and CEO David Galullo believes it’s closer than you might think. He says the studio is working with clients who are interested in how a workplace can be reconfigured over a weekend to respond to a team’s changing needs. The key is to keep the office as spare as possible, so you can easily move things around, which he believes is one reason that many companies prefer an open plan.


HSBC Bank Alerts US Customers to Data Breach
An HSBC spokeswoman tells Information Security Media Group that less than 1 percent of HSBC's U.S. customers were affected by the data breach. The bank declined to quantify how many U.S. customers it has. But The Telegraph reports that HSBC manages about 1.4 million U.S. accounts, meaning 14,000 customers may have been affected. "HSBC regrets this incident, and we take our responsibility for protecting our customers very seriously," the bank says in a statement sent to ISMG. "We responded to this incident by fortifying our log-on and authentication processes, and implemented additional layers of security for digital and mobile access to all personal and business banking accounts," the statement notes. "We have notified those customers whose accounts may have experienced unauthorized access and are offering them one year of credit monitoring and identity theft protection service." HSBC's data breach notification to victims also notes: "You may have received a call or email from us so we could help you change your online banking credentials and access your account."


What Makes SSDs Different?

An SSD in a laptop will often go for long periods of time without any IO. It has plenty of time to perform garbage collection and similar functions. An enterprise SSD, however, may face a full-time 24×7 workload and never have idle time for garbage collection, and in the enterprise it is consistent performance that matters more than peak levels of performance. Enterprises need SSD suppliers to create drives that focus more on the consistent delivery of IO (or IOPS), no matter how heavy the workload, than on peak levels of performance that look good on a marketing datasheet. The key challenge to delivering consistent performance is how the SSD handles write IO, especially under heavy random workloads. With each write, flash media needs to find available space to place that write. If there is no space available, it has to make space “on the fly” by rearranging data within cells to create contiguous space for the new write. Garbage collection routines are supposed to make this space available in advance, but they are not always afforded the time to complete their tasks.


Java and MongoDB 4.0 Support for Multi-Document ACID Transactions

MongoDB 4.0 adds support for multi-document ACID transactions. But wait... Does that mean MongoDB did not support transactions until now? No, actually MongoDB has always supported transactions in the form of single-document transactions. MongoDB 4.0 extends these transactional guarantees across multiple documents, multiple statements, multiple collections, and multiple databases. What good would a database be without any form of transactional data integrity guarantee? ... Multi-document ACID transactions in MongoDB are very similar to what you probably already know from traditional relational databases. MongoDB's transactions are a conversational set of related operations that must atomically commit or fully roll back with all-or-nothing execution. Transactions are used to make sure operations are atomic even across multiple collections or databases. Thus, with snapshot isolation reads, another user can only see all the operations or none of them.
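The article's examples use the Java driver; as a rough sketch of the same API shape, here is what a multi-collection transaction looks like with the Node.js driver from TypeScript (the "bank" database, its collections, and the account fields are invented for the example):

```typescript
import { MongoClient } from "mongodb";

interface Account { _id: string; balance: number; }

// Move funds between two accounts atomically: both balance updates
// and the ledger entry commit together, or none of them do.
async function transfer(client: MongoClient, from: string, to: string, amount: number) {
  const session = client.startSession();
  try {
    await session.withTransaction(async () => {
      const accounts = client.db("bank").collection<Account>("accounts");
      const ledger = client.db("bank").collection("ledger");
      // Passing { session } ties each operation to the transaction.
      await accounts.updateOne({ _id: from }, { $inc: { balance: -amount } }, { session });
      await accounts.updateOne({ _id: to }, { $inc: { balance: amount } }, { session });
      await ledger.insertOne({ from, to, amount, at: new Date() }, { session });
    });
  } finally {
    await session.endSession();
  }
}
```

Readers outside the session see the pre-transaction snapshot until the commit lands, which is the "all of the operations or none of them" behavior described above.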


Seth James Nielson on Blockchain Technology for Data Governance


Data security and data management are much more complicated. Every member of the Blockchain must preserve and protect a private key. If that key is ever compromised by an unauthorized party, there is little that can be done to revoke the compromised key. Perhaps just as bad, if the key is lost (e.g., accidentally deleted), that user's access to the system is permanently lost as well. It is estimated, for example, that 20% of all the Bitcoins in the world have been lost in this manner. Finally, by itself, Blockchain doesn't really offer much for data management. Rather, it enables new forms of data management. Supply chain is a great example where Blockchain appears to be having some great success. When you look at worldwide, complicated supply chains, keeping track of data between hundreds, or even thousands, of inter-operating vendors is extremely challenging. Creating a Blockchain for these participants to record data of their own, and track related data from others, is a fantastic fit.


Strange snafu misroutes domestic US Internet traffic through China Telecom

The sustained misdirection further underscores the fragility of BGP, which forms the underpinning of the Internet's global routing system. In April, unknown attackers used BGP hijacking to redirect traffic destined for Amazon's Route 53 domain-resolution service. The two-hour event allowed the attackers to steal about $150,000 in digital coins as unwitting people were routed to a fake MyEtherWallet.com site rather than the authentic wallet service. When end users clicked through a message warning of a self-signed certificate, the fake site drained their digital wallets. ... "While one may argue such attacks can always be explained by 'normal' BGP behavior, these, in particular, suggest malicious intent, precisely because of their unusual transit characteristics—namely the lengthened routes and the abnormal durations," the authors wrote. The Canada-to-South Korea leak, the report said, lasted for about six months, starting in February 2016.


Powerful $39 Raspberry Pi clone: Rock Pi 4

As mentioned, the processor is relatively capable for the price, with a dual-core 2.0GHz Arm Cortex-A72 paired with a quad-core 1.5GHz Arm Cortex-A53 in a big.LITTLE configuration, which swaps tasks between cores for power efficiency. Smooth 4K video playback should be possible courtesy of the HDMI 2.0 port and Mali-T864 GPU. Fast SSD storage is also an option via an M.2 interface supporting up to a 2TB NVMe SSD, and if the onboard SD card storage is too slow, there's an option to add up to 128GB of eMMC storage to the board. Though the memory is relatively fast — 64-bit, dual-channel, 3,200Mb/s LPDDR4 — only 1GB is available on the base $39 model, ranging up to 4GB for $65. There's a decent selection of ports, with four USB Type-A ports in total: one USB 3.0 host, one USB 3.0 OTG, and two USB 2.0 host ports. For those interested in building their own homemade electronics, there's also a 40-pin expansion header for connecting boards, sensors and other hardware. Though this header's pin layout is similar to that of the Pi, the Rock Pi's maker said it wasn't possible to make it "100% GPIO compatible".


Banks in the changing world of financial intermediation

The dual forces of technological (and data) innovation and shifts in the regulatory and broader sociopolitical environment are opening great swaths of this financial-intermediation system to new entrants, including other large financial institutions, specialist-finance providers, and technology firms. This opening has not had a one-sided impact, nor does it spell disaster for banks. Where will these changes lead? Our view is that the current complex and interlocking system of financial intermediation will be streamlined by the forces of technology and regulation into a simpler system with three layers. ... Our view of a streamlined system of financial intermediation, it should be noted, is an "insider's" perspective: we do not believe that customers or clients will really take note of this underlying structural change. The burning question, of course, is what these changes mean for banks.


The Growing Significance Of DevOps For Data Science


New datasets result in training and evolving new ML models that need to be made available to users. Some of the best practices of continuous integration and deployment (CI/CD) are applied to ML lifecycle management. Each version of an ML model is packaged as a container image that is tagged differently. DevOps teams bridge the gap between the ML training environment and the model deployment environment through sophisticated CI/CD pipelines. When a fully trained ML model is available, DevOps teams are expected to host the model in a scalable environment. ... The rise of containers and container management tools makes ML development manageable and efficient. DevOps teams are leveraging containers for provisioning development environments, data processing pipelines, training infrastructure and model deployment environments. Emerging technologies such as Kubeflow and MLflow focus on enabling DevOps teams to tackle the new challenges involved in dealing with ML infrastructure.


Legacy Apps - Dealing with IFRAME Mess (Window.postMessage)

In the old days, iframes were used a lot. Not only for embedding content from other sites, cross-domain Ajax or hacking an overlay that covered selects, but also to provide boundaries between page zones or mimic desktop-like window layouts… The Window.postMessage method was introduced into browsers to enable safe cross-origin communication between Window objects. The method can be used to pass data between iframes. In this post, I'm assuming that the application with iframes is old but can be run in Internet Explorer 11, which is the last version that Microsoft released (in 2013). From what I've seen, it's often the case that Internet Explorer has to be supported, but at least it's the latest version of it. ... Thanks to the postMessage method, it's very easy to create a mini message bus, so events triggered in one iframe can be handled in another if the target iframe chooses to take an action. Such an approach reduces coupling between iframes, as one frame doesn't need to know any details about the other's elements. A sketch of such a bus follows.
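As a rough sketch of that mini message bus, written in TypeScript and compiled down to ES5 for IE11 (the allowed origin and the message shape are invented for the example):

```typescript
// Parent page: relay messages between child iframes so the frames
// never have to reach into each other's DOM directly.
const ALLOWED_ORIGIN = "https://legacy.example.com"; // invented origin

window.addEventListener("message", (event: MessageEvent) => {
  // Always validate the sender's origin before trusting or forwarding.
  if (event.origin !== ALLOWED_ORIGIN) return;

  // Broadcast to every child iframe except the one that sent the event.
  // A plain for loop avoids NodeList.forEach, which IE11 lacks.
  const frames = document.querySelectorAll("iframe");
  for (let i = 0; i < frames.length; i++) {
    const win = frames[i].contentWindow;
    if (win && win !== event.source) {
      win.postMessage(event.data, ALLOWED_ORIGIN);
    }
  }
});

// Inside an iframe: publish an event to the bus through the parent...
window.parent.postMessage({ type: "order:selected", id: 42 }, ALLOWED_ORIGIN);

// ...and subscribe to events published from other iframes.
window.addEventListener("message", (event: MessageEvent) => {
  if (event.origin !== ALLOWED_ORIGIN) return;
  if (event.data && event.data.type === "order:selected") {
    // React to the event, or simply ignore messages this frame
    // doesn't care about.
  }
});
```

Each frame only knows the message contract, not the other frames' internals, which is where the decoupling comes from. If object payloads prove unreliable in a given IE version, the data can be passed through JSON.stringify/JSON.parse instead.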



Quote for the day:


"If you only read the books that everyone else is reading, you can only think what everyone else is thinking" -- Haruki Murakami