Daily Tech Digest - July 13, 2019

Zoom vulnerability reveals privacy issues for users

"On top of this, this vulnerability would have allowed any webpage to DOS (Denial of Service) a Mac by repeatedly joining a user to an invalid call," Leitschuh added. "Additionally, if you've ever installed the Zoom client and then uninstalled it, you still have a localhost web server on your machine that will happily re-install the Zoom client for you, without requiring any user interaction on your behalf besides visiting a webpage." According to Leitschuh, it took Zoom 10 days to confirm the vulnerability, and in a meeting on June 11 he told Zoom there was a way to bypass the planned fix; Zoom did not address these concerns when it reported the vulnerability fixed close to two weeks later. The vulnerability resurfaced on July 7, Leitschuh disclosed it publicly on July 8, and Zoom patched the Mac client on July 9. Zoom also worked with Apple on a silent background update for Mac users, released July 10, which removed the Zoom localhost server from their systems. "Ultimately, Zoom failed at quickly confirming that the reported vulnerability actually existed and they failed at having a fix to the issue delivered to customers in a timely manner," Leitschuh wrote.
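The crux of the issue was a web server listening on localhost that any visited page could reach. As a hedged illustration (port 19421 is the port attributed to Zoom's local web server in the public write-ups; the helper function here is our own sketch, not part of any disclosure tooling), a Mac user could check whether anything is still listening there:

```python
import socket

def local_port_open(port, host="127.0.0.1", timeout=0.25):
    """Return True if something is listening on the given localhost port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# 19421 is the port attributed to Zoom's local web server in the disclosure.
if local_port_open(19421):
    print("Something is listening on 19421; the Zoom helper may still be present.")
else:
    print("Nothing is listening on 19421.")
```

A positive result is only a hint, of course; any other process bound to that port would trigger it too.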


What is enterprise? What is architecture?

What exactly was ‘the enterprise’ in that context? If we think of ‘enterprise’ as ‘organisation’, was it just NASA? Or did it include others in the consortium of US organisations that designed and built the launchers, the landers, all of the equipment for the missions? Did it include the broader international consortium of support-facilities such as ‘The Dish’ at Parkes, New South Wales, through which the TV transmissions for the first moonwalk came? Did it include ‘competitors’ such as the Russian, Chinese and other national space-organisations? If we think of ‘enterprise’ as ‘a bold endeavour’, what was the respective endeavour? Just one moon-landing mission? – or all of them, from Apollo-11 to Apollo-17? Should we include all the other Apollo missions? All of the other US space-missions, before and after Apollo? To what extent would we include other moon-missions from other nations in the same enterprise? Space-explorations in general? Only crewed-missions, or robotic missions as well? Or could and should we extend this enterprise to include the overall story? – the dream of spaceflight and suchlike?


No, 5G isn't going to make your 4G LTE phone obsolete


"This is the first time so many aspects of [the old and new network] are shared," said Gordon Mansfield, AT&T vice president for converged access and device technology. "Some things we'll do for 5G are inherently backward compatible and will lift the capabilities of 4G." By 2025, 15% of mobile connections in the world will be on 5G, according to a 2019 report by GSMA Intelligence, the research arm of the mobile operator group that hosts Mobile World Congress. But LTE usage will be about 59% by the same year, up from 43% in 2018. (In North America, the split will be more even, with about 47% of 2025's connections on 5G and 44% on 4G.) Even if 5G becomes an even bigger part of the market by 2025 than estimated today, "it will complement rather than replace LTE," GSMA said in a separate report from last year. "For operators in many parts of the world, LTE is and will be the foundation for the next 10 years at least," the GSMA report said. "LTE speeds are improving, which makes 5G less compelling without new services such as AR/VR."


IT strategy: The CIO’s guide to getting stuff done


Shaun Le Geyt, CIO at Parkinson's UK, is part of a senior digital leadership team driven to use technology to help people with the condition. Every hour, two people in the UK are told they have Parkinson's. Around 145,000 people in the UK have been diagnosed with the condition, which is about one in every 350 adults. Le Geyt says CIOs who want to meet their targets must focus on the people who will benefit, rather than on the technology they're implementing. "Managing cultural change is as important as managing technological change," he says. Taking control of that change process matters as much for internal staff in his organisation as for the individuals who benefit from the charity's work. Parkinson's UK has an annual income of about £40m and employs 450-plus staff, and the charity draws on a dynamic network of expert staff, health and social care professionals, volunteers, and researchers. "Focus on the needs of the organisation – put people first," says Le Geyt. "There are times when technology comes up and you know it's the right thing to discuss."


DevOps for networking hits chokepoints, tangled communications


Challenges will arise with tighter integration of DevOps and networking. Network automation lags behind other areas of IT automation -- the networking team will argue that this is because automation decreases visibility into this vital component of the application infrastructure. When applications run in the public cloud -- as is often the case with DevOps deployments -- admins cannot touch the physical network. Instead, the focus shifts to virtual private clouds, autoscaling and failover via setup policies. Cloud certification training can help developers take over these tasks, but developers do not have expertise in network management, just as network admins do not have experience running cloud operations. This shift to NetOps also sets up a cultural challenge: DevOps teams have to collaborate closely with both the in-house network operations team and their cloud service provider. Appoint in-house technical and strategic alliance managers, and request one from the cloud provider, to build relationships and overcome these obstacles.


How to learn from inevitable AI failures

Partly this is a problem of skills: To do well with AI or any area of big data, you need a mix of math, programming, and more. That kind of unicorn doesn't readily gallop by. However, it's also the case that finding someone who understands data science may be easier than finding someone who understands your business and the data that makes it hum. This calls to mind Gartner analyst Svetlana Sicular's advice from years ago about big data: "Organizations already have people who know their own data better than mystical data scientists." Therefore, look within your organization because "Learning Hadoop is easier than learning the company's business." Many AI projects fail precisely because the technology is considered in a vacuum. As noted by Greg Satell in Harvard Business Review, any AI project should have a clear business outcome identified, with the right data culled to serve that end. This, in turn, requires (you guessed it!) involving smart folks within the enterprise who understand the business intimately and know where to find the best data. AI, in other words, while ostensibly about replacing people, can't succeed without involving your company's best people.


Asia’s AI agenda: The ethics of AI

Asia’s AI ecosystem participants are aware of and concerned about the potential for embedded biases (race, gender, or socioeconomic status) within AI tools, and the harm this can cause through facilitating overpolicing of minority communities, or economic exclusion. Weaponization and malicious use of AI are also ethical concerns in Asia as applications are increasingly commoditized and industrialized. While Asian decision-makers are concerned about a potentially negative impact, particularly where jobs are concerned, optimism is the more dominant sentiment, which will propel the use of AI in Asia. Asian governments are building institutional capacity and frameworks to increase AI governance—but have yet to develop regulations. Overwhelmingly, more survey respondents believe Asia will lead the world in the development of ethics and governance than any other region: 45%, as compared with only a quarter who see North America as the ethics frontrunner.


The Market Of One

Smart, agile companies like Lifedata.ai are keeping up with such data developments by recognising that people expect a brand experience that matches their digital lifestyles. Today’s consumers demand improved ease of use, relevance and personalisation, delivered across channels in a frictionless and consistent way. Software can now mediate social and media interactions and turn IoT sensor data into monetisable experiences, unlocking context-aware personalisation: behavioural insights and real-time automation based on a timeline of contextual moments, situational context and relevant behavioural profiles. Lifedata.ai aims to capture how people go through their day and to identify the moments that matter in order to improve their lives. Omar Fogliadini, Managing Partner of Lifedata.ai, says, “In 2019, integrations with digital assistants are no longer a differentiator. Brands will need to showcase what people can actually do with these integrations. The most successful integrations will be those that make people’s lives easier and help them get things done. ...”


Man Vs. Machine: The 6 Greatest AI Challenges To Showcase The Power Of AI

Could artificial intelligence play Atari games better than humans? DeepMind Technologies took on this challenge, and in 2013 it applied its deep learning model to seven Atari 2600 games. This endeavor had to overcome a core challenge of reinforcement learning: controlling agents directly from vision and speech inputs. Breakthroughs in computer vision and speech recognition allowed the innovators at DeepMind Technologies to develop a convolutional neural network for reinforcement learning that enabled a machine to master several Atari games using only raw pixels as input, and in a few games to achieve better results than humans. Next up in our review of man versus machine are the achievements of AlphaGo, a machine that is able to learn for itself. The supercomputer absorbed the equivalent of 3,000 years of human knowledge in a mere 40 days, prompting some to claim it was “one of the greatest advances ever in artificial intelligence.” The system had already learned how to beat the world champion of Go, an ancient board game that was once thought to be impossible for a machine to decipher.
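One concrete piece of the "raw pixels as input" pipeline is easy to sketch: DQN-style preprocessing collapses each 210x160 RGB Atari frame into a small grayscale observation, conventionally 84x84. The naive nearest-neighbour resize below is an illustrative stand-in for the interpolated resize used in practice, written in plain Python so it is self-contained:

```python
def preprocess_frame(frame, out_size=84):
    """Collapse an RGB frame (list of rows of (r, g, b) tuples) to grayscale
    and crudely downsample it to out_size x out_size, mimicking the 84x84
    single-channel observation used by DQN-style agents."""
    h, w = len(frame), len(frame[0])
    # Pick which source rows/columns to sample (nearest-neighbour resize).
    ys = [round(i * (h - 1) / (out_size - 1)) for i in range(out_size)]
    xs = [round(j * (w - 1) / (out_size - 1)) for j in range(out_size)]
    return [
        [sum(frame[y][x]) / 3 / 255.0 for x in xs]  # average channels, scale to [0, 1]
        for y in ys
    ]

# A dummy 210x160 "Atari-sized" frame of mid-grey pixels.
frame = [[(128, 128, 128)] * 160 for _ in range(210)]
obs = preprocess_frame(frame)
print(len(obs), len(obs[0]))  # 84 84
```

Shrinking and stacking frames like this is what keeps the convolutional network's input small enough to train on.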


Is Enterprise Architecture Relevant To Agile?

The first important insight is that EA is valuable for determining the future of pivotal Agile projects. It provides the vision and identifies the applications and projects needed to support that vision. In EA, an application can be introduced as a black box; the Agile project then opens this black box, refining the high-level business requirements from EA into epics and user stories. Another important insight is that the focus of Agile is only the teams, not the enterprise. Scrum was designed for small teams and does not scale to the enterprise level; Dean Leffingwell designed the Scaled Agile Framework (SAFe) to address this, and enterprise architects also work within that framework. There, the enterprise architect's responsibilities include maintaining the goals and facilitating the reuse of emerging solutions, knowledge and patterns. Finally, Agile and Scrum can themselves be considered enterprise architecture: they can be expressed as principles and models, the core elements of architecture.



Quote for the day:


"The role of leadership is to transform the complex situation into small pieces and prioritize them." -- Carlos Ghosn


Daily Tech Digest - July 12, 2019

Reinforcement learning is an area of machine learning that has received lots of attention from researchers over the past decade. Benaich and Hogarth define it as being concerned with "software agents that learn goal-oriented behavior by trial and error in an environment that provides rewards or penalties in response to the agent's actions (called a "policy") towards achieving that goal." A good chunk of the progress made in RL has to do with training AI to play games, equaling or surpassing human performance. StarCraft II, Quake III Arena and Montezuma's Revenge are just some of those games. More important than the sensationalist aspect of "AI beats humans", however, are the methods through which RL may reach such outcomes: play-driven learning, simulation and real-world combination, and curiosity-driven exploration. Can we train AI by playing games? As children, we acquire complex skills and behaviors by learning and practicing diverse strategies and behaviors in a low-risk fashion, i.e., play time.
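The trial-and-error loop in that definition can be shown in miniature with tabular Q-learning: an agent in a toy environment receives rewards, updates its value estimates, and gradually settles on a goal-directed policy. Everything below (the corridor environment, the hyperparameters) is an illustrative sketch, not taken from the report:

```python
import random

# A toy "environment that provides rewards or penalties": a five-state corridor
# where the agent earns +1 for reaching the right-hand end and nothing otherwise.
N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)  # step left / step right

def step(state, action):
    nxt = min(max(state + action, 0), GOAL)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1
random.seed(0)

for _ in range(500):                              # episodes of trial and error
    s, done = 0, False
    while not done:
        # Epsilon-greedy policy: mostly exploit current estimates, sometimes explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])  # TD update
        s = s2

greedy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)]
print(greedy)  # [1, 1, 1, 1]: the learned policy heads right in every state
```

Scaling this same loop from a lookup table to a deep network over raw pixels is, at heart, what the game-playing results cited above did.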


APT Groups Make Quadruple What They Spend on Attack Tools

"The potential benefit from an attack far exceeds the cost of a starter kit," says Leigh-Anne Galloway, cybersecurity resilience lead at Positive Technologies. For groups like Silence, the profit from one attack is typically more than quadruple the cost of the attack toolset, she says. The ROI for some APT groups can be orders of magnitude higher. Positive Technologies, for instance, estimated that APT38, a profit-driven threat group with suspected backing from the North Korean government, spends more than $500,000 carrying out attacks on financial institutions but gets over $41 million in return on average. A lot of the money that APT38 spends is on tools similar to those used by groups engaged in cyber espionage campaigns. Building an effective system of protection against APTs can be expensive, Galloway says. For most organizations that have experienced an APT attack, the cost of restoring infrastructure is in many cases the main item of expenditure. "It can be much more than direct financial damage from an attack," she says.
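Using the round figures quoted above, the economics are easy to check (a back-of-the-envelope calculation, not data from the report itself):

```python
# Back-of-the-envelope ROI from the figures quoted above.
attack_cost = 500_000        # "more than $500,000" spent per APT38 campaign
attack_return = 41_000_000   # "over $41 million" recouped on average

roi_multiple = attack_return / attack_cost
print(f"roughly {roi_multiple:.0f}x return on the attack toolset")  # roughly 82x
```

Since both inputs are lower/upper bounds ("more than", "over"), the real multiple could be anywhere around that figure, but the asymmetry is stark either way.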


Smarter IoT concepts reveal creaking networks

“The internet, as we know it, is based on network architectures of the 70s and 80s, when it was designed for completely different applications,” the researchers say in their media release. The internet has centralized security, which causes choke points, and an inherent lack of dynamic controls, which translates to inflexibility in access rights — all of which make it difficult to adapt the IoT to it. Device, data, and process management must be integrated into IoT systems, say the group behind the project, called DoRIoT (Dynamische Laufzeitumgebung für Organisch (dis-)Aggregierende IoT-Prozesse), translated as Dynamic Runtime Environment for Organic dis-Aggregating IoT Processes. “In order to close this gap, concepts [will be] developed in the project that transparently realize the access to the data,” says Professor Sebastian Zug of the University of Freiberg, a partner in DoRIoT. “For the application, it should make no difference whether the specific information requirement is answered by a server or an IoT node.”


Managing Third-Party Risks: CISOs' Success Strategies

As more organizations rely on third parties for various services, managing the security risks involved is becoming a bigger challenge. Those risks, indeed, can be significant. For example, earlier this year, Indian IT outsourcing giant Wipro was targeted by hackers who in turn launched phishing attacks against its customers. Among the toughest third-party risk management challenges are: Keeping track of the long list of outsourcers an organization uses and making sure they're assessed for security; Taking steps to minimize the amount of sensitive data that's shared with vendors - and making sure that data is adequately protected; and Holding vendors to a uniform standard for security. "For most organizations, there is still a long way to go in strengthening governance when it comes to vendor management," says Jagdeep Singh, CISO at InstaRem, a Singapore-based fintech company. "We need to look at the broader risk posture that vendors bring in ... which will determine the sort of due diligence you want to carry out."


To encourage an Agile enterprise architecture, software teams must devise a method to get bottom-up input and enforce consistency. Apply tenets of continuous integration and continuous delivery all the way to planning and architecture. With a dynamic roadmap, an organization can change its planning from an annual endeavor to a practically nonstop effort. Lufthansa Systems, a software and IT service provider for the airline industry under parent company Lufthansa, devised a layered approach to push customer demand into product architecture planning. Now, the company can continuously update and improve products, said George Lewe, who manages the company's roster of Atlassian tools that underpin the multi-team collaboration. "We get much more input from the customers -- really cool ideas," Lewe said. "Some requests might not fit into our product strategy or, for technical reasons, it's not possible, but we can look at all of them." Lufthansa Systems moved its support agents, product managers and software developers onto Atlassian Jira, a project tracking tool, with a tiered concept. 


What does the death of Hadoop mean for big data?

The Hadoop software framework, which facilitated distributed storage and processing of big data using the MapReduce programming model, served these data ambitions sufficiently. The modules in Hadoop were developed for computer clusters built from commodity hardware and eventually also found use on clusters of higher-end hardware. But broader adoption of the open-source distributed storage and processing technology, inspired by work first published by Google, did not come to be, as enterprises began opting to move to the cloud and to explore AI, machine learning and deep learning as part of their big data initiatives. Worse, several big Hadoop-based solution providers that had been unprofitable for years were forced to merge to minimize losses, and one may be forced to shut down altogether. The question remains, however, whether the fate of these vendors is indicative only of the demise of Hadoop-powered solutions and other open source data platforms, or of the death of big data as a whole. Was big data merely a fad, a passing interest of industries?
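The MapReduce programming model that Hadoop popularised is easy to sketch in miniature: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce folds each group. The classic word count, as a single-process toy rather than Hadoop itself:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # map: emit a (word, 1) pair for every word in the input line
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # shuffle: group all emitted values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # reduce: fold each group of values into a single result per key
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big ambitions", "big clusters of commodity hardware"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts["big"])  # 3
```

Hadoop's contribution was running exactly this shape of computation across a cluster, with the shuffle happening over the network and the data living in HDFS.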


From Machine Learning to Machine Cognition  

Keeping logic and decisions outside the network is what has been done up to now. For decisions, we use automated systems based on software running on CPUs instead of artificial cognitive networks. While these work very well and will remain with us for a long time, they are limited. Basically, these programs perform simple iterative tasks or move controls and numbers around monitor windows using millions of lines of code. This approach may be good for games and simple narrow tasks, but not for dealing with general concepts. Such programs will not ensure enough internal connections, and they will hardly evolve into intelligence; the complexity required to emulate imagination, intuition and the like is just too high. Image recognition was developed with neural networks because it was impossible to write an iterative algorithm for it. The same should be done with cognition: decisions should use neurons, and cognition should be kept inside the network together with concepts and learning, since they share common neurons.


Open-Source Tool Lets Anyone Experiment With Cryptocurrency Blockchains

In researching blockchains, Shudo and his colleagues searched for a simulator that would help them experiment with and improve the technology. But existing simulators were too hard to use and lacked the features the team wanted. Moreover, these simulators had apparently been created for specific research and were abandoned soon after that work was completed, because many of the tools the group found were no longer being updated.  "The most recent simulator we looked at was developed in October 2016," says Shudo. "And it was no longer being maintained." So, the group developed its own simulator. Dubbed SimBlock, it runs on any personal computer supporting Java and enables users to easily change the behavior of blockchain nodes. Consequently, investigating the effects of changed node-behavior has now become a straightforward matter, says Shudo. "All the parameters of the nodes in SimBlock are written in Java," he explains. "These source files are separated from the main SimBlock Java source code, so the user simply edits [the nodes’] source code to change their behavior."


Visual Studio Code: Stepping on Visual Studio’s toes?
Microsoft describes Visual Studio as a full-featured development environment that accommodates complex workflows. Visual Studio integrates all kinds of tools in one environment, from designers, code analyzers, and debuggers to testing and deployment tools. Developers can use Visual Studio to build cloud, mobile, and desktop apps for Windows and MacOS.  Microsoft describes Visual Studio Code, on the other hand, as a streamlined code editor, with just the tools needed for a quick code-build-debug cycle. The cross-platform editor complements a developer’s existing tool chain, and is leveraged for web and cloud applications. But while Microsoft views the two tools as complementary, developers have been raising questions about redundancy for years. Responses to a query in Stack Overflow, made four years ago, sum up the differences this way: Visual Studio Code is “cross-platform,” “file oriented,” “extensible,” and “fast,” whereas Visual Studio is “full-featured,” “project and solution oriented,” “convenient,” and “not fast.”


Attacks against AI systems are a growing concern


The continuing game of “cat and mouse” between attackers and defenders will reach a whole new level when both sides are using AI, said Hypponen, and defenders will have to adapt quickly as soon as they see the first AI-enabled attacks emerging. But despite the claims of some security suppliers, Hypponen told Computer Weekly in a recent interview that no criminal groups appear to be using AI to conduct cyber attacks. The Sherpa study therefore focuses on how malicious actors can abuse AI, machine learning and smart information systems. The researchers identify a variety of potentially malicious uses for AI that are already within attackers’ reach, including the creation of sophisticated disinformation and social engineering campaigns. Although the research found no definitive proof that malicious actors are currently using AI to power cyber attacks, as indicated by Hypponen, the researchers highlighted that adversaries are already attacking and manipulating existing AI systems used by search engines, social media companies, recommendation websites, and more.



Quote for the day:


"Leadership is a matter of having people look at you and gain confidence, seeing how you react. If you're in control, they're in control." -- Tom Landry


Daily Tech Digest - July 11, 2019

How IoT is reshaping network design

In a world of always-on ubiquitous connectivity, latency and reliability loom over everything, whether you’re talking about self-driving cars or Industry 4.0. These two challenges are driving much of the change that we’ll see in network design over the next few years. If the industry is to realize the promised benefits of IoT, we must increase the ability to support more machine-to-machine communications in near-real time. In applications like autonomous vehicles, latency requirements are on the order of a couple of milliseconds. GSMA, the international association for mobile technology, has specified that 5G's latency should be 1 millisecond, which is 50 times better than 4G's current 50 milliseconds. Satisfying these requirements involves a radical rethink about how and where we deploy assets throughout the network. For example, routing and backing up data using a traditional star-type network design will become increasingly unfeasible. The vast amount of traffic and the latency demands would easily overwhelm a north-south data flow.



Cyber security will always be an issue, “until we get rid of passwords” — Frank Abagnale Jr

The password is insecure: a hacker could log into an individual’s bank account and the victim wouldn’t even know. This is the first issue; passwords are easily lost and even more easily stolen, via phishing or malware attacks. Once a cybercriminal has access to the password, they can replay it over and over again. “Unfortunately, because passwords are free and easy, no one gave design much thinking,” said Mr Eisen. “But, now the cost of passwords is obvious” — they’re the great security vulnerability and largely responsible for the data breaches that pepper news headlines. Historically, security and user experience have been at odds with each other, because everyone believed that making systems less user friendly (longer, more complex passwords, for example) made them more secure — this is a fallacy and hinders adoption rates, making systems, ironically, less secure. “This is not a computer-to-computer interaction with longer keys. These are humans we’re talking about,” continued Mr Eisen.
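The replay problem is exactly what challenge-response authentication is designed to kill: the server issues a fresh nonce for every attempt, so a captured answer is useless later. A minimal sketch using an HMAC over the nonce (the shared-key setup here is illustrative, not any particular vendor's scheme):

```python
import hashlib
import hmac
import secrets

SHARED_KEY = b"per-device secret"   # provisioned once, never sent over the wire

def server_challenge():
    # A fresh random nonce per login attempt.
    return secrets.token_bytes(16)

def client_response(key, challenge):
    # The client proves knowledge of the key without revealing it.
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def server_verify(key, challenge, response):
    expected = hmac.new(key, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)  # constant-time comparison

c1 = server_challenge()
r1 = client_response(SHARED_KEY, c1)
print(server_verify(SHARED_KEY, c1, r1))   # True: fresh response accepted

c2 = server_challenge()
print(server_verify(SHARED_KEY, c2, r1))   # False: replaying r1 fails the new challenge
```

A static password, by contrast, is the same string on every attempt, which is precisely what makes a stolen one replayable "over and over again".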


Logitech wireless USB dongles vulnerable to new hijacking flaws

The vulnerabilities allow attackers to sniff on keyboard traffic, but also inject keystrokes (even into dongles not connected to a wireless keyboard) and take over the computer to which a dongle has been connected. When encryption is used to protect the connection between the dongle and its paired device, the vulnerabilities also allow attackers to recover the encryption key. Furthermore, if the USB dongle uses a "key blacklist" to prevent the paired device from injecting keystrokes, the vulnerabilities allow the bypassing of this security protection system. Marcus Mengs, the researcher who discovered these vulnerabilities, said he notified Logitech about his findings, and the vendor plans to patch some of the reported issues, but not all. According to Mengs, the vulnerabilities impact all Logitech USB dongles that use the company's proprietary "Unifying" 2.4 GHz radio technology to communicate with wireless devices.



Financial Firms Face Threats from Employee Mobile Devices

Instead of malware, criminals are using phishing attacks to gain access to financial services networks, but not just any attacks. "We're seeing more targeted attacks within financial services instead of kind of the scattershot approach where you send out a phishing attack to everybody in the organization," he explains. The success of phishing attacks on mobile devices in financial services may be part of a larger pattern of risky mobile behavior by those in the industry. According to the report, 42% of the organizations represented had devices with "side-loaded" apps — apps downloaded and installed from sites other than the app stores approved for the device. Covington says, "You start to see the implications of letting employees manage their own device." And those employees are managing their devices in tremendous numbers, he says. Employee-owned devices, used to conduct company business, are targets because of the sensitive data they contain. "There's no doubt in my mind that the criminal side of the equation is after rich data," he says.


Digital skills — key to driving UK prosperity post-Brexit, according to Salesforce

The data from the Salesforce report highlights concerns of a potential shortage of tech skills post-Brexit, with over half of business leaders believing the UK is at risk of a tech brain drain. To address this, businesses are now recognising the pivotal role they must play in nurturing tech talent and digital skills in the country. One in four business leaders feel responsibility for doing so lies mainly with private enterprise; over half (55%) plan to invest more in developing their own tech talent, with the same number pledging to address the skills gap by re-skilling older generations; and 51% intend to do more to re-skill people from disadvantaged backgrounds. “There are issues that business needs to lead on regardless of what’s happening in the world of politics,” said Paul Smith, EVP and GM, Salesforce UK. “The economy is changing as new technologies emerge.”


The Bank of Amazon: How big tech is disrupting banking


Big tech companies have already begun to embark on financial ventures, with payment platforms such as Google’s Google Wallet and Google payments, Amazon lending to SME marketplace sellers, Facebook’s partnership with Clear Bank on a product called Charged, a programme that allows financing for advertising, and Apple’s credit card, launched last year with Goldman Sachs and Marcus in the US. ... Big tech is in the position of having a significant “data advantage” over banks or fintechs, with the ability to glean more information about their users than others could hope to achieve. With the tech resources to offer an improved user experience and services that are integrated into their existing platforms, a grasp of artificial intelligence that traditional banks are only just beginning to deploy, sophisticated cloud computing, and an already loyal user base, up to 40% of the revenue currently generated by the US financial industry could move over to Big Tech, according to McKinsey.


Restoring Vision With Bionic Eyes: No Longer Science Fiction


"Brain-computer interfaces" can be used both for treating neurological and mental disorders as well as for understanding brain function, and now engineers have developed ways to manipulate these neural circuits with electrical currents, light, ultrasound, and magnetic fields. Remarkably, we can make a finger, arm, or even a leg move just by activating the right neurons in the motor cortex. Similarly, we can activate neurons in the visual cortex to make people see flashes of light. The former allows us to treat neurological conditions such as Parkinson's disease and epilepsy, whereas the latter should eventually allow us to restore vision to the blind. ... We have a real opportunity here to tap into the existing neural circuitry of the blind and augment their visual senses much like Google Glass or the Microsoft HoloLens. For example, make things appear brighter the closer they get, use computer vision to mark safe paths and combine it with GPS to give visual directions, warn users of impending dangers in their immediate surroundings, or even extend the range of "visible" light with the use of an infrared sensor. 


The Potential of AI for Utilities

Utility officials analyzing data
One of the biggest confusion factors is all the different terms that are used as synonyms for AI such as machine learning, deep learning, cognitive computing, etc. The list grows daily. Keep in mind, these terms are not interchangeable, but they are often used that way. That doesn’t help anyone trying to figure out AI or how to use it. First of all, AI is a division of computer science using complex instruction sets to perform what appears to be human-like intelligence. These programs are powered by algorithms, and that is the ingredient causing the mystique. Without going into a lot of detail, an algorithm is a set of step-by-step computer instructions that can use data to build models that make predictions based on the data. Remember, we are a long way off from the thinking, talking robots seen in movies and on television. Algorithms are how AI demonstrates being smart, but be aware it’s not intelligent, which is the critical distinction. This type of AI is referred to as Narrow AI or Applied AI. It is said to simulate human thought, but each application can only carry out one specific task with a limited range of functions.
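The definition above, step-by-step instructions that use data to build a model that makes predictions, can be made concrete with the smallest possible example: fitting a line to a handful of readings by ordinary least squares and using it to predict the next one. The data points are hypothetical, and no library is needed:

```python
# Fit y = a*x + b by ordinary least squares: instructions that use data
# to build a model, which then makes a prediction.
xs = [1, 2, 3, 4, 5]                 # e.g. successive hourly readings (hypothetical)
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)        # slope
b = mean_y - a * mean_x                       # intercept

def predict(x):
    return a * x + b

print(round(predict(6), 1))  # extrapolate to the next point: 12.0
```

Swap the closed-form line fit for an iteratively trained network and the "model" becomes opaque, but the shape of the process, data in, fitted parameters, prediction out, is the same one the article is describing.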


RiskIQ uncovers new Magecart campaign


This attack introduces yet another method by Magecart that RiskIQ researchers call a “spray and pray” approach. Because skimmers work only when placed on payment or checkout pages, most Magecart attacks target specific e-commerce sites and attempt to drop a skimmer only on pages with payment forms. However, the ease of compromise that comes from finding S3 buckets misconfigured to allow public access means that even if only a fraction of their skimmer injections return payment data, it will yield a substantial return on investment, the researchers said. “This is a brand new twist on Magecart,” said Yonathan Klijnsma, head threat researcher at RiskIQ. “Although this group chose reach over targeting, they likely ended up getting their skimmer on enough payment pages to make their attack lucrative. They have done their cost-benefit analysis.” The scale of this latest attack illustrates how easy it is for threat actors of any kind to compromise a vast quantity of websites at once with scripts stored in misconfigured S3 buckets.
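The misconfiguration being exploited is mechanical to check for. A hedged sketch that scans a bucket ACL, in the JSON shape returned by S3's GetBucketAcl API, for grants that let anyone write, which is what allows a skimmer script to be dropped in. The sample ACL below is invented; only the AllUsers group URI is the real AWS identifier:

```python
# URI AWS uses to denote "anyone on the internet" in S3 ACL grants.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"
WRITE_PERMS = {"WRITE", "FULL_CONTROL"}

def public_write_grants(acl):
    """Return the ACL grants that give the AllUsers group write access."""
    return [
        g for g in acl.get("Grants", [])
        if g.get("Grantee", {}).get("URI") == ALL_USERS
        and g.get("Permission") in WRITE_PERMS
    ]

# Example ACL for a hypothetical bucket with one world-writable grant.
acl = {
    "Grants": [
        {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "READ"},
        {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "WRITE"},
    ]
}
print(len(public_write_grants(acl)))  # 1 world-writable grant found
```

In a real audit the ACL dict would come from the S3 API per bucket; the point is that the "spray and pray" attack surface is exactly the set of buckets where this list is non-empty.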


Stream Processing Anomaly Detection Using Yurita Framework

Working at PayPal on a next-generation stream processing platform, we started to notice that many of our users wanted to use stream processing to apply anomaly detection models in real time. After exploring different architectures to create a flexible, production-grade framework that can scale to real-world workloads, we eventually decided to go with a pipeline-based API, inspired by other open source projects like scikit-learn and Spark MLlib. This work has led to the development of Yurita, an open source anomaly detection framework for stream processing. Yurita is based on the new Spark Structured Streaming framework and utilizes its processing engine capabilities to achieve highly scalable, performant execution. The name Yurita comes from a traditional Japanese gold panning tool. ... Without knowing what the normal behavior of a metric is, we would be able to use only simple anomaly detection techniques, like rule-based decisions, which also require a deep understanding of each specific dataset and therefore are not scalable from a productivity point of view.
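To make the contrast concrete: a baseline-aware detector learns what "normal" looks like from recent data instead of relying on hand-tuned, per-dataset rules. This is a minimal, framework-free sketch of the idea (not Yurita's API; the class name, window size, and threshold are our assumptions for illustration):

```python
from collections import deque
from statistics import mean, stdev

class WindowedAnomalyDetector:
    """Keep a sliding window of recent values; flag points far from the mean."""

    def __init__(self, window=50, threshold=3.0):
        self.values = deque(maxlen=window)  # the learned "normal" baseline
        self.threshold = threshold          # z-score cutoff

    def observe(self, x):
        """Return True if x is anomalous relative to the recent window."""
        anomalous = False
        if len(self.values) >= 10:  # need enough history to estimate a baseline
            mu, sigma = mean(self.values), stdev(self.values)
            anomalous = sigma > 0 and abs(x - mu) / sigma > self.threshold
        self.values.append(x)
        return anomalous
```

Unlike a fixed rule ("alert if value > 100"), the same detector can run unchanged over any metric, because the baseline is learned from the stream itself.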



Quote for the day:


"All organizations are perfectly designed to get the results they are now getting. If we want different results, we must change the way we do things." -- Tom Northup


Daily Tech Digest - July 10, 2019

One reason why edge computing defies easy definition is that it takes many different forms. As Jaromir Coufal, principal product manager at Red Hat, recently pointed out to me, there is no single edge. Instead, there are lots of edges – depending on what compute features are needed. He suggests that we can think of the edge as something of a continuum of capabilities with the problem being resolved determining where along that particular continuum any edge solution will rest. ... Done properly, edge computing can provide services that are both faster and more reliable. Applications running on the edge can be more resilient and run considerably faster because their required data resources are local. In addition, data can be processed or analyzed locally, often requiring only periodic transfer of results to central sites. While physical security might be lower at the edge, edge devices often implement security features that allow them to detect 1) manipulation of the device, 2) malicious software, and 3) a physical breach and wipe data.



Deep Learning for Computer Vision: A Beginner's Guide

What distinguishes deep learning is that its networks contain many hidden layers. This extra complexity empowers machines to learn from unstructured, unlabeled data as well as labeled and categorized data. Note that none of these concepts are particularly new; rapid advances in computing power and technology enable the models to be fed with large volumes of data. The more data available, the more proficient the models become at learning tasks. Speech recognition, image recognition, natural language processing (NLP), and computer vision are some of the areas deep learning has improved dramatically. Many technology companies now specialize in providing platforms for training deep learning models in computer vision and other areas. Such companies have also facilitated further innovation in these artificial intelligence branches. ... The most exciting potential use for this computer vision function is real-time semantic segmentation used by self-driving cars. Identifying and localizing objects accurately can improve the safety and reliability of autonomous vehicles.
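As a rough illustration of what "many hidden layers" means, here is a bare forward pass through a stack of dense layers (a sketch only: the sizes are arbitrary, the weights are random rather than trained, and real vision models use convolutional layers rather than these dense ones):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def forward(x, weights):
    """Pass inputs through several hidden layers; the depth is what makes it 'deep'."""
    h = x
    for W in weights[:-1]:
        h = relu(h @ W)      # each hidden layer computes a richer representation
    return h @ weights[-1]   # linear output layer

# Two hidden layers of 16 units mapping a 4-feature input to 1 output.
weights = [rng.normal(size=s) for s in [(4, 16), (16, 16), (16, 1)]]
y = forward(rng.normal(size=(8, 4)), weights)  # 8 samples in, 8 predictions out
```

Training would adjust those weight matrices from data; the point here is only the layered structure.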


How ASEAN firms are turning data into critical assets


Rouam said SGX is now looking to grow that figure by providing new data services, such as data visualisation, to external customers, effectively extending the value of its data beyond internal use. “We have data teams that use data for internal purposes, such as analysing customer behaviour as well as meeting regulatory and operational requirements,” said Rouam. “To turn data into a critical asset, we did a lot of work on the technology and legal processes.” This includes building a centralised database with in-memory capabilities to ensure timely access to machine-readable data for data scientists. SGX has also built a logical layer to help business users understand the data, along with a data dictionary and business glossary. At SP Digital, the digital services subsidiary of Singapore utility provider SP Group, data has always been critical to its operations, enabling it to control and manage the country’s critical energy infrastructure. About three years ago, the company embarked on a digital transformation initiative, which, among other areas, includes building a data lake to house all its data, according to Chang Sau Sheong, CEO of SP Digital.



Applied AI in Software Development

When it comes to validating the property of an object present on the UI, TestComplete provides multiple options to access the object’s properties. For example, a button object can have properties like enabled or disabled, text, coordinates, Id, class, etc. Hence, it becomes easy to identify an object based on these properties and confirm the expected behavior. However, reading content from images or chart-like graphical interfaces, which are becoming more common with the proliferation of business intelligence and data-driven dashboards, is difficult, as is validating them or performing automated actions on them. This has changed with the latest version of TestComplete, version 12.60, which overcomes the issue by making use of an API-driven optical character recognition (OCR) service. There are certain prerequisites to use this option. First, one needs to enable the OCR plugin by installing the extension under File –> Install Extension.


Caitlin Long, Blockchain’s Ambassador of Hope

Caitlin Long, Blockchain’s Ambassador of Hope: Notes from a RegTech conference keynote
What became obvious is that humans, no matter what industry or background, are inherently lazy. When it comes to understanding what’s going on in Wyoming and the fast-tracked 13 bills that were passed (thanks to the leadership and vision of Caitlin Long from 2018 to 2019), humans are lazy and crave convenience. When it comes to ensuring that, as emerging technologies arrive and society evolves, our policies and laws are “backward compatible,” humans are lazy. When it comes to educating policymakers, and, vice versa, to policymakers truly understanding the purpose and potential of new technology, humans again seek convenience. Complacency is not good enough anymore. Laziness is no longer an excuse. All of us have to step up to the plate, engage, and work in partnership with policymakers and technologists to thoughtfully craft legislation that 1) keeps consumers, the “main street moms and pops,” safe, and 2) keeps the nefarious “bad guys” at bay.


Shocked by your cloud provider bills? Here’s what to do about it

One of the most common reasons businesses see their cloud costs spiral out of control is that they moved to the cloud without a plan or strategy. It’s easy to buy and consume cloud, so if you don’t have a strategy governing your company’s cloud use, it’s really easy to buy and consume more than you meant to. If you don’t have a policy for cloud use, put one together now. This will help you manage the number of cloud platforms selected as well as the costs by making sure that using the cloud is an active decision rather than something your organization does by default. ... Because of the cloud’s ability to scale more or less infinitely, the problem for users is a bit like what you’d get in the summer during a heatwave: you blast the A/C for a week because you need the relief, but you’re in for a shock when the bill comes – and what can you do? You already used all that sweet, cool air. With the cloud, you can head this particular problem off by setting limits ahead of time with your cloud provider.


The direction of business intelligence is changing to forward


Data analysis has traditionally been a relatively straightforward process: Input data, generate a report, and analyze the report to glean insight. But business intelligence is changing. Traditional data analysis is looking backward. It's attempting to figure out why something happened. It's not revealing what will happen. The look of the reports has changed significantly from dot-matrix spreadsheets to eye-popping computer graphics, thanks to Tableau, Microsoft Power BI and other data visualization platforms. But they're still reactive. That won't be good enough for the next generation of BI platforms. "In the next three to five years, instead of asking questions of data, the data will start suggesting observations," said Tim Crawford, CIO strategic adviser at AVOA, a consulting firm in Rolling Hills Estates, Calif. "You'll see things that you hadn't even thought to ask in the first place. As AI comes into play, you can expect to see tools that will identify and highlight things you hadn't thought to ask in the first place."


The Importance of QA Testing for Software Development

If you only rely on internal testing by the same people who developed the software, then they may praise their own work and be reluctant to make changes. Having testers come from different backgrounds and cultures adds diversity to the testing. This is particularly important if you plan on launching your software, service or product worldwide. This is also a reason why companies should do layered QA testing at different stages of design or development. Early testing helps prevent costly mistakes and wasteful development for features that users will not want or care about. As the product develops, further testing and documentation help guide the process in the right direction: one that will satisfy market needs and consumers.  Therefore, QA testing is not just done to eliminate bugs in the end, but to make sure the correct procedures are in place. Rather than finding defects, it deals with preventing them throughout the development process.


Will IBM’s acquisition be the end of Red Hat?

The good news is that this merger of IBM and Red Hat appears to offer each of the companies some significant benefits. IBM makes a strong move into cloud computing, and Red Hat gains a broader international footing. The other good news relates to the pace at which this acquisition occurred. Initially announced on October 28, 2018, it is now more than eight months later. It’s clear that the leadership of each company has not rushed headlong into this new relationship. Both parties to the acquisition appear to be moving ahead with trust and optimism. IBM promises to ensure Red Hat's independence and will allow it to continue to be "Red Hat" both in name and business activity. ... Will this acquisition be the end of Red Hat? That outcome is not impossible, but it seems extremely unlikely. For one thing, both companies stand to gain significantly from the other’s strong points. IBM is likely to be revitalized in ways that allow it to be more successful, and Red Hat is starting from a very strong position. While it’s a huge gamble by some measurements, I think most of us Linux enthusiasts are cautiously optimistic at worst.


Multimodal Sentiment Analysis: Addressing Key Issues and Setting Up the Baselines

The primary advantage of analyzing videos over mere text analysis, for detecting emotions and sentiment, is the surplus of behavioral cues. Videos provide multimodal data in terms of vocal and visual modalities. The vocal modulations and facial expressions in the visual data, along with text data, provide important cues to better identify true affective states of the opinion holder. Thus, a combination of text and video data helps to create a better emotion and sentiment analysis model. Recently, a number of approaches to multimodal sentiment analysis producing interesting results have been proposed [11, 13]. However, there are major issues that remain mostly unaddressed in this field, such as the consideration of the context in classification, effect of speaker-inclusive and speaker-exclusive scenario, the impact of each modality across datasets, and generalization ability of a multimodal sentiment classifier. Not tackling these issues has presented difficulties in the effective comparison of different multimodal sentiment analysis methods.



Quote for the day:


"When you accept a leadership role, you take on extra responsibility for your actions toward others." -- Kelley Armstrong


Daily Tech Digest - July 09, 2019

Colocation facilities buck the cloud-data-center trend

Poole said the average capital expenditure for a stand-alone enterprise data center that is not a part of the corporate campus is $9 million. Companies are increasingly realizing that it makes sense to buy the racks of hardware but place them in someone else’s secure facility that handles the power and cooling. “It’s the same argument for doing cloud computing but at the physical-infrastructure level,” he said. Mike Satter, vice president for OceanTech, a data-center-decommissioning service provider, says enterprises should absolutely outsource data-center construction or go the colo route. Just as there are contractors who specialize in building houses, there are experts who specialize in data-center design, he said. He added that with many data-center closures there is subsequent consolidation. “For every decommissioning we do, that same company is adding to another environment somewhere else. With the new hardware out there now, the servers can do the same work in 20 racks as they did in 80 racks five years ago. That means a reduced footprint and energy cost,” he said.


The Phantom Menace in Unit Testing

This is not a rant about unit testing; unit tests are critically important elements of a robust and healthy software implementation. Instead, it is a cautionary tale about a small class of unit tests that may deceive you by seeming to provide test coverage but failing to do so. I call this class of unit tests phantom tests because they return what are, in fact, correct results but not necessarily because the system-under-test (SUT) is doing the right thing or, indeed, doing anything. In these cases, the SUT “naturally” returns the expected value, so doing (a) the correct thing, (b) something unrelated, or even (c) nothing, would still yield a passing test. If the SUT is doing (b) or (c), then it follows that the test is adding no value. Moreover, I submit that the presence of such tests is often deleterious, making you worse off than not having them because you think you have coverage when you do not. When you then go to make a change to the SUT supposedly covered by that test, and the test still passes, you might blissfully conclude that your change did not introduce any bugs to the code, so you go on your merry way to your next task. 
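A minimal example of a phantom test (ours, not the author's; the names are illustrative): the SUT below does nothing, yet the test passes because the expected value is what the object "naturally" returns anyway.

```python
class Cart:
    def __init__(self):
        self.items = []

    def apply_discount(self, code):
        # Bug: the discount logic was never implemented; the method does nothing.
        pass

    def total(self):
        return sum(self.items)

def test_discount_total():
    # Phantom test: the cart is empty, so total() is 0 whether apply_discount()
    # does the right thing (a), something unrelated (b), or nothing at all (c).
    cart = Cart()
    cart.apply_discount("SAVE10")
    assert cart.total() == 0  # passes, but exercises none of the discount logic
```

A non-phantom version would seed the cart with items and assert that the total actually changes after the discount is applied, so that a do-nothing SUT would fail the test.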


British Airways facing £183m GDPR fine


The ICO said its investigation found that a variety of information was compromised by poor security arrangements at the company, including login, payment card and travel booking details, as well as name and address information. Information Commissioner Elizabeth Denham said: “People’s personal data is just that – personal. When an organisation fails to protect it from loss, damage or theft it is more than an inconvenience. That’s why the law is clear – when you are entrusted with personal data you must look after it. Those that don’t will face scrutiny from my office to check they have taken appropriate steps to protect fundamental privacy rights.” The ICO said BA has cooperated with the investigation and has made improvements to its security arrangements since the breach came to light. The company now has 28 days to make representations to the ICO about the findings of its investigation and the proposed fine. Willie Walsh, chief executive of BA owner International Airlines Group, has confirmed that the airline will make representations to the ICO, according to Reuters. “We intend to take all appropriate steps to defend the airline’s position vigorously, including making any necessary appeals,” he said.



How artificial intelligence can transform the legal sector


Firms are experimenting with the use of chatbots technology to deliver basic legal advice. DoNotPay has already garnered a lot of attention for allowing users to appeal parking fines and it’s not unreasonable to expect that as the technology becomes more sophisticated, higher quality and more specific advice could be offered by a similar machine learning (ML) tool. As with all new technologies, the ultimate aim is to ensure that firms offer a higher quality and more consistent service. It is important to note that AI is not positioned to outperform the high-end tasks performed by legal professionals, but should rather be seen as a tool designed to support them by carrying out time-consuming research or administrative tasks. That’s why now is the time for forward-thinking firms to begin integrating AI into their legal services as well as their administrative procedures. The use of AI does however give rise to a few practical and ethical considerations that legal teams must be aware of. Many revolve around the sensitive data that firms would be required to store on clients in order to offer an optimal service.


Think like a criminal to beat them at their own game ⁠— Frank Abagnale Jr

Crime today, of course, has a significant physical element. However, over the last 20 years there has been a criminal movement towards the digital. Cybersecurity Ventures predicts cybercrime damages will cost the world $6 trillion annually by 2021, up from $3 trillion in 2015. The attack surface area is now different, but “the one thing that never changes is that criminals are all the same,” said Mr Abagnale. “So, if you think like a criminal, it doesn’t matter what they do, you can figure out their motives and means.” Over the course of a 43-year career in the FBI, Mr Abagnale has worked on every single data breach, including TJX in 2007, and more recently, the Marriott and Facebook breaches. “The one thing that I’ve learnt is that every breach occurs because somebody in that company did something that they weren’t supposed to do, or somebody in that company failed to do something they were supposed to do,” he said. “It always comes down to the human element,” reiterated Mr Abagnale.


IAM market evolves, but at a cost

"We're in this spot with a lot of technical debt," Daum said, adding that State Auto is a G Suite customer and is in the cloud with AWS, but is hesitant to add on another vendor just for identity management. "We're paying a lot of money to a lot of different companies and we're trying to find a way to see which of those companies can be used for identity services. No offense to Ping Identity or Okta, but why pay them however much money if we can limit the amount of cooks in the kitchen." Emerging capabilities within IAM products intrigued Daum, but never made the ROI case. "Where's the value added?" Daum said. "Everyone is talking about cloud and password-less and zero trust. Those buzzwords sound nice, but the cost to implement is still huge." Zero trust is a security architecture introduced by Forrester Research that is designed to assess threats not just from outside the network, but from within it. It uses the principle "never trust, always verify" for anything trying to connect to the network to ensure it remains secure.


Don’t wait up for the open cloud

Open clouds have been a concept since cloud computing became a thing; the reality is that we’re dealing with public companies that have to return an investment to shareholders. They operate based on gaining profitable advantages and working within their own market microcosms. They court users in their own way, pushing their own cloud services, which leads to having workloads that are not easily transported from cloud to cloud. Indeed, if the objective is “cloud native,” by definition that's going to mean lock-in. A few open cloud standards have been pushed in the past, and currently as well. Although they found traction as private clouds, with some public cloud instances as well, private clouds have declined relative to public clouds, and the public cloud instances shut down. It’s just too hard to keep up with the larger public cloud players and their billion-dollar R&D and marketing budgets. This leads me to a few conclusions about the state of cloud computing now, as well as some projections of where things are likely to go: The notion of interoperable public clouds is not likely to happen unless the user bases demand it and the public cloud providers feel the pinch.


Dos and don'ts of navigating data analytics in the cloud

The marketing hype positioning the cloud as an “easy button” can draw you in, but the reality is that moving an enterprise data warehouse or another type of analytical environment to the cloud is just like moving from one database platform to another – and it comes with the same challenges. You and your team need to be ready to migrate, monitor and test the new environment, and when you are migrating systems that have developed over time, “lifting-and-shifting” does not come without running into technology issues or making functional decisions that impact how a business or application is run. It’s true that with the cloud, you never have to complete low-level administration of your environment such as software updates and server sizing. However, higher-level administration such as database performance, usage analysis, cost management, and security and privacy management will always be a requirement.


Must-have features in a modern network security architecture

As the old security adage goes, “the network doesn’t lie.” Since all cyber attacks use network communications as part of their kill chain, security analysts must have access to end-to-end network traffic analysis (NTA) up and down all layers of the OSI stack. The best NTA tools will supplement basic traffic monitoring with detection rules, heuristics, scripting languages, and machine learning that can help analysts detect unknown threats and map malicious activities into the MITRE ATT&CK framework. ... Network security technologies must support granular policies and rules, subject to immediate alteration based upon changes in things such as user location, network configuration, or newly discovered threats/vulnerabilities. Organizations must have the ability to spin up/spin down or change network security services whenever and wherever they are needed. Modern network security controls must be able to accommodate internet of things (IoT) devices and protocols with the same types of strong policies and enforcement as they offer for standard operating systems. Finally, network security architectures must be built around easily accessed APIs for rapid integration.


A Simplified Value Stream Map for Uncovering Waste

There are a number of ways to display waste in a system. The most common approach is probably the use of value stream maps. These are maps that show the journey of a product from raw material to finished goods delivered to customers. They are very helpful in understanding the flow of goods and pinpointing wasteful delays. These don’t always seem relevant to software engineering because the images of factories, trucks, and forklifts don’t apply. Even the versions developed specifically for software sometimes seem to lack the qualities of being simple and definitive. What if we just want to know one thing: for any given process, how much time is spent waiting versus working? This would give us a simplified view of waste for any process and would be helpful in making it more efficient. The details for constructing this are straightforward. Let’s define working as time spent actively creating a product, time for which customers would gladly pay. Let’s define waiting as time spent waiting on something, time for which customers would not want to pay. We use duration (not effort) for both and we maintain consistent time units between them.
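The working-versus-waiting split described above reduces to a single flow-efficiency number per process. A trivial sketch (our helper, not part of any value-stream tool; remember the article's rule that both inputs are durations, not effort, in the same time unit):

```python
def flow_efficiency(working, waiting):
    """Fraction of total elapsed time spent actively creating value.

    Both arguments are durations (not effort) in the same time unit,
    per the definitions of 'working' and 'waiting' above.
    """
    total = working + waiting
    return working / total if total else 0.0

# e.g. a feature took 4 days of active work but sat waiting for 36 days:
efficiency = flow_efficiency(4, 36)  # 0.1, i.e. 90% of elapsed time was waste
```

Computing this one ratio per process step gives the simplified view of waste the article asks for, without drawing a full value stream map.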



Quote for the day:


"The one nearest to the enemy is the real leader." -- Ugandan Proverb