Daily Tech Digest - November 02, 2019

Creating an agile mind-set at PepsiCo


Employees are expected to continuously learn new skills. They are expected to question practices and reduce or eliminate habits that are no longer useful. Time is allotted each week for every employee to “upgrade” themselves, and a large catalog of training materials and classes is available. Underpinning this effort is a belief that time is the most democratic and precious resource and that people can make much better use of it to be more productive at work and have more time outside of work. That is why the company has found ways to give employees back some time to innovate and better serve customers during working hours. Then, with work at the office streamlined and a culture that encourages disconnecting from the workplace during off-hours, employees no longer feel like they have to take time away from family during evenings and vacations to address work issues. Employees are encouraged to look carefully at how they spend their time in the office.



Tim Cook thinks Apple customers are rich and very sensitive

Now, though, there appears to be a new divide. Those who have one pair of AirPods and those who have two. For particular occasions, that is. This week's Apple earnings call happened to coincide with the launch of the AirPods Pro -- elevated, noise-canceling versions of Apple's earrings-gone-wrong buds. CEO Tim Cook was moved to discuss these new apparitions and who would buy them. He offered: "We're anxious to see the customers for the new AirPod Pro. But I would guess that one, particularly in the early going, will be people that have AirPods today and want to also have a pair for the times they need noise cancellation." Please forgive me if I'm anxious to immerse myself in a vat of cooling coconut balm and hum my calming meditations. Apple's CEO believes his customers are so wealthy and so very sensitive that they will take time to consider: "Hmm, is this a moment when I want to shut the world out? Or would my central nervous system prefer to hear a few tinges of intonation from the world outside?"


What goes into a user story vs. use case for Agile development?


A user story provides a short descriptive sentence that outlines the who, what and why of one or a set of software requirements. User stories put context around interactions, which enables developers to focus their efforts on perspectives, features, functionality and results. ... User stories are not ideal for every software development discussion. While user stories are quick and simple, they are often devoid of technical detail; that leaves developers with no discussion of how to accomplish a task. There is no assessment of relative difficulty, accounting for resources like developer hours, or prioritization of one user story vs. another. Project managers often make these assessments during the planning phase of each iteration. ... Use cases generally provide more detail and a deeper understanding of functional behaviors to contextualize a software requirement. Use cases help development teams define or discuss user interface designs, database access or query processes, and API communications. Group use cases together to organize them for complex projects.
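The contrast can be sketched as two small data structures (the names and fields here are invented for illustration, not taken from the article): a user story is a single who/what/why sentence, while a use case carries an actor, flows, and conditions.

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """The who/what/why of a requirement in one sentence."""
    role: str
    goal: str
    benefit: str

    def __str__(self):
        return f"As a {self.role}, I want {self.goal}, so that {self.benefit}."

@dataclass
class UseCase:
    """A deeper functional description of the same requirement."""
    name: str
    actor: str
    preconditions: list = field(default_factory=list)
    main_flow: list = field(default_factory=list)
    postconditions: list = field(default_factory=list)

story = UserStory("shopper", "to save items to a wishlist", "I can buy them later")
case = UseCase(
    name="Save item to wishlist",
    actor="Shopper",
    preconditions=["Shopper is signed in"],
    main_flow=["Shopper taps 'Save'",
               "System writes the item to the wishlist table",
               "System confirms with a notification"],
    postconditions=["Item appears in the wishlist"],
)
print(story)
```

Note how the story fits in one line while the use case spells out the interface and data-access steps the developers will discuss.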


Google CEO Sundar Pichai on achieving quantum supremacy

You would need to build a fault-tolerant quantum computer with more qubits so that you can generalize it better, execute it for longer periods of time, and hence be able to run more complex algorithms. But you know, if in any field you have a breakthrough, you start somewhere. To borrow an analogy—the Wright brothers. The first plane flew only for 12 seconds, and so there is no practical application of that. But it showed the possibility that a plane could fly. ... Google wouldn’t be here today if it weren’t for the evolution we have seen in computing over the years. Moore’s Law has allowed us to scale up our computational capacity to serve billions of users across many products at scale. So at heart, we view ourselves as a deep computer science company. Moore’s Law is, depending on how you think about it, at the end of its cycle. Quantum computing is one of the many components by which we will continue to make progress in computing. The other reason we’re excited is—take a simple molecule. Caffeine has 243 states or something like that.


Wireless noise protocol can extend IoT range

The on-off noise power communication (ONPC) protocol, as it’s called, works via a software hack on commodity Wi-Fi access points. Through software, part of the transmitter is converted to an RF power source, and then elements in the receiver are turned into a power measuring device. Noise energy, created by the power source is encoded, emitted and picked up by the measuring setup at the other end. “If the access point, [or] router hears this code, it says, ‘OK, I know the sensor is still alive and trying to reach me, it’s just out of range,’” Neal Patwari of Washington University says in a Brigham Young University (BYU) press release. “It’s basically sending one bit of information that says it’s alive.” The noise channel is much leaner than the Wi-Fi one, BYU explains. “While Wi-Fi requires speeds of at least one megabit per second to maintain a signal, ONPC can maintain a signal on as low as one bit per second—one millionth of the data speed required by Wi-Fi.” That’s enough for IoT sensor housekeeping, conceivably. Additionally, “one bit of information is sufficient for many Wi-Fi enabled devices that simply need an on [and] off message,” the school says.
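The signaling idea can be illustrated with a toy on-off keying simulation (a simplified sketch of the general technique, not the actual ONPC implementation): the transmitter raises the received power above the ambient noise floor for a 1 bit, and the receiver averages measured power over each bit period and applies a threshold.

```python
import random

def transmit(bits, samples_per_bit=8, signal_power=2.0, noise_floor=1.0):
    """Emit received-power samples: ambient noise plus extra power for a 1 bit."""
    samples = []
    for b in bits:
        for _ in range(samples_per_bit):
            ambient = random.gauss(noise_floor, 0.1)
            samples.append(ambient + (signal_power if b else 0.0))
    return samples

def detect(samples, samples_per_bit=8, threshold=2.0):
    """Recover bits by averaging power over each bit period and thresholding."""
    bits = []
    for i in range(0, len(samples), samples_per_bit):
        window = samples[i:i + samples_per_bit]
        bits.append(1 if sum(window) / len(window) > threshold else 0)
    return bits

random.seed(0)
code = [1, 0, 1, 1, 0, 0, 1, 0]   # the "alive" code, one bit per period
received = transmit(code)
assert detect(received) == code
```

At one bit per second this channel carries almost nothing, which is exactly the point: a single recoverable bit is enough to say "the sensor is alive."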


MIT-IBM Watson AI Lab: Robots will take over parts of your job, not all of it


Casey said business leaders and government officials should pay attention to recommendations from another researcher studying automation, Carl Benedikt Frey. Frey made an initial prediction about 47% of jobs being at high risk for automation and is quoted in the MIT-IBM research. As Frey stated in his initial automation research, business process and technology investment, regulatory concerns, political pressure, and social resistance will determine how automation affects jobs and wages. Frey's latest thinking is that the true concern is not about automation in general but that the revolution won't go far enough. The incomplete technology transformation will trap workers in a permanently unequal income distribution. If businesses only go so far toward automation, the full productivity benefit will not be realized. Casey said that the goal is to get to a point in the machine-learning revolution at which technology is creating new tasks and jobs for people to do. "What they're worried about is we'll get stuck at a place where there's nothing that could be transformative enough to create new tasks and create new jobs," he said. "What we want is sufficiently transformative tech that raises productivity enough so that new tasks emerge."


Google agrees to buy Fitbit in $2.1B deal to help boost Wear OS


While Fitbit's software is "solid ... it will be interesting to see how long Google keeps Fitbit separate or if it tries to integrate its apps into Android," Greengart adds. He notes while Google's promise on user data "is promising," the company's "users will have to trust that it stays that way." Alphabet released a sluggish financial report Monday, with $40.49 billion in sales, exceeding analysts' estimate of $40.32 billion, and earnings per share of $10.12, below the expected $12.42 per share. We can expect to see Fitbit's third-quarter earnings report on Nov. 6, the company said last month. Putting a dampener on the news for Google, however, House Antitrust Subcommittee Chair David Cicilline later Friday said the acquisition announcement has triggered more antitrust concerns as the tech giant's "dominance" is already being investigated. "By attempting this deal at this moment, Google is signaling that it will continue to flex and expand its power in spite of this immense scrutiny," Cicilline said in a statement. The acquisition would also give Google "deep insights into Americans' most sensitive information," including health and location data, according to Cicilline.


New 'unremovable' xHelper malware has infected 45,000 Android devices

Named xHelper, this malware was first spotted back in March but slowly expanded to infect more than 32,000 devices by August (per Malwarebytes), eventually reaching a total of 45,000 infections this month (per Symantec). The malware is on a clear upward trajectory. Symantec says the xHelper crew is making on average 131 new victims per day and around 2,400 new victims per month. Most of these infections have been spotted in India, the US, and Russia. According to Malwarebytes, the source of these infections is "web redirects" that send users to web pages hosting Android apps. These sites instruct users on how to side-load unofficial Android apps from outside the Play Store. Code hidden in these apps downloads the xHelper trojan. The good news is that the trojan doesn't carry out destructive operations. According to both Malwarebytes and Symantec, for most of its operational lifespan, the trojan has shown intrusive popup ads and notification spam. The ads and notifications redirect users to the Play Store, where victims are asked to install other apps -- a means through which the xHelper gang is making money from pay-per-install commissions.


The changing role of the enterprise architect


The need to focus on operational efficiency has diminished the EA’s role as a pan-organisation technology strategist. To address this and the needs of modern organisations, the current singular EA role must be devolved into its three component parts, eliminating the constraints it has experienced over the past decades, which have limited its strategic value to the organisation. These separate roles – strategist, engineer and custodian – must also reside and operate permanently in corporate strategy, the programme office, and the IT department, respectively. So how do we broadly define these roles? The strategist role acts as a positive change agent, assessing outlier and newly adopted technologies to propose how their use can serve the corporate leadership’s vision, at the start of a business strategy’s development. ... The engineer role is responsible for creating project technology designs that fit the business strategy. From the point of drafting to final design, the engineer consults with both the strategist and custodian roles.


Implement Agile IT Strategic Planning With Enterprise Architecture


In this modern age, Digital Transformation continues to be a priority for company executives. They know that Artificial Intelligence (AI), Blockchain, Internet of Things (IoT), and Big Data are driving their ability to improve customer experience, stay ahead of the competition and generate business growth. However, with IT teams entrenched in managing day-to-day technology, it is difficult for IT to stay abreast of the strategic discussions occurring at the business level and proactively plan for associated IT upgrades, modifications, or new systems. This disconnect can result in a lagging approach to IT planning, especially as business decisions are made in fast-moving agile environments. To remedy this, companies need a holistic approach that connects business and technology. Enterprise Architecture (EA) is the key to this foundation, as it helps companies improve their IT Strategic Planning by helping them see precisely how IT systems support business objectives. An IT Roadmap that is built on foundational Enterprise Architecture yet designed to realize business outcomes enables a company to assess the impact of change on the existing IT landscape and therefore quickly adjust as needed.



Quote for the day:


"Teamwork is the secret that makes common people achieve uncommon results." -- Ifeanyi Enoch Onuoha


Daily Tech Digest - November 01, 2019

Use Chrome? Update Your Browser Immediately


Users of Google's Chrome web browser are being urged to install the latest update immediately to patch two security vulnerabilities, one of which is already being exploited in the wild. As the National Cyber Security website reports, the two high severity vulnerabilities are known as CVE-2019-13720 and CVE-2019-13721 and classed as "use-after-free" vulnerabilities. That means a remote attacker can corrupt data in memory and then execute arbitrary code. In other words, they allow a PC to be hijacked. One of the vulnerabilities involves Chrome's audio component, while the other affects the PDFium library, which Chrome uses for PDF document generation and rendering. Kaspersky researchers Anton Ivanov and Alexey Kulaev have already detected the audio component compromise being used in the wild, hence the urgency for users to update. The latest version of Chrome released today to fix the security vulnerabilities is version 78.0.3904.87 and it's available for Windows, Mac, and Linux.



How powerful people slip


Studies have found, for example, that high levels of relative power often correspond with increased neural activity in the brain’s behavioral activation system (BAS). BAS is a pattern of neural circuits posited by psychologist Jeffrey Alan Gray in 1970 as an explanation for how the brain processes the experience of rapid reward. Nestled deep in the brain, these circuits include the basal ganglia and parts of the prefrontal cortex. They have been known to release the neurotransmitter dopamine, associated with pleasure. If you are a leader, the increase in BAS activity produced by the power of your role can make you more effective in noticeable ways — specifically, by increasing your attention to goal-relevant information, your comfort with innovation and risk taking, and your ability to think at a visionary level. Gray and subsequent psychologists have also posited that when the BAS is engaged, another system, called the behavioral inhibition system (BIS), tends to be more idle. The BIS, generally located in the brain’s septohippocampal system, is associated with feelings of anxiety, sensitivity to punishment, frustration, and risk aversion.


Why HR professionals need to adapt to new technology


Businesses of all kinds are investing in HR technologies now more than ever. The 2019 HR Technology Market Report outlines some key findings on this front – investments into HR technology have increased by 29 per cent, resulting in the market for HR technologies growing by a noteworthy 10 per cent. Also highlighted were new trends towards artificial intelligence, a shift away from engagement towards productivity in core systems, and the recognition of the role the gig workforce plays. Artificial intelligence is far more than a buzzword designed to impress the board of directors when it comes to HR. Unilever offers a prime example of this – given that it recruits upwards of 30,000 individuals a year, it should come as no surprise that a significant amount of capital and manpower must be devoted to sifting through applications to identify the best people for the job. This changed dramatically with its AI-powered solution: partnering with Pymetrics, the business developed a platform that would test the candidate’s aptitude, and even process 30-minute interview videos, using natural language processing and body language analysis to assess their suitability for a given role.


As devices generate more data, AI is becoming indispensable for medtech


While technology companies often have sophisticated AI capabilities, medtech companies have deep expertise in the clinical development of medical algorithms, such as translating data from an EKG lead into meaningful output that a physician can use. This clinical expertise and credibility with physicians could be useful to potential consumer tech partners. Moreover, consumer technology companies’ data science and AI expertise—combined with medtech’s ability to develop meaningful medical applications and algorithms—could lead to powerful offerings that will improve patient health. ... Regulators are working to develop regulatory guardrails as AI applications take off in medtech. Earlier this month, the US Food and Drug Administration (FDA) released a draft framework detailing the types of AI/machine learning-based algorithm changes in medical devices that might be exempt from pre-market submission requirements. As part of the Consumer Technology Association’s AI initiative, AdvaMed, Google, Doctor On Demand and other organizations will work to develop standards and best practices for AI use cases in medicine and health.


How 5G Will Drive The Future Of Industry 4.0


5G can also assist manufacturers in optimising their operations by using IoT sensors to monitor the performance of equipment and workers so improvements in working processes can be identified. In fact, research from IDC found that IoT technology can boost productivity in the supply chain by 15%. Utilising IoT-based monitoring can also enable predictive maintenance, reducing overall maintenance costs by up to 30%, says Accenture. What’s more, the incredibly low latency offered by next-generation connectivity can enable remote operation of equipment. This enables automation of machinery and the use of untethered robots, helping to make factories safer. 5G infrastructure can also help unlock actionable insights from the vast amounts of data generated by the ever-growing number of connected devices. Data analytics can bring operational efficiencies and cost savings while logistics can also be enhanced with real-time tracking data. Many manufacturing businesses will make use of private, on-premise 5G networks.


Agile and late! End-to-end Delivery Metrics to Improve your Predictability


The key delivery metrics require surfacing data from a myriad of sources, including workflow management tools, code repos, and CI/CD tools – as well as collecting quant feedback from the engineering team themselves (via collaboration hubs). The complexity of the data and multiple sources make this sort of data collection very time consuming to do manually and really requires an end-to-end delivery metrics platform to do at scale. Delivery metrics platforms are available which consist of a data layer to collate and compile metrics from multiple data sources and a flexible UI layer to enable the creation of custom dashboards to surface the metrics in the desired format. ... Done well, Root Cause RAG Reports can be a really effective means of presenting our (more accurate) forecasts in a way that stakeholders can understand, and therefore can be an important step in reducing lateness and bringing the technology team and the internal client much closer together. As discussed, however, it relies on an understanding of the metrics that actually determine project lateness and a means of collecting those metrics.
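As a hypothetical sketch (the thresholds and parameter names are invented for illustration, not taken from the article), a RAG status can be derived mechanically from schedule data once the underlying metrics are being collected.

```python
def rag_status(forecast_days, elapsed_days, amber_ratio=0.8):
    """Map schedule consumption to a Red/Amber/Green status.

    Green: comfortably inside the forecast; Amber: nearing it; Red: past it.
    The 0.8 amber threshold is illustrative only.
    """
    ratio = elapsed_days / forecast_days
    if ratio >= 1.0:
        return "Red"
    if ratio >= amber_ratio:
        return "Amber"
    return "Green"

assert rag_status(forecast_days=20, elapsed_days=10) == "Green"
assert rag_status(forecast_days=20, elapsed_days=17) == "Amber"
assert rag_status(forecast_days=20, elapsed_days=22) == "Red"
```

The value of a report like this comes entirely from the quality of the forecast feeding it, which is why the collection pipeline matters more than the dashboard.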


Open source technology, enabling innovation


Open source allows people to collaborate and promotes a meritocracy of ideas. Kubernetes helps companies harness processing power and run their software more efficiently no matter how many machines they have and no matter how many competing cloud services they’re using. This is especially useful for companies without a refined IT service as it makes managing commercial software cloud servers much less of a headache. These abilities are all underpinned by open source code so it enables a company to build a system tailored to their needs, which will evolve as it becomes more successful and expands its operations. Originally open sourced by Google in 2014, Kubernetes has remained relevant technology because of the open source community that supports it and it’s consistently one of the top projects on GitHub, the code-hosting platform developers use to store and manage code. Twitter, Huawei, Intel, Cisco and IBM are just some of the businesses that have been involved in its development over the years thanks to the fact that Google donated it to the Cloud Native Computing Foundation, a collective of open source development advocates.


10 tips for effective change management in Agile

Non-Agile methodologies make an implicit assumption that requirements are final and that a change management process can accommodate only minor variations in them. Design requirements, also called acceptance criteria, are subject to constant, planned change in Agile iterations. Agile enables product managers to demonstrate working software and elicit customer feedback. If the user needs aren't met, the product owner and developers make change requests to the application code, and possibly alter the delivery schedule. Thus, change management is an inherent part of the Agile software development process. The ability to demo working applications means you can design for customer expectations. Rather than create and develop an application workflow based on only written requirements or feature descriptions, keep the customer informed of the application and its functionality. If a development team spends six months working on an app and delivers it on time to the customer, that's a good thing -- as long as that application aligns with the customer's expectations. If it doesn't meet user needs, the delivery is not successful. Keep the customer in the loop and manage requirement changes accordingly for long-term application success.


Big Four carriers want to rule IoT by simplifying it

The carriers’ approach to the IoT market is two-pronged, in that they sell connectivity services directly to end-users as well as selling connectivity wholesale to device makers. For example, one customer might buy a bunch of sensors directly from Verizon, while another might buy equipment from a specialist manufacturer that contracts with Verizon to provide connectivity. There are, experts agree, numerous advantages to simply handing off the wireless networking of an IoT project to a major carrier. Licensed networks are largely free of interference – the carriers own the exclusive rights to the RF spectrum being used in a designated area, so no one else is allowed to use it without risking the wrath of the FCC. In contrast, a company using unlicensed technologies like Wi-Fi might be competing for the same spectrum area with half a dozen other organizations. It’s also better-secured than most unlicensed technologies or at least easier to secure, according to former chair of the IEEE’s IoT smart cities working group Shawn Chandler. Buying connectivity services that will have to be managed and secured in-house can be a lot more work than letting one of the carriers take care of it.


Should you go all-in on cloud native?

The second school of thought is that we might add too much complexity by going all-in native. Although there are advantages, moving to Kubernetes-native systems means having at least two of everything. Enterprises moving to Kubernetes-driven, container-based applications are looking for a common database system, one that spans applications inside and outside Kubernetes. Same with security, raw storage, and other systems that may be native to the cloud, but not Kubernetes. What’s the correct path? One of the lessons I’ve learned over the years is that best-of-breed and fit-to-purpose technology is typically the right way to go. This means native everything and all-in native, but you still need to be smart about picking solutions that will work longer term, native or not. Will there be more complexity? Of course, but this is really the least of your worries, considering the movement to multiclouds and IoT-based applications. Things will get complex out there no matter if you’re using a native Kubernetes solution or not. We might as well get good at complexity, and do things right the first time.



Quote for the day:


"Real leaders are ordinary people with extraordinary determinations." -- John Seaman Garns


Daily Tech Digest - October 31, 2019

What the Google vs. IBM debate over quantum supremacy means

The debate is over what it means when you run an actual quantum computer, such as Sycamore, and compare it to a simulation of that quantum computer inside of a classical, electronic computer. Quantum simulation software, such as Microsoft's LIQUi|⟩ program, allows a traditional computer to represent a quantum computer in ordinary circuitry, by translating quantum mechanics into mathematical structures, known as matrices of complex numbers (numbers that incorporate both real and imaginary numbers). With simulations, it's possible to compare how long it takes real quantum circuits to produce a given computation, and how long the same computation takes a classical computer to reproduce, by running the matrix math that resembles the functions of the quantum circuit.  Google and IBM are both looking at such simulations, and they're taking different views as to what the comparison means.  Google's point is that Sycamore is a device that does the work it takes millions of conventional processors to simulate. 
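The matrix view is easy to demonstrate: a one-qubit simulator is just complex matrix-vector multiplication. The sketch below (pure Python, for illustration only) applies a Hadamard gate to the |0⟩ state; the state vector doubles in size with each added qubit, which is why classically simulating large circuits becomes so expensive.

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 complex matrix (the gate) by a 2-entry state vector."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

# Hadamard gate: takes a definite basis state into an equal superposition.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

zero = [1 + 0j, 0 + 0j]           # the |0> basis state
superposed = apply_gate(H, zero)  # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = [abs(a) ** 2 for a in superposed]
```

An n-qubit state needs 2^n complex amplitudes, so a 53-qubit device like Sycamore corresponds to roughly 9 quadrillion amplitudes per state vector, which is the crux of the simulation-cost debate.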


Why organizations feel vulnerable to insider attacks


A full 43% of the respondents cited phishing attacks that trick employees into sharing sensitive company information. Some 24% pointed to weak passwords, 15% referred to spear-phishing attacks targeted to specific individuals, and 15% cited orphaned accounts. Data leakage or theft is always a concern for security professionals both from outside and inside the company. Asked which type of data is most vulnerable to insider attacks, 63% of the respondents pointed to customer data, 55% to intellectual property, and 52% to financial data. ... Insider attacks pose enough of a concern that most organizations do have certain tools in place to deal with them. Some 68% of those surveyed said they feel anywhere from moderately to extremely vulnerable to insider attacks. While 49% said they feel they have the right controls to prevent an insider attack, 28% said they do not, and 23% said they were not sure. Most of the respondents use some type of analytics to determine insider threats with 32% relying on activity management and summary reports, 29% on user behavior analytics, 28% on data access and movement analytics, and 14% on predictive analytics.


North Korean malware detected in India's Kudankulam nuclear facility

The Nuclear Power Corporation of India (NPCIL) admitted yesterday that one of the computers at its Kudankulam nuclear power plant (KKNPP) had been attacked by malware. The malware, however, did not affect the critical internal network of the plant, NPCIL claimed, and the company confirmed the attack only after earlier strong denials. "Identification of malware in NPCIL system is correct," A.K. Nema, Associate Director and Appellate Authority, NPCIL, belatedly admitted in a statement. "The matter was conveyed by CERT-In [Indian Computer Emergency Response Team] when it was noticed by them on 4 September 2019," he added. According to Nema, the matter was investigated by DAE cyber security specialists, who found that the compromised computer was connected to the internet and was used for administrative work only. He also added the virus infection was isolated from the critical internal network of the plant. A day earlier, KKNPP senior official R Ramdoss had rejected social media reports, which claimed that domain controller-level access at KKNPP has been compromised.


Four principles for security metrics

Metrics come into their own when they act first as a tool to help people understand what’s going on, what you need to do to improve, then how to track progress and measure success. Start by creating one metric per process – then, if this metric goes out of tolerance, you’ll have a clear idea of how to address this. For example, “number of high severity vulnerabilities” is not easily actionable. If this goes beyond your tolerance what action do you take? It’s affected by multiple processes as well as circumstances beyond your control. As metrics are developed and you get new views on what your data is telling you, you’ll likely uncover things that are slipping between the cracks of existing processes. If you have a metric that tracks the current status of a given process, but you require several projects to deal with legacy issues that existed before the process was created/updated, there is no harm in splitting that historical data out and tracking those remediation projects separately. This will help you avoid the situation where your metric confuses good current performance with past problems that are now under management via a different – and possibly longer term – process.
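The one-metric-per-process idea can be sketched as a simple tolerance check (process names and tolerance values here are invented for illustration): current metrics are compared against their tolerances, while legacy remediation is tracked separately so old debt does not mask good current performance.

```python
def out_of_tolerance(metrics, tolerances):
    """Return the processes whose current metric exceeds its tolerance."""
    return [name for name, value in metrics.items()
            if value > tolerances[name]]

# One actionable metric per process.
current = {"patching_days_open": 12, "new_vulns_this_month": 3}
tolerances = {"patching_days_open": 14, "new_vulns_this_month": 5}

# Pre-process backlog feeds a separate remediation-project report,
# not this check, so history doesn't pollute the current view.
legacy = {"pre_process_vulns_remaining": 140}

assert out_of_tolerance(current, tolerances) == []
```

If "patching_days_open" later reads 20, the check returns exactly that process name, so the action to take (fix the patching process) is unambiguous, unlike a blended "number of high severity vulnerabilities" count.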


The 10+ Most Important Job Skills Every Company Will Be Looking For In 2020


There's no shortage of information and data, but individuals with the ability to discern what information is trustworthy among the abundant mix of misinformation such as fake news, deep fakes, propaganda, and more will be critical to an organization's success. Critical thinking doesn’t imply being negative; it’s about being able to objectively evaluate information and how it should be used or even if it should be trusted by an organization. Employees who are open-minded, yet able to judge the quality of information inundating us will be valued. ... Technical skills will be required by employees doing just about every job since digital tools will be commonplace as the 4th industrial revolution impacts every industry. Artificial intelligence, Internet of Things, virtual and augmented reality, robotics, blockchain, and more will become a part of every worker's everyday experience, whether the workplace is a factory or law firm. So, not only do people need to be comfortable around these tools, they will need to develop skills to work with them.


How schools can better protect themselves against cyberattacks


Schools often are more vulnerable to cyberattacks in comparison with larger companies and enterprises, and for a variety of reasons. Many school districts may have only one or two IT people to serve the entire district, so the staffers are spread thin. Budget constraints have affected many schools, limiting the amount of money they can spend on security solutions. Most schools likely have the necessary security set up on individual computers and even the overall network. But comprehensive perimeter protection may not be in place, potentially leading to data breaches and malware hosted on the school's website. Young students don't necessarily have the skills or training to adequately identify phishing emails and other threats, so such attacks are often more successful. The number of tablets and other devices issued by schools has increased in recent years and because of that, students may use those devices on outside networks that aren't secure, thereby raising the risk of infection. Even in the face of budget constraints and other limitations, schools should have adequate security measures in place to protect themselves, their data, and their students from security threats.


AI’s ‘most wanted’: Which skills are adopters most urgently seeking?


When we compare companies with relatively little AI experience (they’ve built five or fewer production systems) with those possessing extensive AI experience (they’ve built 20 or more production systems), we observe an interesting shift in “most wanted” roles (see chart). Early on, AI researchers are the most sought-after, with about a third of the less-experienced rating them a top-two needed role. Business leaders rank near the bottom. By the time adopters have become highly experienced at building AI solutions, business leaders have bubbled to the top, and AI researchers have sunk almost to the bottom. What can we make of this curious flip? Many companies embarking on AI initiatives may feel they need to hire AI superstars—researchers with advanced degrees who can invent new AI algorithms and techniques—to spearhead their efforts. By the time organizations have amassed a lot of AI experience, they may have filled their ranks with enough of these brilliant experts. At that stage, they’re eager to find business leaders who can play the crucial “translator” role: figuring out what the results from AI systems mean, and how they should factor into business decisions and actions.


The Role of CIO and How It is Being Rewritten


Digitization spurs new priorities – alongside a full slate of historical departmental responsibilities. In enterprise tech speak, leverage simply means using tools, systems or techniques to convert relatively small effort into significantly greater output. Digital transformation won't fit nicely alongside traditional management processes or cleanly under one leader’s org chart. IT leaders have to get more out of themselves, their teams and their dollars to succeed in the new enterprise era. One industry survey finds that at least 84% of top CIOs are now responsible for areas outside of traditional IT, the most common being innovation and transformation. Further research reveals that 95% of CIOs expect digitization to change or remix their job. Regardless of where or how new responsibilities intersect business and technology, IT will have a key role to play. Yet, without suitable systems leverage, the CIO position is challenging. How else can they handle the burden of IT consumerization, mobile workforces, big-data challenges, shadow IT and cost management?



AI capabilities power global IoT adoption

AIoT is defined as decision making aided by AI technologies in conjunction with connected IoT sensor, system or product data. AI technologies include deep learning, machine learning, natural language processing, voice recognition and image analysis. According to the survey, 34% of respondents said increasing revenue is the top goal for using AIoT. Improving the ability to innovate (17.5%), offering customers new digital services (14.3%) and decreasing operational costs (11.1%) were all key goals. Intel Americas chief data scientist Melvin Greer says AI and IoT are no longer separate technologies. “AI closes the loop in an IoT environment where IoT devices gather or create data, and AI helps automate important choices and actions based on that data,” explains Greer. “Today, most organisations using IoT are only at the first ‘visibility’ phase where they can start to see what’s going on through IoT assets. But they’re moving toward the reliability, efficiency and production phases, which are more sophisticated and require stronger AI capabilities.”


Defense Innovation Board unveils AI ethics principles for the Pentagon

Applied Inventions cofounder and computer theorist Danny Hillis and board members agreed to amend the draft document to say the governable principle should include “avoid unintended harm and disruption and for human disengagement of deployed systems.” The report, Hillis said, should be explicit and unambiguous that AI systems used by the military should come with an off switch for a human to press in case things go wrong. “I think this was the most problematical aspect about them because they’re capable of exhibiting and evolving forms of behavior that are very difficult for the designer to predict, and sometimes those forms of behavior are actually kind of self preserving forms of behavior that can get a bit out of sync with the intent and goals of the designer, and so I think that’s one of the most dangerous potential aspects about them,” he said. The Defense Innovation Board is chaired by former Google CEO Eric Schmidt, and members include MIT CSAIL director Daniela Rus, Hayden Planetarium director Neil deGrasse Tyson, LinkedIn cofounder Reid Hoffman, Code for America director Jennifer Pahlka, and Aspen Institute director Walter Isaacson.



Quote for the day:


"It's not about how smart you are--it's about capturing minds." -- Richie Norton


Daily Tech Digest - October 30, 2019

Automation projects: A good time to switch vendors?

Many network infrastructure vendors are developing automation technology aimed primarily, if not solely, at their own products rather than multi-vendor environments. While most enterprises use two or three different automation tools in their initiatives, 42 percent say that an automation tool aimed at a single vendor is part of their strategy. In fact, 26 percent said a single-vendor automation tool is the most important part of their automation technology strategy. ... The most important ZTP feature, according to EMA’s survey, is software-image auto-updates and verification. Many enterprises are also interested in being able to custom provision and configure devices via scripts, and in the ability to unify ZTP network provisioning with compute and storage infrastructure in data centers. Not every network vendor offers embedded ZTP features on its platforms, and most only offer them on their latest-generation products. Enterprises with older equipment may switch to a new vendor during a refresh, and ZTP features may be a contributing or leading driver of that vendor switch.


Joker's Stash Lists 1.3 Million Stolen Indian Payment Cards

Group-IB, which has analyzed the cards listed for sale, says more than 98 percent appear to have been issued by Indian banks, with a single bank accounting for more than 18 percent of all of the dumps. About 1 percent of the cards appear to have been issued by Colombian banks. What's unusual about this sale is that so many payment cards have been uploaded at once. "Databases are usually uploaded in several smaller parts at different times," says Ilya Sachkov, CEO and founder of Group-IB, which was originally headquartered in Moscow. While that is unusual, so too is the sheer scale of what's being offered all at once. "This is indeed the biggest card database encapsulated in a single file ever uploaded on underground markets at once," he says. "What is also interesting about this particular case is that the database that went on sale hadn't been promoted prior either in the news, on card shop or even on forums on the dark net. The cards from this region are very rare on underground markets. In the past 12 months, it is the only one big sale of card dumps related to Indian banks."


Kubernetes vs. Docker: Understand containers and orchestration
Containers are designed chiefly to isolate processes or applications from each other and the underlying system. Creating and deploying individual containers is easy. But what if you want to assemble multiple containers—say, a database, a web front-end, a computational back-end—into a large application that can be managed as a unit, without having to worry about deploying, connecting, managing, and scaling each of those containers separately? You need a way to orchestrate all of the parts into a functional whole. That’s the job Kubernetes takes on. If containers are passengers on a cruise, Kubernetes is the cruise director. Kubernetes, based on projects created at Google, provides a way to automate the deployment and management of multi-container applications across multiple hosts, without having to manage each container directly. The developer describes the layout of the application across multiple containers, including details like how each container uses networking and storage. Kubernetes handles the rest at runtime. It also handles the management of fiddly details like secrets and app configurations.


The effect of having computer systems wirelessly or directly transmit data to the brain isn't known, but related technologies such as deep brain stimulation -- where electrical impulses are sent into brain tissue to regulate unwanted movement in medical conditions such as dystonias and Parkinson's disease -- may cause personality changes in users.  And even if BCIs did cause personality changes, would that really be a good enough reason to withhold them from someone who needs one -- a person with paraplegia who requires an assistive device, for example? As one research paper in the journal BMC Medical Ethics puts it: "the debate is not so much over whether BCI will cause identity changes, but over whether those changes in personal identity are a problem that should impact technological development or access to BCI". Whether regular long-term use of BCIs will ultimately affect users' moods or personalities isn't known, but it's hard to imagine that technology that plugs the brain into an AI or internet-level repository of data won't ultimately have an effect on personhood.


With communication, previous attempts used infrared lights or radio waves, but if you have many robots in a small area, these signals can conflict. The MIT team instead created a cube devoid of arms, using inertial forces to move the robots. These forces are generated by a mass inside each cube that throws itself against the side of the module, causing the block to rotate or move in 24 different directions across its six faces, the paper added. "There's a relatively large field of other people building sort of similar robots," Romanishin said. "But the two main unique parts about our robots are how they move, which is using angular momentum from what we call a reaction wheel, and the way it uses magnets. It uses them in a special way that is potentially a really scalable and cheap solution for identifying hundreds of thousands of elements in a small space." "One of the big things that we looked at was how do you make the robots move relative to each other? It's really challenging, from a design standpoint and a physics standpoint," Romanishin added.


Regression testing process
In simple terms, regression testing can be defined as retesting a computer program after changes are made to it, to ensure that the changes do not adversely affect the existing code. Regression testing increases the chance of detecting bugs caused by changes to the application. It can help catch defects early and thus reduce the cost of resolving them. Regression testing ensures the proper functioning of the software so that the best version of the product is released to the market. However, creating and maintaining a near-infinite set of regression tests is not feasible. This is why enterprises are focusing on automating most regression tests to save time and effort. ... Whenever there is a change in the app or a new version is released, the developer carries out these tests as part of the regression testing process. First, the developer executes unit-level regression tests to validate the modified code, along with any new tests created to cover new functionality. Then the changed code is merged and integrated to create a new build of the application under test (AUT). After that, smoke tests are performed to ensure that the build created in the previous step is stable before any additional testing is performed.
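The unit-level step described above can be sketched as a small automated regression suite. A minimal Python example follows; the `discount` function, its bug history, and its expected values are all hypothetical:

```python
# Hypothetical function under test: pricing code that was just modified.
def discount(price: float, rate: float) -> float:
    """Apply a fractional discount; results are clamped at zero."""
    return max(price * (1 - rate), 0.0)

# Regression tests: re-run after every change to confirm that existing
# behavior still holds before the build moves on to smoke testing.
def test_discount_basic():
    assert discount(100.0, 0.25) == 75.0

def test_discount_never_negative():
    # Guards against a (hypothetical) earlier bug where rates above 1.0
    # produced negative prices.
    assert discount(50.0, 1.5) == 0.0

test_discount_basic()
test_discount_never_negative()
```

In practice such tests would live in an automated suite (run by a tool like pytest) that executes on every new build, which is what makes the "near-infinite" set manageable.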


Object storage in the cloud: Is backup needed?

How the replication works is also very different. Object replication is done at the object level vs the block-level replication of cloud block storage and typical RAID systems. Objects are also never modified. If an object needs to be modified it is just stored as a new object. If versioning is enabled, the previous version of the object is saved for historical purposes. If not, the previous version is simply deleted. This is very different from block storage, where files or blocks are edited in place, and the previous versions are never saved unless you use some kind of additional protection system. Cloud vendors offer object-storage services, which include Amazon's Simple Storage Service (S3), Azure’s Blob Store, and Google’s Cloud Storage. These object-storage systems can be set up to withstand even a regional disaster that would take out all availability zones. Amazon does this using cross-region replication that must be configured by the customer. Microsoft geo-redundant storage includes replication across regions, and Google offers dual-region and multi-region storage that does the same thing.
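The objects-are-never-modified semantics described above can be modeled in a few lines. This is a toy sketch of the behavior, not any vendor's API, and all names are invented:

```python
from collections import defaultdict

class VersionedObjectStore:
    """Toy model of object-store write semantics: a PUT never modifies an
    object in place; it always stores a new object."""

    def __init__(self, versioning: bool = False):
        self.versioning = versioning
        self._versions = defaultdict(list)  # key -> list of object bodies

    def put(self, key: str, body: bytes) -> None:
        if not self.versioning and self._versions[key]:
            self._versions[key].pop()     # versioning off: old object is deleted
        self._versions[key].append(body)  # the "modified" object is a new one

    def get(self, key: str, version: int = -1) -> bytes:
        return self._versions[key][version]  # default -1 = latest version

store = VersionedObjectStore(versioning=True)
store.put("report.csv", b"v1")
store.put("report.csv", b"v2")
assert store.get("report.csv") == b"v2"     # latest
assert store.get("report.csv", 0) == b"v1"  # history kept when versioning is on
```

With `versioning=False`, the second `put` would delete `b"v1"`, mirroring the delete-on-overwrite behavior the passage contrasts with block storage.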


Massive Cyberattack Slams Country of Georgia

One obvious potential culprit for the attacks against Georgia would, of course, be Russia, which has previously launched politically motivated cyberattacks against the government sectors of former Soviet states, including Estonia. Georgia is a U.S. ally, and since 2011, it has been an "aspirant country" in terms of its potential membership in NATO. It's also been engaged in a months-long spat with Moscow. After a Russian legislator's address to the Georgian parliament triggered protests, Georgia on June 20 temporarily blocked all flights originating from Russia. In response, Russian President Vladimir Putin on June 21 ordered that starting July 8, Russian carriers were barred from operating flights between Russia and Georgia. The Monday cyberattack against Georgia echoes cyberattacks launched against the country in 2008, weeks before the country was invaded by Russia over Georgia's "breakaway provinces" of South Ossetia and Abkhazia. At the time, Moscow said it wasn't responsible for the cyberattacks, but it suggested that some Russian individuals may have been independently involved.


Lies programmers tell themselves

Figuring out how to handle null pointers is a big problem for modern language design. Sometimes I think that half of the Java code I write is checking to see whether a pointer is null. The clever way some languages use a question mark to check for nullity helps, but it doesn’t get rid of the issue. A number of modern languages have tried to eliminate the null testing problem by eliminating null altogether. If every variable must be initialized, there can never be a null. No more null testing. Problem solved. Time for lunch. The joy of this discovery fades within several lines of new code because data structures often have holes without information. People leave lines on a form blank. Sometimes the data isn’t available yet. Then you need some predicate to decide whether an element is empty. If the element is a string, you can test whether the length is zero. If you work long and hard enough with the type definitions, you can usually come up with something logically sound for the particular problem, at least until someone amends the specs. After doing this a few times, you start wishing for one, simple word that means an empty variable.
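The "empty predicate" problem described above can be illustrated with a short sketch (Python here rather than Java; the form-record example and field names are invented):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FormEntry:
    """Hypothetical form record: fields may legitimately be left blank."""
    name: str
    middle_name: Optional[str] = None  # blank line on the form, or data not in yet

def has_middle_name(entry: FormEntry) -> bool:
    # The single "empty" test the text wishes for: one predicate that
    # treats a missing value (None) and a zero-length string the same way.
    return bool(entry.middle_name)

print(has_middle_name(FormEntry("Ada", "King")))  # True
print(has_middle_name(FormEntry("Ada", "")))      # False: left blank
print(has_middle_name(FormEntry("Ada")))          # False: not available yet
```

Even in a language without null-pointer crashes, the predicate still has to exist somewhere; the hole in the data doesn't go away just because the type system hides it.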


Categorise Unsolved Problems in Agile Development: Premature & Foreseeable

Unsolved problems belong on the backlog. In theory, the Product Owner processes all backlog items, dismisses the irrelevant and prioritizes the most important ones into sprints, until the backlog is empty and the project is done. But in practice, that’s not what happens. The backlog just grows forever. It collects items that can wait, together with technical debt and hot potatoes which cannot simply be dismissed. To developers, the backlog is a spillway to keep their job doable. Agile says: whatever you don't know yet, or can do without for now, park it on the backlog, and forget about it. It will reemerge when needed. For the most part, this works. It is the power of Agile. But by the time unsolved problems reemerge, hot potatoes have become too hot to handle, and technical debt has become too expensive to repay. Implementation effort has grown far beyond the available resources. This can be prevented by adding some core insights and making a few small but essential changes to the Agile approach.



Quote for the day:


"Leaders dig into their business to learn painful realities rather than peaceful illusion." -- Orrin Woodward


Daily Tech Digest - October 29, 2019

As part of any good AI conversation, we have to consider the potential ramifications of an AI-based model. What are the true risks of harnessing AI to help defend ourselves in cyberspace? It is always possible to misuse the information a security system collects. It’s possible to program in unintentional bias. You could break things too much because AI told you to — or you could miss things because you trust your AI system to catch everything. Yet as a business community, we must confront these risks and design to prevent these outcomes. The need for more robust cybersecurity is too great. We simply need to be thoughtful in our approaches, develop and use ethical standards around how we leverage these new and evolving technologies, and, finally, use a trust-but-verify methodology as we look to mature our multilayered cyber-defense strategies. To do this, start by planning ahead and developing a framework for building AI that has preapproved controls in place. Building human review into the decision-making process can go a long way toward preventing major issues. You can also leverage some of the work already being done to manage insider threats and apply that to controlling runaway AI.



Accelerate will enable fintechs to be onboarded to Mastercard in a matter of weeks and provide a guided experience through everything the company can offer. Program participants are connected to relevant parts of the business, to integrate Mastercard’s proprietary technology, leverage its insights and cybersecurity services, engage new customers, and reach new markets and segments. In addition, Mastercard’s commitment to financial inclusion drives focused product development, helping co-create solutions that enable a more inclusive economy. “Mastercard Accelerate is a single doorway to the countless ways Mastercard can help fintechs all over the world grow and scale sustainably,” said Michael Miebach, chief product & innovation officer, Mastercard. “Fintechs are contributing to the rapid digital transformation that makes lives more convenient, simpler, and rewarding. We’re the partner of choice for the top Fintech brands worldwide, and with Accelerate we invite the next generation of global entrepreneurs to join us.” “And for our financial institution partners and customers, Mastercard Accelerate provides access to the next generation of innovators, with a portfolio of start-up partners and fintechs ready to co-create and collaborate on new experiences,” added Miebach.



The temptation to use automated support to cut time and costs comes at the risk of further alienating physicians and other clinicians through IT, rather than making their lives easier. Automation via tools like chatbots and self-service surely “roboticizes” interactions, resulting in a loss in human-to-human contact and a degrading of users’ relationships with the IT staff — and perhaps with the institution itself. Despite all the hype around AI and machine learning, perhaps these technologies will be best embraced by support teams as an extension of their personal services, designed first to enhance the customer experience and only secondarily to ease the support staff’s workload and/or cut costs. If we are smart, we should be able to create a balance between digital and human interaction. Even IT-resistant physicians are learning to appreciate digital solutions if they clearly bring ease and convenience.


broken window with windows logo in clouds
Microsoft warned us at the beginning of the Win10 onslaught four-plus years ago that it wouldn’t dole out patches one by one. Except for emergency security fixes, patches would be released as part of cumulative updates. Over the years, that promise has evolved into a common pace of two cumulative updates per month: the first on Patch Tuesday, and a second “optional, non-security” cumulative update sometime later in the month. It’s one of the ways “Windows as a service” is a service, doncha know. Last month we were treated to an unholy pileup of Windows security patches as Microsoft released, then re-released, then finally pushed a fix to the Internet Explorer zero-day vulnerability known as CVE-2019-1367. Of course, nobody’s seen any widespread exploits attributable to that security hole, but the bugs — three different sets of them, corresponding to the three botched out-of-band patches — were breathtaking. This month, it looks like we’re headed in a similar direction.


Part 1 — what can these three Silicon Valley AI startups do for your business?
According to Harvard Business School professor Clayton Christensen, each year more than 30,000 new consumer products are launched and 80% of them fail. There is a clear disconnect between product companies and the market. How does it work? The machine learning, natural language processing and visual AI models developed by Commerce.AI analyse unstructured customer feedback or data in the form of text, image, voice and video from reviews, and social media to a lesser extent. “We take unstructured data and synthesise it using AI, NLP and visual AI to create product intelligence for approximately 56,000 product categories,” explained Pandharikar. ... It’s all about improving product development and management; using AI/ML to identify the features that are working and build that into the next product, while taking positive feedback from millions of reviews and using that in the next generation of products. “The old way was to make consumers buy products, now it’s about making products that consumers want,” said Pandharikar.


Speaking at TechCrunch Disrupt SF, Jeanette Manfra, the assistant director for cybersecurity at Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), said that the agency was making training for new cybersecurity professionals a priority. “It’s a national security risk that we don’t have the talent regardless of whether it’s in the government or the private sector,” said Manfra. “We have a massive shortage that is expected that will grow larger.” Homeland Security is already responding, working on developing curriculum for potential developers as soon as they hit the school system. “We spend a lot of time invested in K-12 curriculum,” she said. The agency is also looking to take a page from the tech industry’s playbook, developing a new workforce training program modeled on how the industry recruits and retains individuals. For Manfra, it’s important that the tech community and the government agencies tasked with protecting the nation’s critical assets work more closely together.



With the emergence and implementation of blockchain technology in the Australian financial domain, developers and entrepreneurs can put their creative minds to good use and produce even more innovative games for gambling lovers. This matters because of the notable contribution gambling makes to total revenue. Not only will peer-to-peer gaming become a reality, opening the door for mutual betting, decentralized lotteries and other categories of games, but Aussie high-roller casinos will also benefit sizably. One of the main attractions of using blockchain technology will be the improved degree of trust between players and operators. Every game rule, underlying code, and outcome will be open to verification, and thus enhanced safety and security will be guaranteed. In addition, it will not be too big a hurdle for blockchain to gain support rapidly in Australia, since many reputable online casinos allow Australian punters to wager, withdraw and deposit in bitcoins.


Cybersecurity Awareness Month
“Lock your devices up. Make backups. Stay on top of your accounts.” – Ivanti Insider “Be vigilant and be up-to-date. Verify you're typing in the correct web address. Before you click anything in an email, verify the sender is who you think it is and the link/attachment is something they themselves sent. Verify that your antivirus products are up-to-date (and that you have one installed!) and scanning, and that your PC is staying up-to-date with patches. Most issues can be avoided by being careful to always visit legitimate sites, ensuring you aren't opening attachments from unknown individuals, by keeping your PC patched, and your antivirus up-to-date and performing regular scans.” – Kelly Ruston, Technical Support Specialist, William Osler Health System “When you are going to click a link on a webpage or an email, hover over the link first and check the bottom left of your browser to see if it will take you to the page you are expecting.” – Adam Howard, Systems Administrator, Rack Room Shoes
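The hover-and-check advice above boils down to verifying that a link's hostname actually belongs to the domain you expect. A rough illustrative check (not a complete phishing defense; the URLs are made up):

```python
from urllib.parse import urlparse

def link_matches_domain(link: str, expected_domain: str) -> bool:
    """Does the link actually point at the domain you think it does?
    A programmatic version of the hover-before-you-click check."""
    host = urlparse(link).hostname or ""
    # Accept the domain itself or any of its subdomains, nothing else.
    return host == expected_domain or host.endswith("." + expected_domain)

print(link_matches_domain("https://www.example.com/login", "example.com"))        # True
print(link_matches_domain("https://example.com.evil.test/login", "example.com"))  # False
```

The second case is the classic trick the quoted tips warn about: the expected domain appears in the URL, but only as a prefix of an attacker-controlled hostname.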


Linx prosthetic
We’re walking in all sorts of different terrains, and the body is naturally adjusting itself, and the way legs move so that you can get around with the least amount of energy possible. The Linx needs to accommodate changes to the environment in a biomechanical way so that users don’t exert so much energy. And that’s a difficult problem because you’re dealing with such a huge variation. If you imagine the activities of daily living — every move that you do — thousands and thousands of steps — how do you detect these changes and accommodate them? The way to do it is to integrate the components of the limb. Rather than looking at products or joints individually, you can leverage all that new information. And what you end up with is a leg which behaves in a much more natural way — able to predict and move in a coordinated way. That’s another important thing. And in our limbs, the movement is very coordinated. Essentially our microprocessor foot would be making its own decisions in complete awareness of what a microprocessor knee would be doing.



A failure to publish an event can mean a critical failure of the business process. To explain the problem better, let’s consider a Student microservice that enrolls students. After enrollment, the "Course Catalog" service emails the student all the available courses. In an event-driven application, the Student microservice enrolls the student by inserting a record in the database and publishes an event stating that enrollment for the student is complete. ... This pattern provides an effective solution to publish events reliably. The idea of this approach is to have an “Outbox” table in the service’s database. When receiving an enrollment request, not only is a record inserted into the Student table, but a record representing the event is also inserted into the Outbox table. The two database actions are done as part of the same transaction. An asynchronous process monitors the Outbox table for new entries and, if there are any, publishes the events to the Event Bus. The pattern merely splits the two transactions over different services, increasing reliability.
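A minimal sketch of the outbox pattern described above, with SQLite standing in for the service's database and a plain function standing in for the event bus; all table, field, and event names are illustrative:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE student (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT);
    CREATE TABLE outbox  (id INTEGER PRIMARY KEY AUTOINCREMENT,
                          event TEXT NOT NULL, published INTEGER DEFAULT 0);
""")

def enroll(name: str) -> None:
    # The student row and the event row are written in ONE transaction:
    # either both exist afterwards, or neither does.
    with conn:
        cur = conn.execute("INSERT INTO student (name) VALUES (?)", (name,))
        event = {"type": "StudentEnrolled", "student_id": cur.lastrowid}
        conn.execute("INSERT INTO outbox (event) VALUES (?)",
                     (json.dumps(event),))

def relay(publish) -> None:
    # Asynchronous poller: push unpublished events to the bus, then mark them.
    rows = conn.execute(
        "SELECT id, event FROM outbox WHERE published = 0").fetchall()
    for row_id, event in rows:
        publish(json.loads(event))  # hand off to the event bus
        conn.execute("UPDATE outbox SET published = 1 WHERE id = ?", (row_id,))
    conn.commit()

enroll("Ada")
relay(print)  # print stands in for a real event-bus client
```

If the process crashes before `relay` runs, the event is still safely in the outbox and will be published on the next poll, which is exactly the reliability the pattern buys.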



Quote for the day:



"Leverage is the ability to apply positive pressure on yourself to follow through on your decisions even when it hurts." -- Orrin Woodward


Daily Tech Digest - October 28, 2019

A Century of Healthcare Data

Firstly, there is the length of time that patient data will have to be preserved. People are now living longer than ever before, and current UK legislation states that GP records must be retained for ten years after the death of a patient. This means healthcare data being created today may need to be kept on file for up to 100 years or more. Secondly, the rate of technological development means this data may also have to be migrated between formats multiple times over its lifespan – which is both labour-intensive and expensive. Data storage technology and organisational priorities will continue to evolve, while the data itself will typically come from various sources. As a result, healthcare organisations will face a huge amount of complexity when it comes to preserving data and making it accessible, on top of the growing costs involved as data scales. Medical organisations therefore need to ensure that their storage infrastructure provides the highest possible scalability, flexibility and portability – especially with data volumes becoming so vast that just migrating data from one format or provider to another can require significant investment. 


UN, UNICEF, Red Cross officials targeted in recent phishing campaign

"We can't speculate on attribution," Jeremy Richards, principal security researcher at Lookout, told ZDNet in an email this week. "The motive of the attack is to compromise Okta and Microsoft credentials to gain access to these accounts, which could be used for further attacks or intelligence gathering." A member of a human rights advocacy group told this reporter in an encrypted chat this week that organizations such as his or the ones listed in the Lookout report are often the targets of all sorts of groups. State-sponsored groups want to breach human rights organizations to learn of any ongoing investigations, to track local or abroad whistleblowers, or gain intelligence on organization members to harass them at later points. Similarly, human rights groups are also regularly targeted by regular financially-motivated hackers, such as BEC (business email compromise) scammers, who want to hijack payments or steal funds. "It's no difference to them if we're a hardware vendor or NGO. All they want is the money," our source told us.


The best free photo editor 2019

There are dozens of free photo editors out there, so we've hand-picked the very best so you can make your pictures look amazing without paying a penny. Of course, if you're able to wait until Black Friday and Cyber Monday, you'll almost certainly be able to find a great deal on a premium photo editor like Adobe Photoshop, but there's plenty of choice out there if you can't wait that long. We've spent hours putting a huge range of photo editors to the test, and picked out the best ones for any level of skill and experience. From powerful software packed with features that give Photoshop a run for its money to simple tools that give your pictures a whole new look with a couple of clicks, there's something for everyone. Many free photo editors only offer a very limited selection of tools unless you pay for a subscription, or place a watermark on exported images, but none of the tools here carry any such restrictions. Whichever one you choose, you can be sure that there are no hidden tricks to catch you out.


AI Policies Are Setting Stage To Transform Healthcare But More Is Needed

The President signed an Executive Order in February 2019 setting the tone for improved data connectivity and stronger public-private partnerships to spark new products in the marketplace and inspire entrepreneurs. It highlights the need for better ways to connect the vast amounts of data that need to be sorted and ultimately harnessed for patients’ benefit. The Initiative mandates that heads of government research agencies like the National Institutes of Health (NIH) develop and identify new AI programs, explore collaborations with the private sector and help train new generations of data scientists. AI hungers for data, and the Initiative helps focus efforts on better methods to connect the countless dark pockets that are inaccessible or hoarded by some organizations. Connectivity is a considerable problem. Healthcare data is expected to double three times each year, leading to zettabytes of information which is utterly impossible to process using historical standards.


Why good strategies fail

Much has been written in management books, white papers, and news articles about how to craft a winning strategy. Scholars, strategy executives, management consultants, and business gurus alike, all have a formula for how to identify opportunities to advance an organization’s aspiration, architect a plan of attack, orchestrate resource allocation, and coordinate execution of priority initiatives. Developing a well-crafted strategy takes time, effort, money, intellectual commitment, and political capital. If you have ever led or participated in a strategic planning process, you know the drill. But what happens when your strategy doesn’t work as intended? What happens when your strategy falls short of delivering the expected results? The question of "why isn’t my strategy working?" is asked more often than many executives would like to admit. Yet, there is very little in the strategy literature that can help companies troubleshoot their strategy execution, isolate the causes of friction, and deploy mitigating and corrective actions. In this article, we aim to bridge that gap. We explore three critical strategic tensions—incoherence, incongruence, and inconsistency—their root causes, how to identify them, and how to make sure that they don’t prevent your strategy from realizing its full potential.


Psst! Wanna buy a data center?

Since then there have been numerous sales of data centers under better conditions. There are even websites that list data centers for sale. You can buy an empty building, but in most cases you get the equipment, too. There are several reasons why, the most common being that companies want to get out of owning a data center. It's an expensive capex and opex investment, and if the cloud is a good alternative, that's where they go. But there are other reasons, too, said Jon Lin, president of the Equinix Americas office. He said enterprises have overbuilt because their initial long-term forecasts did not pan out, partially driven by increased use of cloud. He also said there is an increase in the number of private equity and real estate investors interested in diversifying into data centers. ... Enterprises do look to sell their data centers, but it's a more challenging process. She echoes what Lin said about the problem with specialty data centers. "They tend to be expensive and often in not great locations for multi-tenant situations. They are often at company headquarters or the town where the company is headquartered. So they are hard to sell," she said.


How 5G Will Revolutionise Retail Payments


The launch of 5G will expand internet access; global internet penetration currently stands at only 49%. This will bring more online consumers worldwide and enable even faster websites: broken down, 5G offers a 10X decrease in latency and up to 100X more network efficiency. These advances will allow easier online shopping experiences for an even broader spectrum of digital consumers. In fact, Adobe reports that 5G will boost e-commerce revenue by $12 billion by 2021. Offering mobile-adapted e-wallets will prepare retailers to take advantage of this trend. With 5G, consumers will truly be able to pay and shop wherever and whenever they want, with little friction, and receive instant confirmation of their purchases. Merchants should see a boost in revenue from even more seamless mobile shopping. The combination of merchant and shopper apps with faster 5G speeds will naturally move consumers toward mobile commerce.


The rise of the platform economy in financial services

Industry 4.0 promises to usher in a new era of platform players delivering products and services tailored to the personalized needs of customers throughout their lives. So what’s the perspective of Xavier Gomez @Xbond49 on this new era? “PSD2 rules clearly push the banking sector to renovate the customer relationship (B2B and B2C) at lower cost. The first wave of APIs in Europe was disappointing in terms of the quality of data provided. Why? Banks produce a lot of data, but they do not know how to use and leverage it, unlike the GAFA. Open banking is an opportunity to build new banking services that are customizable to customers thanks to the “platformization” concept. Banks can apply a digital transformation policy by rebuilding a new core banking IT system (open source), by integrating fintech solutions, and by collaborating with start-ups.”


Nasty PHP7 remote code execution bug exploited in the wild

The vulnerability is a remote code execution (RCE) flaw in PHP 7, the newer branch of PHP, the most common programming language used to build websites. The issue, tracked as CVE-2019-11043, lets attackers run commands on servers just by accessing a specially crafted URL. Exploiting the bug is trivial, and public proof-of-concept exploit code was published on GitHub earlier this week. "The PoC script included in the GitHub repository can query a target web server to identify whether or not it is vulnerable by sending specially crafted requests," says Satnam Narang, Senior Security Response Manager at Tenable. "Once a vulnerable target has been identified, attackers can send specially crafted requests by appending '?a=' in the URL to a vulnerable web server." ... But there are also website owners who cannot update PHP or can't switch from PHP-FPM to another CGI processor due to technical constraints.
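For site owners who cannot upgrade PHP immediately, a commonly cited stopgap for this class of bug is to have nginx verify that the requested script actually exists on disk before handing the request to PHP-FPM. A minimal sketch of that mitigation follows; the socket path and directive layout are placeholder assumptions and will differ per deployment:

```nginx
location ~ [^/]\.php(/|$) {
    # Refuse requests whose target script does not exist on disk,
    # so crafted PATH_INFO payloads never reach PHP-FPM.
    try_files $uri =404;

    fastcgi_split_path_info ^(.+?\.php)(/.*)$;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/run/php/php-fpm.sock;  # placeholder socket path
}
```

The `try_files $uri =404;` line is the key addition; the rest mirrors a typical PHP-FPM location block. This reduces exposure but is not a substitute for applying the patched PHP release.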


Facebook alters video to make people invisible to facial recognition


Facebook’s approach pairs an adversarial autoencoder with a classifier network. As part of training the network, researchers tried to fool facial recognition networks, Facebook AI Research engineer and Tel Aviv University professor Lior Wolf told VentureBeat in a phone interview. “So the autoencoder is such that it tries to make life harder for the facial recognition network, and it is actually a general technique that can also be used if you want to generate a way to mask somebody’s, say, voice or online behavior or any other type of identifiable information that you want to remove,” he said. Like faceswap deepfake software, the AI uses an encoder-decoder architecture to generate both a mask and an image. During training, the person’s face is distorted and then fed into the network. The system then generates distorted and undistorted images of a person’s face as output, which can be embedded into video.
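The mask-plus-image output can be illustrated with a toy sketch. This is not Facebook's actual model: the layer shapes, the per-pixel blending rule, and the untrained random weights are all illustrative assumptions; a real system would train the weights adversarially against a face-recognition network.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MaskingAutoencoder:
    """Toy encoder-decoder that emits both a generated image and a
    blending mask, mirroring the two-headed output described above."""

    def __init__(self, dim=64, latent=16):
        # Random, untrained weights purely for illustration
        self.enc = rng.normal(0.0, 0.1, (dim, latent))
        self.dec_img = rng.normal(0.0, 0.1, (latent, dim))
        self.dec_mask = rng.normal(0.0, 0.1, (latent, dim))

    def forward(self, x):
        z = np.tanh(x @ self.enc)            # encode input to latent code
        recon = sigmoid(z @ self.dec_img)    # generated (de-identified) image
        mask = sigmoid(z @ self.dec_mask)    # per-pixel blend weights in (0, 1)
        # Masked regions take the generated image; the rest keeps the input
        out = mask * recon + (1.0 - mask) * x
        return out, mask

model = MaskingAutoencoder()
face = rng.uniform(0.0, 1.0, 64)   # stand-in for a flattened face image
out, mask = model.forward(face)
```

The output is a convex combination of the input and the generated image, so identity-revealing regions can be replaced while the rest of the frame passes through untouched.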



Quote for the day:


"Being honest and open is the only way to convince cynical employees that you truly want to establish a partnership with them." -- Florence M. Stone