Daily Tech Digest - November 03, 2019

Can a Smart Light Bulb Steal Your Personal Data?


Jadliwala believes that smart bulbs may be poised to become an even more attractive target for data privacy exploitation, even though they are embedded with very simple chips. Smart bulbs connected directly to a home network, rather than to a smart home hub (a centralized hardware or software device through which other IoT products communicate with each other), are especially easy to target. If these bulbs are infrared-enabled, hackers can send commands via the invisible infrared light the bulbs emit. These commands can be used to hack into other IoT devices on the home network and steal data. Moreover, the victim would likely not notice such an attack, because the commands are transmitted within the owner's home Wi-Fi network, where Internet-based security systems might not detect them. Jadliwala says smart bulbs connected to dedicated home hubs are currently the safer alternative because they do not access any Wi-Fi networks, but he also believes smart bulb manufacturers will have to ramp up their security measures to limit the level of access such bulbs have to other smart home appliances within a home system.



How Artificial Intelligence Will Take the Industrial Internet of Things to New Heights


The idea is simple in the industrial sector as well: make industrial machines better than humans at analyzing data in real time, forming the basis for faster and better decisions. A connected machinery system of this capability ensures that management can pick up errors or inefficiencies in the system, formulate better solutions and implement them faster. Making industrial processes smarter with IIoT also brings great environmental benefits to the table: better quality control, eco-friendliness, sustainability and better industrial waste management. IIoT also helps in supply chain management, the entire process of converting raw material into a product and moving it from the point of origin to the point of consumption. In the industrial sector, predictive maintenance and analytics are not possible without proper IIoT infrastructure, nor are enhanced asset tracking and energy management for better power utilization. IIoT manages and controls all these processes with an integrated system of smart, intelligent devices, ensuring reliable maintenance and management with less dependence on human action.



4 Ways to Ensure a Successful Analytics Tool Integration


Data is the foundation of analytics and decision-making, and analytics is about making sense of the available data. A stable CRM platform is necessary before deploying advanced analytics: if the quality of the data inside the systems isn't good, the results will be unpredictable. Data used in analytics tools has to be current, usable, and actionable. The CRM system data may be sales-focused, but it might not be collecting the data needed by other departments. Striving for quick results may overshadow the need for higher data quality, accuracy, and reliability. Quality management and ethical data sourcing, entry, and retrieval should be combined with continual quality testing and improvement, which ultimately leads to increased value. Consider appointing chief data officers and chief analytics officers. Also, don't overlook the demand for security as privacy threats and public concern increase. Data analytics tools are helpful, but they are nothing without a strong data management team that draws data scientists from across the organization.


Overcoming The Barriers To Conversational AI

Conversational AI is an incredibly hard problem to solve. The advances made so far, however, have been nothing short of staggering. One of the first voice recognition devices was Shoebox, an IBM device introduced at the 1962 Seattle World's Fair that could recognise 16 spoken words. Currently, all major platforms are reporting recognition error rates below 5 per cent, which is more than enough to call voice recognition a viable technology. Of course, conversational AI is much more than just converting speech to words. In many ways, the real challenge comes after that. The device needs to understand the context of the conversation both at a global level (the user's ultimate goal) and within different stages of the conversation (the tasks to be achieved at each step of a process). This is where the current challenges lie. Advances have been rapid and impressive, but people are still reporting their frustration with chatbots and intelligent voice assistants because they are “just stupid” or they “don’t understand what I am asking”.
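Recognition error figures like the sub-5-per-cent rates above are usually stated as word error rate (WER): the word-level edit distance between what was said and what was transcribed, divided by the length of the reference transcript. A minimal sketch of the standard calculation (the function name and example strings are illustrative, not from any particular platform):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference word count,
    computed via Levenshtein distance over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dist[i][j] = edit distance between ref[:i] and hyp[:j]
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                             dist[i][j - 1] + 1,         # insertion
                             dist[i - 1][j - 1] + cost)  # substitution/match
    return dist[len(ref)][len(hyp)] / max(len(ref), 1)

# One wrong word out of four reference words → WER of 0.25
print(word_error_rate("turn on the lights", "turn on the light"))
```

On this metric, "below 5 per cent" means fewer than one word in twenty is transcribed incorrectly.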


AI May Not Kill Your Job—Just Change It


Fleming is optimistic about what AI tools can do for work and for workers. Just as automation made factories more efficient, AI can help white-collar workers be more productive. The more productive they are, the more value they add to their companies. And the better those companies do, the higher wages get. “There will be some jobs lost,” he says. “But on balance, more jobs will be created both in the US and worldwide.” While some middle-wage jobs are disappearing, others are popping up in industries like logistics and health care, he says. As AI starts to take over more tasks, and the middle-wage jobs start to change, the skills we associate with those middle-class jobs have to change too. “I think that it’s rational to be optimistic,” says Richard Reeves, director of the Future of the Middle Class Initiative at the Brookings Institution. “But I don’t think that we should be complacent. It won’t just automatically be OK.” The report says these changes are happening relatively slowly, giving workers time to adjust. But Reeves points out that while these changes may seem incremental now, they are happening faster than they used to.


Why The EU Is About To Seize The Global Lead On Cybersecurity

The European Commission has made cybersecurity a “high priority” and proposed that the cybersecurity budget for 2021-27 include €2 billion to fund “safeguarding the EU's digital economy, society and democracies through pooling expertise, boosting EU's cybersecurity industry, financing state-of-the-art cybersecurity equipment and infrastructure.” Additional funding will come from Horizon Europe, a €100 billion research and innovation program. The EU’s commitment is not just about the security of critical infrastructure and combating cybercrime. The EU has seen how America’s IT sector has driven the U.S. economy, and it wants part of the action. This desire is clearly at play throughout the EU Cybersecurity Act. The first sentence of the Act states, “Network and information systems and electronic communications networks and services play a vital role in society and have become the backbone of economic growth.” The EU is committed to becoming “a leader in the next-generation cybersecurity and digital technologies.”


IBM: AI will change every job and increase demand for creative skills


“As new technologies continue to scale within businesses and across industries, it is our responsibility as innovators to understand not only the business process implications, but also the societal impact,” said Martin Fleming, vice president and chief economist of IBM, in a statement. “To that end, this empirical research from the MIT-IBM Watson AI Lab sheds new light on how tasks are reorganizing between people and machines as a result of AI and new technologies.” With the rise of AI and automation, there has been growing debate and anxiety about how these trends will disrupt current job markets. While some have argued AI and automation will be job killers, others have said the emerging technology will be a net creator of new jobs. The IBM-MIT study offers a bit of nuance to that discussion. The researchers used machine learning to analyze 170 million U.S. job postings between 2010 and 2017. Out of more than 18,500 tasks that employees might be asked to perform, they found that the average number of tasks requested per job posting fell by 3.7 over those seven years. A drop, though hardly a radical one.


3 ways business intelligence can hurt your projects


The information gathered, analyzed, and reported is only useful if the individuals collecting it understand what they are looking for, why this information is relevant, where and how to search, and how to interpret the BI in a meaningful way. It is also essential to know who should have access to the information and how to deliver it in a timely manner. ... When planning for marketing-related projects, some companies, especially smaller ones, may only see financial and team-based performance data as valuable. Large numbers of customers or potential customers now interact online through social media, website content, and online advertising, which can play a significant role in trends, future activities, and spending habits. ... Over-restricting business intelligence can result in IT department resource overload, decreased cross-functional productivity, reduced employee satisfaction, a decreased sense of trust, and low morale. While it is vital to restrict access based on a user's role, it is equally important to ensure that teams have the power to access and report on information without being hindered by bureaucracy.


AI and Health Care Are Made for Each Other


AI could also reduce physician burnout and extend the reach of doctors in underserved areas. For example, AI scribes could assist physicians with clinical note-taking, and bots could help teams of medical experts come together and discuss challenging cases. Computer vision could be used to assist radiologists with tumor detection or help dermatologists identify skin lesions, and be applied to routine screenings like eye exams. All of this is already possible with technology available today or in development. But AI alone can’t effect these changes. To support the technical transformation, we must have a social transformation including trusted, responsible, and inclusive policy and governance around AI and data; effective collaboration across industries; and comprehensive training for the public, professionals and officials. These concerns are particularly relevant for health care, which is innately complex and where missteps can have ramifications as grave as loss of life.


Android bug lets hackers plant malware via NFC beaming

Last month, Google patched an Android bug that can let hackers spread malware to a nearby phone via a little-known Android OS feature called NFC beaming. NFC beaming works via an internal Android OS service known as Android Beam. This service allows an Android device to send data such as images, files, videos, or even apps, to another nearby device using NFC (Near-Field Communication) radio waves, as an alternative to WiFi or Bluetooth. Typically, apps (APK files) sent via NFC beaming are stored on disk and a notification is shown on screen. The notification asks the device owner whether to allow the NFC service to install an app from an unknown source. But, in January this year, a security researcher named Y. Shafranovich discovered that apps sent via NFC beaming on Android 8 (Oreo) or later versions would not show this prompt. Instead, the notification would allow the user to install the app with one tap, without any security warning. While the absence of a single prompt may sound unimportant, it is a major issue in Android's security model.



Quote for the day:


"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall


Daily Tech Digest - November 02, 2019

Creating an agile mind-set at PepsiCo


Employees are expected to continuously learn new skills. They are expected to question practices and reduce or eliminate habits that are no longer useful. Time is allotted each week for every employee to “upgrade” themselves, and a large catalog of training materials and classes is available. Underpinning this effort is a belief that time is the most democratic and precious resource and that people can make much better use of it to be more productive at work and have more time outside of work. That is why the company has found ways to give employees back some time to innovate and better serve customers during working hours. Then, with work at the office streamlined and a culture that encourages disconnecting from the workplace during off-hours, employees no longer feel like they have to take time away from family during evenings and vacations to address work issues. Employees are encouraged to look carefully at how they spend their time in the office.



Tim Cook thinks Apple customers are rich and very sensitive

Now, though, there appears to be a new divide. Those who have one pair of AirPods and those who have two. For particular occasions, that is. This week's Apple earnings call happened to coincide with the launch of the AirPods Pro -- elevated, noise-canceling versions of Apple's earrings-gone-wrong buds. CEO Tim Cook was moved to discuss these new apparitions and who would buy them. He offered: "We're anxious to see the customers for the new AirPod Pro. But I would guess that one, particularly in the early going, will be people that have AirPods today and want to also have a pair for the times they need noise cancellation." Please forgive me if I'm anxious to immerse myself in a vat of cooling coconut balm and hum my calming meditations. Apple's CEO believes his customers are so wealthy and so very sensitive that they will take time to consider: "Hmm, is this a moment when I want to shut the world out? Or would my central nervous system prefer to hear a few tinges of intonation from the world outside?"


What goes into a user story vs. use case for Agile development?


A user story provides a short descriptive sentence that outlines the who, what and why of one or a set of software requirements. User stories put context around interactions, which enables developers to focus their efforts on perspectives, features, functionality and results. ... User stories are not ideal for every software development discussion. While user stories are quick and simple, they are often devoid of technical detail; that leaves developers with no discussion of how to accomplish a task. There is no assessment of relative difficulty, accounting for resources like developer hours, or prioritization of one user story vs. another. Project managers often make these assessments during the planning phase of each iteration. ... Use cases generally provide more detail and a deeper understanding of functional behaviors to contextualize a software requirement. Use cases help development teams define or discuss user interface designs, database access or query processes, and API communications. Group use cases together to organize them for complex projects.
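The who/what/why structure described above is conventionally captured with the template "As a <role>, I want <goal> so that <benefit>." A minimal sketch of that structure as data (the class name, field names, and the sample story are illustrative, not from any particular tool):

```python
from dataclasses import dataclass


@dataclass
class UserStory:
    """One software requirement in the canonical who/what/why form."""
    role: str     # who wants it
    goal: str     # what they want
    benefit: str  # why they want it

    def render(self) -> str:
        # The standard user-story template sentence
        return f"As a {self.role}, I want {self.goal} so that {self.benefit}."


story = UserStory(
    role="warehouse manager",
    goal="to see low-stock alerts",
    benefit="I can reorder before we run out",
)
print(story.render())
```

A use case covering the same requirement would add the detail the story deliberately omits, such as preconditions, the step-by-step interaction flow, and alternate paths.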


Google CEO Sundar Pichai on achieving quantum supremacy

You would need to build a fault-tolerant quantum computer with more qubits so that you can generalize it better, execute it for longer periods of time, and hence be able to run more complex algorithms. But you know, if in any field you have a breakthrough, you start somewhere. To borrow an analogy—the Wright brothers. The first plane flew only for 12 seconds, and so there is no practical application of that. But it showed the possibility that a plane could fly. ... Google wouldn’t be here today if it weren’t for the evolution we have seen in computing over the years. Moore’s Law has allowed us to scale up our computational capacity to serve billions of users across many products at scale. So at heart, we view ourselves as a deep computer science company. Moore’s Law is, depending on how you think about it, at the end of its cycle. Quantum computing is one of the many components by which we will continue to make progress in computing. The other reason we’re excited is—take a simple molecule. Caffeine has 243 states or something like that.


Wireless noise protocol can extend IoT range

The on-off noise power communication (ONPC) protocol, as it’s called, works via a software hack on commodity Wi-Fi access points. Through software, part of the transmitter is converted into an RF power source, and elements of the receiver are turned into a power-measuring device. Noise energy created by the power source is encoded, emitted and picked up by the measuring setup at the other end. “If the access point, [or] router hears this code, it says, ‘OK, I know the sensor is still alive and trying to reach me, it’s just out of range,’” Neal Patwari of Washington University says in a Brigham Young University (BYU) press release. “It’s basically sending one bit of information that says it’s alive.” The noise channel is much leaner than the Wi-Fi one, BYU explains. “While Wi-Fi requires speeds of at least one megabit per second to maintain a signal, ONPC can maintain a signal on as low as one bit per second—one millionth of the data speed required by Wi-Fi.” That’s enough for IoT sensor housekeeping, conceivably. Additionally, “one bit of information is sufficient for many Wi-Fi enabled devices that simply need an on [and] off message,” the school says.
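What the article describes amounts to very slow on-off keying: the sensor toggles raw noise power in a pattern the access point knows, and the access point thresholds its measured channel power to recover that pattern. A toy sketch of the idea (the code pattern, threshold, and function names are assumptions for illustration, not the actual ONPC implementation):

```python
# Hypothetical liveness code the sensor repeats at 1 bit per second:
# 1 = emit noise power for one bit period, 0 = stay silent.
CODE = [1, 0, 1, 1, 0, 0, 1, 0]


def decode(power_samples, threshold=0.5):
    """Receiver side: threshold one power measurement per bit period into bits."""
    return [1 if p > threshold else 0 for p in power_samples]


def is_alive(power_samples, code=CODE, threshold=0.5):
    """The sensor is considered alive if the thresholded samples
    match the liveness code the access point expects."""
    return decode(power_samples, threshold) == code


# Noisy but clearly separated power readings still decode to the code.
print(is_alive([0.9, 0.1, 0.8, 0.7, 0.2, 0.1, 0.95, 0.3]))
```

In practice the real protocol must contend with other transmitters raising the noise floor, which is why a distinctive code, rather than a single power reading, is needed to tell the sensor apart from background interference.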


MIT-IBM Watson AI Lab: Robots will take over parts of your job, not all of it


Casey said business leaders and government officials should pay attention to recommendations from another researcher studying automation, Carl Benedikt Frey. Frey made an initial prediction about 47% of jobs being at high risk for automation and is quoted in the MIT-IBM research. As Frey stated in his initial automation research, business process and technology investment, regulatory concerns, political pressure, and social resistance will determine how automation affects jobs and wages. Frey's latest thinking is that the true concern is not about automation in general but that the revolution won't go far enough. The incomplete technology transformation will trap workers in a permanently unequal income distribution. If businesses only go so far toward automation, the full productivity benefit will not be realized. Casey said that the goal is to get to a point in the machine-learning revolution at which technology is creating new tasks and jobs for people to do. "What they're worried about is we'll get stuck at a place where there's nothing that could be transformative enough to create new tasks and create new jobs," he said. "What we want is sufficiently transformative tech that raises productivity enough so that new tasks emerge."


Google agrees to buy Fitbit in $2.1B deal to help boost Wear OS


While Fitbit's software is "solid," Greengart adds, "it will be interesting to see how long Google keeps Fitbit separate or if it tries to integrate its apps into Android." He notes that while Google's promise on user data is encouraging, the company's "users will have to trust that it stays that way." Alphabet released a sluggish financial report Monday, with $40.49 billion in sales, exceeding analysts' estimate of $40.32 billion, and earnings per share of $10.12, below the expected $12.42 per share. We can expect to see Fitbit's third-quarter earnings report on Nov. 6, the company said last month. Putting a dampener on the news for Google, however, House Antitrust Subcommittee Chair David Cicilline said later Friday that the acquisition announcement has triggered more antitrust concerns, as the tech giant's "dominance" is already being investigated. "By attempting this deal at this moment, Google is signaling that it will continue to flex and expand its power in spite of this immense scrutiny," Cicilline said in a statement. The acquisition would also give Google "deep insights into Americans' most sensitive information," including health and location data, according to Cicilline.


New 'unremovable' xHelper malware has infected 45,000 Android devices

Named xHelper, this malware was first spotted back in March but slowly expanded to infect more than 32,000 devices by August (per Malwarebytes), eventually reaching a total of 45,000 infections this month (per Symantec). The malware is on a clear upward trajectory. Symantec says the xHelper crew is making on average 131 new victims per day and around 2,400 new victims per month. Most of these infections have been spotted in India, the US, and Russia. According to Malwarebytes, the source of these infections is "web redirects" that send users to web pages hosting Android apps. These sites instruct users on how to side-load unofficial Android apps from outside the Play Store. Code hidden in these apps downloads the xHelper trojan. The good news is that the trojan doesn't carry out destructive operations. According to both Malwarebytes and Symantec, for most of its operational lifespan, the trojan has shown intrusive popup ads and notification spam. The ads and notifications redirect users to the Play Store, where victims are asked to install other apps -- a means through which the xHelper gang is making money from pay-per-install commissions.


The changing role of the enterprise architect


The need to focus on operational efficiency has diminished the EA’s role as a pan-organisation technology strategist. To address this and the needs of modern organisations, the current singular EA role must be devolved into its three component parts, eliminating the constraints it has experienced over the past decades, which have limited its strategic value to the organisation. These separate roles – strategist, engineer and custodian – must also reside and operate permanently in corporate strategy, the programme office, and the IT department, respectively. So how do we broadly define these roles? The strategist role acts as a positive change agent, assessing outlier and newly adopted technologies to propose how their use can serve the corporate leadership’s vision, at the start of a business strategy’s development. ... The engineer role is responsible for creating project technology designs that fit the business strategy. From the point of drafting to final design, the engineer consults with both the strategist and custodian roles.


Implement Agile IT Strategic Planning With Enterprise Architecture


In this modern age, digital transformation continues to be a priority for company executives. They know that artificial intelligence (AI), blockchain, the Internet of Things (IoT), and big data are driving their ability to improve customer experience, stay ahead of the competition and generate business growth. However, with IT teams entrenched in managing day-to-day technology, it is difficult for IT to stay abreast of the strategic discussions occurring at the business level and proactively plan for associated IT upgrades, modifications, or new systems. This disconnect can result in a lagging approach to IT planning, especially as business decisions are made in fast-moving agile environments. To remedy this, companies need a holistic approach that connects business and technology. Enterprise architecture (EA) is the key to this foundation, as it improves IT strategic planning by letting companies see precisely how IT systems support business objectives. An IT roadmap that is built on foundational enterprise architecture yet designed to realize business outcomes enables a company to assess the impact of change on the existing IT landscape and therefore quickly adjust as needed.



Quote for the day:


"Teamwork is the secret that makes common people achieve uncommon results." -- Ifeanyi Enoch Onuoha


Daily Tech Digest - November 01, 2019

Use Chrome? Update Your Browser Immediately


Users of Google's Chrome web browser are being urged to install the latest update immediately to patch two security vulnerabilities, one of which is already being exploited in the wild. As the National Cyber Security website reports, the two high-severity vulnerabilities, known as CVE-2019-13720 and CVE-2019-13721, are classed as "use-after-free" vulnerabilities. That means they allow a remote attacker to corrupt data in memory and then execute arbitrary code. In other words, they allow a PC to be hijacked. One of the vulnerabilities is in Chrome's audio component, while the other is in the PDFium library, which Chrome uses for PDF document generation and rendering. Kaspersky researchers Anton Ivanov and Alexey Kulaev have already detected the audio component compromise being used in the wild, hence the urgency for users to update. The latest version of Chrome, released today to fix the security vulnerabilities, is version 78.0.3904.87, and it's available for Windows, Mac, and Linux.



How powerful people slip


Studies have found, for example, that high levels of relative power often correspond with increased neural activity in the brain’s behavioral activation system (BAS). BAS is a pattern of neural circuits posited by psychologist Jeffrey Alan Gray in 1970 as an explanation for how the brain processes the experience of rapid reward. Nestled deep in the brain, these circuits include the basal ganglia and parts of the prefrontal cortex. They have been known to release the neurotransmitter dopamine, associated with pleasure. If you are a leader, the increase in BAS activity produced by the power of your role can make you more effective in noticeable ways — specifically, by increasing your attention to goal-relevant information, your comfort with innovation and risk taking, and your ability to think at a visionary level. Gray and subsequent psychologists have also posited that when the BAS is engaged, another system, called the behavioral inhibition system (BIS), tends to be more idle. The BIS, generally associated with the brain’s septohippocampal system, is linked to feelings of anxiety, sensitivity to punishment, frustration, and risk aversion.


Why HR professionals need to adapt to new technology


HR technologies are being invested in, now more than ever, by a myriad of businesses. The 2019 HR Technology Market Report outlines some key findings on this front – investments into HR technology have increased by 29 per cent, resulting in the market for HR technologies growing by a noteworthy 10 per cent. Also highlighted were new trends towards artificial intelligence, a shift away from engagement towards productivity in core systems, and the recognition of the role the gig workforce plays. Artificial intelligence is far more than a buzzword designed to impress the board of directors when it comes to HR. Unilever offers a prime example of this – given that it recruits upwards of 30,000 individuals a year, it should come as no surprise that a significant amount of capital and manpower must be devoted to sifting through applications to identify the best people for the job. This changed dramatically with its AI-powered solution: partnering with Pymetrics, the business developed a platform that would test the candidate’s aptitude, and even process 30-minute interview videos, using natural language processing and body language analysis to assess their suitability for a given role.


As devices generate more data, AI is becoming indispensable for medtech


While technology companies often have sophisticated AI capabilities, medtech companies have deep expertise in the clinical development of medical algorithms, such as translating data from an EKG lead into meaningful output that a physician can use. This clinical expertise and credibility with physicians could be useful to potential consumer tech partners. Moreover, consumer technology companies’ data science and AI expertise, combined with medtech’s ability to develop meaningful medical applications and algorithms, could lead to powerful offerings that will improve patient health. ... Regulators are working to develop regulatory guardrails as AI applications take off in medtech. Earlier this month, the US Food and Drug Administration (FDA) released a draft framework detailing the types of AI/machine learning-based algorithm changes in medical devices that might be exempt from pre-market submission requirements. As part of the Consumer Technology Association’s AI initiative, AdvaMed, Google, Doctor On Demand and other organizations will work to develop standards and best practices for AI use cases in medicine and health.


How 5G Will Drive The Future Of Industry 4.0


5G can also assist manufacturers in optimising their operations by using IoT sensors to monitor the performance of equipment and workers so improvements in working processes can be identified. In fact, research from IDC found that IoT technology can boost productivity in the supply chain by 15%. Utilising IoT-based monitoring can also enable predictive maintenance, reducing overall maintenance costs by up to 30%, says Accenture. What’s more, the incredibly low latency offered by next-generation connectivity can enable remote operation of equipment. This enables automation of machinery and the use of untethered robots, helping to make factories safer. 5G infrastructure can also help unlock actionable insights from the vast amounts of data generated by the ever-growing number of connected devices. Data analytics can bring operational efficiencies and cost savings while logistics can also be enhanced with real-time tracking data. Many manufacturing businesses will make use of private, on-premise 5G networks.


Agile and late! End-to-end Delivery Metrics to Improve your Predictability


The key delivery metrics require surfacing data from a myriad of sources, including work-flow management tools, code repos, and CI/CD tools, as well as collecting quantitative feedback from the engineering team themselves (via collaboration hubs). The complexity of the data and the multiple sources make this sort of data collection very time-consuming to do manually; doing it at scale really requires an end-to-end delivery metrics platform. Delivery metrics platforms are available that consist of a data layer to collate and compile metrics from multiple data sources and a flexible UI layer that enables the creation of custom dashboards to surface the metrics in the desired format. ... Done well, Root Cause RAG Reports can be a really effective means of presenting our (more accurate) forecasts in a way that stakeholders can understand, and therefore can be an important step in reducing lateness and bringing the technology team and the internal client much closer together. As discussed, however, this relies on an understanding of the metrics that actually determine project lateness and a means of collecting those metrics.


Open source technology, enabling innovation


Open source allows people to collaborate and promotes a meritocracy of ideas. Kubernetes helps companies harness processing power and run their software more efficiently no matter how many machines they have and no matter how many competing cloud services they’re using. This is especially useful for companies without a refined IT service, as it makes managing commercial software cloud servers much less of a headache. These abilities are all underpinned by open source code, so a company can build a system tailored to its needs, one that will evolve as the company becomes more successful and expands its operations. Originally open sourced by Google in 2014, Kubernetes has remained relevant technology because of the open source community that supports it, and it’s consistently one of the top projects on GitHub, the code-hosting platform developers use to store and manage code. Twitter, Huawei, Intel, Cisco and IBM are just some of the businesses that have been involved in its development over the years, thanks to the fact that Google donated it to the Cloud Native Computing Foundation, a collective of open source development advocates.


10 tips for effective change management in Agile

Non-Agile methodologies make an implicit assumption that requirements are final and that a change management process can accommodate only minor variations in them. Design requirements, also called acceptance criteria, are subject to constant, planned change in Agile iterations. Agile enables product managers to demonstrate working software and elicit customer feedback. If the user needs aren't met, the product owner and developers make change requests to the application code, and possibly alter the delivery schedule. Thus, change management is an inherent part of the Agile software development process. The ability to demo working applications means you can design for customer expectations. Rather than create and develop an application workflow based on only written requirements or feature descriptions, keep the customer informed of the application and its functionality. If a development team spends six months working on an app and delivers it on time to the customer, that's a good thing -- as long as that application aligns with the customer's expectations. If it doesn't meet user needs, the delivery is not successful. Keep the customer in the loop and manage requirement changes accordingly for long-term application success.


Big Four carriers want to rule IoT by simplifying it

The carriers’ approach to the IoT market is two-pronged: they sell connectivity services directly to end users, and they sell connectivity wholesale to device makers. For example, one customer might buy a bunch of sensors directly from Verizon, while another might buy equipment from a specialist manufacturer that contracts with Verizon to provide connectivity. There are, experts agree, numerous advantages to simply handing off the wireless networking of an IoT project to a major carrier. Licensed networks are largely free of interference – the carriers own the exclusive rights to the RF spectrum being used in a designated area, so no one else is allowed to use it without risking the wrath of the FCC. In contrast, a company using unlicensed technologies like Wi-Fi might be competing for the same spectrum with half a dozen other organizations. Licensed connectivity is also better secured than most unlicensed technologies – or at least easier to secure – according to Shawn Chandler, former chair of the IEEE’s IoT smart cities working group. Running connectivity that has to be managed and secured in-house can be a lot more work than letting one of the carriers take care of it.


Should you go all-in on cloud native?

The second school of thought is that we might add too much complexity by going all-in on native. Although there are advantages, moving to Kubernetes-native systems means having at least two of everything. Enterprises moving to Kubernetes-driven, container-based applications are looking for a common database system, one that spans applications inside and outside Kubernetes. The same goes for security, raw storage, and other systems that may be native to the cloud, but not to Kubernetes. What’s the correct path? One of the lessons I’ve learned over the years is that best-of-breed and fit-to-purpose technology is typically the right way to go. This means native everything and all-in native, but you still need to be smart about picking solutions that will work longer term, native or not. Will there be more complexity? Of course, but that is really the least of your worries, considering the movement to multiclouds and IoT-based applications. Things will get complex out there whether you’re using a native Kubernetes solution or not. We might as well get good at complexity and do things right the first time.



Quote for the day:


"Real leaders are ordinary people with extraordinary determinations." -- John Seaman Garns


Daily Tech Digest - October 31, 2019

What the Google vs. IBM debate over quantum supremacy means

The debate is over what it means when you run an actual quantum computer, such as Sycamore, and compare it to a simulation of that quantum computer inside a classical, electronic computer. Quantum simulation software, such as Microsoft's LIQUi|⟩ program, allows a traditional computer to represent a quantum computer in ordinary circuitry by translating quantum mechanics into mathematical structures known as matrices of complex numbers (numbers with both real and imaginary parts). With simulations, it's possible to compare how long a real quantum circuit takes to produce a given computation with how long a classical computer takes to reproduce the same computation by running the matrix math that mimics the functions of the quantum circuit. Google and IBM are both looking at such simulations, and they're taking different views as to what the comparison means. Google's point is that Sycamore is a device that does the work it takes millions of conventional processors to simulate.
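The "matrices of complex numbers" idea can be shown with a tiny pure-Python sketch. This illustrates the general technique of simulating a quantum circuit classically, not LIQUi|⟩ itself:

```python
# Illustrative sketch: simulating a single qubit classically by representing
# a gate as a 2x2 matrix of complex numbers acting on a state vector.
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-element complex state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: turns a definite |0> state into an equal superposition.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

ket0 = [1 + 0j, 0 + 0j]                    # the |0> basis state
superposed = apply_gate(H, ket0)
probabilities = [abs(a) ** 2 for a in superposed]
# Each measurement outcome now has probability ~0.5
```

A real simulation scales these matrices up exponentially with the number of qubits, which is why classical reproduction of a 53-qubit circuit is so costly.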


Why organizations feel vulnerable to insider attacks


A full 43% of the respondents cited phishing attacks that trick employees into sharing sensitive company information. Some 24% pointed to weak passwords, 15% referred to spear-phishing attacks targeted to specific individuals, and 15% cited orphaned accounts. Data leakage or theft is always a concern for security professionals both from outside and inside the company. Asked which type of data is most vulnerable to insider attacks, 63% of the respondents pointed to customer data, 55% to intellectual property, and 52% to financial data. ... Insider attacks pose enough of a concern that most organizations do have certain tools in place to deal with them. Some 68% of those surveyed said they feel anywhere from moderately to extremely vulnerable to insider attacks. While 49% said they feel they have the right controls to prevent an insider attack, 28% said they do not, and 23% said they were not sure. Most of the respondents use some type of analytics to determine insider threats with 32% relying on activity management and summary reports, 29% on user behavior analytics, 28% on data access and movement analytics, and 14% on predictive analytics.


North Korean malware detected in India's Kudankulam nuclear facility

The Nuclear Power Corporation of India (NPCIL) admitted yesterday that one of the computers at its Kudankulam nuclear power plant (KKNPP) had been attacked by malware. The malware, however, did not affect the critical internal network of the plant, NPCIL claimed; the company confirmed the attack only after earlier issuing strong denials. "Identification of malware in NPCIL system is correct," A.K. Nema, Associate Director and Appellate Authority, NPCIL, belatedly admitted in a statement. "The matter was conveyed by CERT-In [Indian Computer Emergency Response Team] when it was noticed by them on 4 September 2019," he added. According to Nema, the matter was investigated by DAE cyber security specialists, who found that the compromised computer was connected to the internet and was being used only for administrative work. He also said the infection was isolated from the critical internal network of the plant. A day earlier, KKNPP senior official R Ramdoss had rejected social media reports claiming that domain controller-level access at KKNPP had been compromised.


Four principles for security metrics

Metrics come into their own when they act first as a tool to help people understand what’s going on and what they need to do to improve, and then as a way to track progress and measure success. Start by creating one metric per process – then, if that metric goes out of tolerance, you’ll have a clear idea of how to address it. For example, “number of high-severity vulnerabilities” is not easily actionable: if it goes beyond your tolerance, what action do you take? It’s affected by multiple processes as well as circumstances beyond your control. As metrics are developed and you get new views on what your data is telling you, you’ll likely uncover things that are slipping between the cracks of existing processes. If you have a metric that tracks the current status of a given process, but you require several projects to deal with legacy issues that existed before the process was created or updated, there is no harm in splitting that historical data out and tracking those remediation projects separately. This will help you avoid the situation where your metric confuses good current performance with past problems that are now under management via a different – and possibly longer-term – process.
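The one-metric-per-process idea, with legacy remediation tracked separately, can be sketched as follows; the process names, values, and tolerances are hypothetical:

```python
# Toy sketch (hypothetical metrics): one metric per process with its own
# tolerance, so an out-of-tolerance reading points at a single process
# owner and a clear corrective action.

def check(metric_value, tolerance):
    """Return 'OK' or 'OUT_OF_TOLERANCE' for a single process metric."""
    return "OK" if metric_value <= tolerance else "OUT_OF_TOLERANCE"

# Current-process performance is tracked against tolerances; legacy debt is
# split out as a remediation project so it can't mask current performance.
current = {"patch_latency_days": (9, 14)}          # (value, tolerance)
legacy_projects = {"legacy_vulns_remaining": 120}  # tracked separately

statuses = {name: check(value, tol) for name, (value, tol) in current.items()}
# statuses == {"patch_latency_days": "OK"}
```

The point of the split is that `patch_latency_days` stays green on today's work even while the separate legacy count is still being burned down.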


The 10+ Most Important Job Skills Every Company Will Be Looking For In 2020


There's no shortage of information and data, but individuals with the ability to discern what is trustworthy amid the abundant mix of misinformation – fake news, deep fakes, propaganda, and more – will be critical to an organization's success. Critical thinking doesn’t imply being negative; it’s about being able to objectively evaluate information, how it should be used, and even whether it should be trusted by an organization. Employees who are open-minded, yet able to judge the quality of the information inundating us, will be valued. ... Technical skills will be required by employees doing just about every job, since digital tools will be commonplace as the 4th industrial revolution impacts every industry. Artificial intelligence, Internet of Things, virtual and augmented reality, robotics, blockchain, and more will become a part of every worker's everyday experience, whether the workplace is a factory or a law firm. So not only do people need to be comfortable around these tools, they will need to develop skills to work with them.


How schools can better protect themselves against cyberattacks


Schools often are more vulnerable to cyberattacks in comparison with larger companies and enterprises, and for a variety of reasons. Many school districts may have only one or two IT people to serve the entire district, so the staffers are spread thin. Budget constraints have affected many schools, limiting the amount of money they can spend on security solutions. Most schools likely have the necessary security set up on individual computers and even the overall network. But comprehensive perimeter protection may not be in place, potentially leading to data breaches and malware hosted on the school's website. Young students don't necessarily have the skills or training to adequately identify phishing emails and other threats, so such attacks are often more successful. The number of tablets and other devices issued by schools has increased in recent years and because of that, students may use those devices on outside networks that aren't secure, thereby raising the risk of infection. Even in the face of budget constraints and other limitations, schools should have adequate security measures in place to protect themselves, their data, and their students from security threats.


AI’s ‘most wanted’: Which skills are adopters most urgently seeking?


When we compare companies with relatively little AI experience (they’ve built five or fewer production systems) with those possessing extensive AI experience (they’ve built 20 or more production systems), we observe an interesting shift in “most wanted” roles (see chart). Early on, AI researchers are the most sought-after, with about a third of the less-experienced rating them a top-two needed role. Business leaders rank near the bottom. By the time adopters have become highly experienced at building AI solutions, business leaders have bubbled to the top, and AI researchers have sunk almost to the bottom. What can we make of this curious flip? Many companies embarking on AI initiatives may feel they need to hire AI superstars—researchers with advanced degrees who can invent new AI algorithms and techniques—to spearhead their efforts. By the time organizations have amassed a lot of AI experience, they may have filled their ranks with enough of these brilliant experts. At that stage, they’re eager to find business leaders who can play the crucial “translator” role: figuring out what the results from AI systems mean, and how they should factor into business decisions and actions.


The Role of CIO and How It is Being Rewritten


Digitization spurs new priorities – alongside a full slate of historical departmental responsibilities. In enterprise tech speak, leverage simply means using tools, systems or techniques to convert relatively small effort into significantly greater output. Digital transformation won't fit nicely alongside traditional management processes or cleanly under one leader’s org chart. IT leaders have to get more – more out of themselves, their teams and their dollars – to succeed in the new enterprise era. One industry survey reports that at least 84% of top CIOs are now responsible for areas outside of traditional IT, the most common being innovation and transformation. Further research reveals that 95% of CIOs expect digitization to change or remix their job. Regardless of where or how new responsibilities intersect business and technology, IT will have a key role to play. Yet, without suitable systems leverage, the CIO position is challenging. How else can they handle the burden of IT consumerization, mobile workforces, big-data challenges, shadow IT and cost management?



AI capabilities power global IoT adoption

AIoT is defined as decision making aided by AI technologies in conjunction with connected IoT sensor, system or product data. AI technologies include deep learning, machine learning, natural language processing, voice recognition and image analysis. According to the survey, 34% of respondents said increasing revenue is the top goal for using AIoT. Improving the ability to innovate (17.5%), offering customers new digital services (14.3%) and decreasing operational costs (11.1%) were all key goals. Intel Americas chief data scientist Melvin Greer says AI and IoT are no longer separate technologies. “AI closes the loop in an IoT environment where IoT devices gather or create data, and AI helps automate important choices and actions based on that data,” explains Greer. “Today, most organisations using IoT are only at the first ‘visibility’ phase where they can start to see what’s going on through IoT assets. But they’re moving toward the reliability, efficiency and production phases, which are more sophisticated and require stronger AI capabilities.”
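Greer's "AI closes the loop" description can be illustrated with a toy sketch; the sensor values, threshold, and action names are hypothetical stand-ins for a real model and device fleet:

```python
# Toy closed-loop AIoT sketch (hypothetical thresholds and devices):
# IoT sensors gather data, a simple decision rule stands in for the AI,
# and an action is automated based on that data.

def decide(readings, limit=80.0):
    """Flag overheating if the mean of recent readings exceeds the limit."""
    mean = sum(readings) / len(readings)
    return "throttle" if mean > limit else "run"

temps = [78.0, 83.5, 85.2, 84.1]   # simulated temperature sensor stream
action = decide(temps)
# action == "throttle": the mean (~82.7) is above the 80.0 limit
```

Moving from the "visibility" phase to the later phases Greer describes amounts to replacing the fixed rule here with learned models that improve reliability and efficiency decisions.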


Defense Innovation Board unveils AI ethics principles for the Pentagon

Applied Inventions cofounder and computer theorist Danny Hillis and board members agreed to amend the draft document to say the governable principle should include “avoid unintended harm and disruption and for human disengagement of deployed systems.” The report, Hillis said, should be explicit and unambiguous that AI systems used by the military should come with an off switch for a human to press in case things go wrong. “I think this was the most problematical aspect about them because they’re capable of exhibiting and evolving forms of behavior that are very difficult for the designer to predict, and sometimes those forms of behavior are actually kind of self preserving forms of behavior that can get a bit out of sync with the intent and goals of the designer, and so I think that’s one of the most dangerous potential aspects about them,” he said. The Defense Innovation Board is chaired by former Google CEO Eric Schmidt, and members include MIT CSAIL director Daniela Rus, Hayden Planetarium director Neil deGrasse Tyson, LinkedIn cofounder Reid Hoffman, Code for America director Jennifer Pahlka, and Aspen Institute director Walter Isaacson.



Quote for the day:


"It's not about how smart you are--it's about capturing minds." -- Richie Norton


Daily Tech Digest - October 30, 2019

Automation projects: A good time to switch vendors?

Many network infrastructure vendors are developing automation technology aimed primarily, if not solely, at their own products rather than at multi-vendor environments. While most enterprises use two or three different automation tools in their initiatives, 42 percent say that an automation tool aimed at a single vendor is part of their strategy. In fact, 26 percent said a single-vendor automation tool is the most important part of their automation technology strategy. ... The most important zero-touch provisioning (ZTP) feature, according to EMA’s survey, is software-image auto-updates and verification. Many enterprises are also interested in being able to custom-provision and configure devices via scripts, and in the ability to unify ZTP network provisioning with compute and storage infrastructure in data centers. Not every network vendor offers embedded ZTP features on its platforms, and most only offer them on their latest-generation products. Enterprises with older equipment may switch to a new vendor during a refresh, and ZTP features may be a contributing or leading driver of that vendor switch.


Joker's Stash Lists 1.3 Million Stolen Indian Payment Cards

Group-IB, which has analyzed the cards listed for sale, says more than 98 percent appear to have been issued by Indian banks, with a single bank accounting for more than 18 percent of all of the dumps. About 1 percent of the cards appear to have been issued by Colombian banks. What's unusual about this sale is that so many payment cards have been uploaded at once. "Databases are usually uploaded in several smaller parts at different times," says Ilya Sachkov, CEO and founder of Group-IB, which was originally headquartered in Moscow. Also unusual is the sheer scale of what's being offered in one go. "This is indeed the biggest card database encapsulated in a single file ever uploaded on underground markets at once," he says. "What is also interesting about this particular case is that the database that went on sale hadn't been promoted prior either in the news, on card shop or even on forums on the dark net. The cards from this region are very rare on underground markets. In the past 12 months, it is the only one big sale of card dumps related to Indian banks."


Kubernetes vs. Docker: Understand containers and orchestration
Containers are designed chiefly to isolate processes or applications from each other and the underlying system. Creating and deploying individual containers is easy. But what if you want to assemble multiple containers—say, a database, a web front-end, a computational back-end—into a large application that can be managed as a unit, without having to worry about deploying, connecting, managing, and scaling each of those containers separately? You need a way to orchestrate all of the parts into a functional whole. That’s the job Kubernetes takes on. If containers are passengers on a cruise, Kubernetes is the cruise director. Kubernetes, based on projects created at Google, provides a way to automate the deployment and management of multi-container applications across multiple hosts, without having to manage each container directly. The developer describes the layout of the application across multiple containers, including details like how each container uses networking and storage. Kubernetes handles the rest at runtime. It also handles the management of fiddly details like secrets and app configurations.
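The "describe the layout, Kubernetes handles the rest" idea can be modeled with a toy desired-state reconciler. This is a drastically simplified sketch of the concept, not the real Kubernetes control plane or its API:

```python
# Toy model of Kubernetes-style reconciliation: the developer declares the
# desired state, and a loop drives the actual state toward it.
desired = {"web": 3, "db": 1}      # replicas the application declares
actual = {"web": 1}                # what's currently running

def reconcile(desired, actual):
    """One reconciliation pass: start or stop containers to match the spec."""
    for name, want in desired.items():
        have = actual.get(name, 0)
        if have != want:
            actual[name] = want    # stand-in for launching/stopping containers
    return actual

reconcile(desired, actual)
# actual == {"web": 3, "db": 1}: the system converged on the declared layout
```

In real Kubernetes the declaration also covers networking, storage, secrets, and app configuration, and controllers run this convergence loop continuously rather than once.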


The effect of having computer systems wirelessly or directly transmit data to the brain isn't known, but related technologies such as deep brain stimulation -- where electrical impulses are sent into brain tissue to regulate unwanted movement in medical conditions such as dystonias and Parkinson's disease -- may cause personality changes in users. And even if BCIs did cause personality changes, would that really be a good enough reason to withhold them from someone who needs one -- a person with paraplegia who requires an assistive device, for example? As one research paper in the journal BMC Medical Ethics puts it: "the debate is not so much over whether BCI will cause identity changes, but over whether those changes in personal identity are a problem that should impact technological development or access to BCI". Whether regular long-term use of BCIs will ultimately affect users' moods or personalities isn't known, but it's hard to imagine that technology that plugs the brain into an AI or internet-level repository of data won't ultimately have an effect on personhood.


With communication, previous attempts used infrared light or radio waves, but if you have many robots in a small area, these signals can conflict. The MIT team instead created a cube devoid of arms, using inertial forces to move the robots. These forces are the result of a mass inside each cube that throws itself against the side of the module, causing the block to rotate or move in 24 different directions across its six faces, the paper added. "There's a relatively large field of other people building sort of similar robots," Romanishin said, "But the two main unique parts about our robots are how they move, which is using angular momentum from what we call a reaction wheel, and the way it uses magnets. It uses them in a special way that is potentially a really scalable and cheap solution for identifying hundreds of thousands of elements in a small space." "One of the big things that we looked at was how do you make the robots move relative to each other? It's really challenging, from a design standpoint and a physics standpoint," Romanishin added.
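The reaction-wheel mechanism Romanishin describes follows conservation of angular momentum: spin up an internal wheel, brake it abruptly, and the stored momentum transfers to the cube body, making it pivot. A back-of-envelope sketch, with purely illustrative numbers (not the actual M-Blocks specifications):

```python
# Back-of-envelope reaction-wheel sketch (all values assumed/illustrative):
# braking the internal flywheel transfers its angular momentum to the cube.
I_wheel = 2.0e-5      # kg*m^2, internal flywheel moment of inertia (assumed)
I_cube = 4.0e-4       # kg*m^2, cube about a pivoting edge (assumed)
omega_wheel = 600.0   # rad/s, wheel speed just before braking (assumed)

L = I_wheel * omega_wheel   # angular momentum stored in the spinning wheel
omega_cube = L / I_cube     # cube's angular rate just after the brake engages
# omega_cube is about 30 rad/s with these toy numbers
```

Even a modest wheel spun fast enough can therefore flip a much larger cube, which is why no external arms are needed.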


Regression testing process
In simple terms, regression testing can be defined as retesting a computer program after changes are made to it, to ensure that the changes do not adversely affect the existing code. Regression testing increases the chance of detecting bugs caused by changes to the application. It can help catch defects early and thus reduce the cost of resolving them. Regression testing ensures the proper functioning of the software so that the best version of the product is released to the market. However, creating and maintaining a near-infinite set of regression tests is not feasible, which is why enterprises are focusing on automating most regression tests to save time and effort. ... Whenever there is a change in the app or a new version is released, the developer carries out these tests as part of the regression testing process. First, the developer executes unit-level regression tests to validate the modified code, along with any new tests created to cover new functionality. Then the changed code is merged and integrated to create a new build of the AUT (application under test). After that, smoke tests are performed to ensure that the build created in the previous step is good before any additional testing is performed.
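A minimal sketch of the unit-level regression step described above, using Python's `unittest`; the function under test and the "old bug" it guards against are hypothetical:

```python
# Minimal sketch (hypothetical function under test): a unit-level regression
# test pins down a previously fixed bug so future changes can't reintroduce it.
import unittest

def normalize_price(value):
    """Round a price to cents; an earlier version mishandled negatives."""
    return round(value, 2)

class PriceRegressionTests(unittest.TestCase):
    def test_rounds_to_cents(self):
        self.assertEqual(normalize_price(19.999), 20.0)

    def test_negative_values_regression(self):
        # Guards the old (hypothetical) bug: negatives must round like positives.
        self.assertEqual(normalize_price(-19.999), -20.0)

# Running the suite is the unit-level regression step performed before the
# changed code is merged into a new build for smoke testing.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(PriceRegressionTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In an automated pipeline this suite runs on every change, with the smoke tests gating the resulting build.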


Object storage in the cloud: Is backup needed?

How the replication works is also very different. Object replication is done at the object level, versus the block-level replication of cloud block storage and typical RAID systems. Objects are also never modified. If an object needs to be modified, it is simply stored as a new object. If versioning is enabled, the previous version of the object is saved for historical purposes; if not, the previous version is simply deleted. This is very different from block storage, where files or blocks are edited in place and previous versions are never saved unless you use some kind of additional protection system. Cloud vendors offer object-storage services, which include Amazon's Simple Storage Service (S3), Azure Blob Storage, and Google's Cloud Storage. These object-storage systems can be set up to withstand even a regional disaster that would take out all availability zones. Amazon does this using cross-region replication that must be configured by the customer, Microsoft's geo-redundant storage includes replication across regions, and Google offers dual-region and multi-region storage that does the same thing.
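The never-modified-in-place semantics can be modeled with a toy in-memory store. This is a sketch of the general object-store behavior described above, not any vendor's actual API:

```python
# Toy sketch of object-store semantics (not a specific vendor's API):
# a "modification" stores a new object; with versioning enabled the prior
# version is kept for history, otherwise it is simply deleted.
class ObjectStore:
    def __init__(self, versioning=False):
        self.versioning = versioning
        self.objects = {}              # key -> list of versions (newest last)

    def put(self, key, data):
        versions = self.objects.setdefault(key, [])
        if not self.versioning:
            versions.clear()           # previous version simply deleted
        versions.append(data)          # stored as a new object, never edited

    def get(self, key, version=-1):
        return self.objects[key][version]

store = ObjectStore(versioning=True)
store.put("report.csv", b"v1")
store.put("report.csv", b"v2")
# get() returns the latest b"v2"; the historical b"v1" is still retrievable
```

Contrast this with block storage, where the same `put` would overwrite bytes in place and `b"v1"` would be gone without a separate protection system.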


Massive Cyberattack Slams Country of Georgia

One obvious potential culprit for the attacks against Georgia would, of course, be Russia, which has previously launched politically motivated cyberattacks against the government sectors of former Soviet states, including Estonia. Georgia is a U.S. ally, and since 2011, it has been an "aspirant country" in terms of its potential membership in NATO. It's also been engaged in a months-long spat with Moscow. After a Russian legislator's address to the Georgian parliament triggered protests, Georgia on June 20 temporarily blocked all flights originating from Russia. In response, Russian President Vladimir Putin on June 21 ordered that starting July 8, Russian carriers were barred from operating flights between Russia and Georgia. The Monday cyberattack against Georgia echoes cyberattacks launched against the country in 2008, weeks before the country was invaded by Russia over Georgia's "breakaway provinces" of South Ossetia and Abkhazia. At the time, Moscow said it wasn't responsible for the cyberattacks, but it suggested that some Russian individuals may have been independently involved.


Lies programmers tell themselves

Figuring out how to handle null pointers is a big problem for modern language design. Sometimes I think that half of the Java code I write is checking to see whether a pointer is null. The clever way some languages use a question mark to check for nullity helps, but it doesn’t get rid of the issue. A number of modern languages have tried to eliminate the null testing problem by eliminating null altogether. If every variable must be initialized, there can never be a null. No more null testing. Problem solved. Time for lunch. The joy of this discovery fades within several lines of new code because data structures often have holes without information. People leave lines on a form blank. Sometimes the data isn’t available yet. Then you need some predicate to decide whether an element is empty. If the element is a string, you can test whether the length is zero. If you work long and hard enough with the type definitions, you can usually come up with something logically sound for the particular problem, at least until someone amends the specs. After doing this a few times, you start wishing for one, simple word that means an empty variable.
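The "empty without null" problem translates directly into Python, where eliminating `None` through mandatory initialization still leaves you writing an emptiness predicate. A small illustrative sketch (the form fields and predicate are hypothetical):

```python
# Illustrative sketch: banning null doesn't ban "no information". Even with
# every field initialized, you still need a predicate for "left blank".
from dataclasses import dataclass

@dataclass
class FormEntry:
    name: str = ""                 # always initialized, so never None...
    notes: str = ""

def is_blank(value: str) -> bool:
    """The predicate you end up writing anyway: 'empty' without 'null'."""
    return len(value.strip()) == 0

entry = FormEntry(name="Ada")      # the user left the notes line blank
# is_blank(entry.notes) is True even though nothing is null
```

This is the "one, simple word that means an empty variable" the column wishes for: here it had to be hand-rolled as `is_blank`, and a different type would need a different predicate.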


Categorise Unsolved Problems in Agile Development: Premature & Foreseeable

Unsolved problems belong on the backlog. In theory, the Product Owner processes all backlog items, dismisses the irrelevant and prioritizes the most important ones into sprints, until the backlog is empty and the project is done. But in practice, that’s not what happens. The backlog just grows forever. It collects items that can wait, together with technical debt and hot potatoes which cannot simply be dismissed. To developers, the backlog is a spillway to keep their job doable. Agile says: whatever you don't know yet, or can do without for now, park it on the backlog, and forget about it. It will reemerge when needed. For the most part, this works. It is the power of Agile. But by the time unsolved problems reemerge, hot potatoes have become too hot to handle, and technical debt has become too expensive to repay. Implementation effort has grown far beyond the available resources. This can be prevented by adding some core insights and making a few small but essential changes to the Agile approach.



Quote for the day:


"Leaders dig into their business to learn painful realities rather than peaceful illusion." -- Orrin Woodward