Daily Tech Digest - March 23, 2018

Google: Its Own Worst Enemy
There are countless other examples of Google acting as its own worst enemy and failing to follow through with a commendable initial vision. Look at the company's never-ending messaging mess, for instance, or the awkward implementation of Apple-like app shortcuts in Android 7.1. In the latter case, as I said at the time, "instead of thinking through what'd be the most sensible and user-friendly way for a feature like this to work, Google seemed to just emulate the way Apple did it." See the pattern? To a degree, a company being flexible and open to the evolution of its products — even when said transformation blatantly revolves around "borrowing" inspiration from other sources — can be an asset. But there's also something to be said for having the stones to stand by the value of your own ideas and remaining willing to recognize when you've got a good thing going, even if that thing requires a mix of refinement and promotion to reach its potential.


Starting in May, GDPR will force European banks to rethink how they store, manage, use and disseminate personally identifiable information, according to the report. "If they wish to partake in blockchain-based AML and EFM device, whitelist, and transactional data sharing, [financial institutions] must adapt their privacy policies and tools to be able to cope with this requirement," Forrester said. ... AML and EFM are harder than ever to enforce and need to rely on the most diverse data possible, Forrester said, adding that "verifying identities before allowing them to transact helps avoid fraud losses in a complex payment ecosystem." That's where blockchain can be useful. Because it is an immutable, auditable electronic record, blockchain ensures that transaction records contain artifacts and identifiers of previous transactions. "This allows authorized investigators to backtrack transactions on the blockchain more easily than with current AML and EFM systems," Forrester said.
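The backtracking property Forrester describes comes from each record embedding a hash of its predecessor, so an investigator can walk the chain and any tampering breaks the links. A minimal sketch in Python — the record fields and helper names are illustrative, not any real AML/EFM system:

```python
import hashlib
import json

def make_record(prev_hash, payload):
    """Create a transaction record that embeds the hash of its predecessor."""
    record = {"prev_hash": prev_hash, "payload": payload}
    record["hash"] = hashlib.sha256(
        json.dumps({"prev_hash": prev_hash, "payload": payload},
                   sort_keys=True).encode()
    ).hexdigest()
    return record

def verify_chain(records):
    """Backtrack the chain: every record must point at its predecessor's hash."""
    for i in range(1, len(records)):
        if records[i]["prev_hash"] != records[i - 1]["hash"]:
            return False
    return True

ledger = [make_record("0" * 64, {"from": "A", "to": "B", "amount": 100})]
ledger.append(make_record(ledger[-1]["hash"], {"from": "B", "to": "C", "amount": 40}))
assert verify_chain(ledger)

# Tamper with an earlier record, even recomputing its own hash...
ledger[0]["payload"]["amount"] = 999
ledger[0]["hash"] = make_record(ledger[0]["prev_hash"], ledger[0]["payload"])["hash"]
assert not verify_chain(ledger)  # ...and the link to the next record breaks
```

This is why an immutable ledger makes transaction history auditable: rewriting any record invalidates every record after it.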


Java security issues are real. Java was designed to be as secure as most other popular programming languages, and it offers features like SecurityManager to help improve security in certain contexts. However, Java applications are subject to a number of potential security vulnerabilities, including, but not limited to, various injection attacks. It's crucial for Java developers and administrators to keep common Java security vulnerabilities in mind as they write and deploy Java applications. Security-first programming is especially important in the case of Java because the cross-platform nature of Java code means that OS-level security frameworks can't always be trusted to keep applications secure. Nor should you expect end users to be able to manage Java security threats effectively. Sure, you can blame your users for running untrusted Java code or disabling automatic updates to their Java runtimes, but ultimately, the burden of writing secure Java applications and isolating code within a Java environment that might not be secure lies with developers.
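Injection flaws follow the same pattern in any language: untrusted input spliced directly into a query string. A minimal illustration of the pattern — shown here in Python with sqlite3 rather than Java, purely because it is compact; the table and inputs are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('admin', 1)")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input becomes part of the SQL text itself.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % attacker_input
).fetchall()

# Safe: a placeholder keeps the input as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(unsafe)  # every row leaks: [('alice',), ('admin',)]
print(safe)    # no row matches the literal string: []
```

The Java equivalent is the difference between concatenating strings into a `Statement` and binding parameters through a `PreparedStatement`.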


Short sprints vs. big bang: the best way to adopt the cloud
When it comes to cloud adoption, large enterprises and government agencies that focus on quick wins, using short sprints, are typically more successful than those that try to drive huge change over a longer period of time, aka the big-bang approach. ... The objectives of each may be exactly the same—to migrate most of the enterprise’s workloads—but the short-sprint approach is ten times more likely to demonstrate success, and thus value, than the big-bang approach. The short-sprint approach also aligns with typical corporate culture. People think in small, tactical terms rather than large, strategic ones, so expectations are for small, quick wins. The larger, longer strategic wins simply are not as valued by the executives and investors in the standard corporate culture. The big-bang approach can and does work—if the company can hold to its commitment that long and doesn’t need ROI along the way.


4 best practices for automating mobile app testing

How can you test your mobile apps on a diversity of mobile devices, and in geographic locales that you're not even thinking of? This is the problem that BrowserStack, and other companies like it, set out to solve. These companies offer cloud-based test automation software and processes that enable you to test your app in virtually any simulated mobile device environment and in any test scenario. The test automation can eliminate steps for your QA staff, such as checking out app navigation, displays of data and images, and even data access, retrieval and update to a database. "Our goal was to develop tools that could automate as much of the diverse mobile app test process as possible, to save time, and to speed these apps to market," said Rao. "We were also aware of the mobile developer shortage in enterprises, and the fact that many companies can't secure the mobile app development talent that they need. Consequently, they have to find other ways to speed app development and test, like automating more of the process."


Intel To Release Most Powerful Mainstream Processor Ever To Beat AMD


Intel largely succeeded here: thanks to higher clock speeds (and, unfortunately, a higher price too) compared to AMD's similarly priced equivalents, the Core i7-8700K was faster in many tests despite a two-core deficit. However, where raw multi-threaded performance is concerned, especially in benchmarks that aren't otherwise Intel-optimised, AMD gained the upper hand, meaning that Intel can't quite claim to be faster in everything. This is what it's looking to address with the new CPU, because even with better boosting algorithms, AMD's second-generation Ryzen CPUs, due next month, probably won't be able to match something akin to a Core i7-8700K but with eight cores. Interestingly, AMD's 12-core Threadripper 1920X retails for $670, which means there's a big price gap for Intel to play with for its new eight-core CPU. The Core i7-8700K retails for around $350, so even if the new CPU costs $500, it's potentially going to blur the lines between AMD's and Intel's high-end desktop platforms.


How Microsoft and Databricks crafted a unique partnership for AI data processing

Using Azure Databricks, customers can take in data ingested through other services, prepare it, and process it using machine learning algorithms and other techniques. After that, it can be funneled out to other services like Cosmos DB and Power BI. Making a deep integration possible required a great deal of work on the part of both firms, however. Company representatives made many trips back and forth between Databricks’ office in San Francisco and Microsoft’s in Redmond. The partnership wasn’t without its challenges on either end, but both companies were committed to it for the sake of their joint customers. Andreessen Horowitz cofounder Ben Horowitz, who sits on Databricks’ board, has a close relationship with Microsoft CEO Satya Nadella and helped facilitate the two companies’ collaboration. Microsoft had to work through concerns about what it would mean for the company to deeply integrate Databricks with Azure systems, including those for incident management, handling support requests, and other functions. 


Why the Waterfall or Agile debate will be around forever


"What's wrong is to assume one or the other is always the answer," Raschke said. "When you look at a product and it needs flexibility [or] the client is not sure what they want, what the market needs or there is innovation required, Scrum is important. If it is embedded or mission-critical software, Waterfall might be a better choice." What phase the project is on also determines which methodology a team should use. Business case development, choice of infrastructure and decisions on deployment and maintenance are all aspects of the software development lifecycle that don't necessarily lend themselves to the Agile approach. When forced to align with either Waterfall or Agile, these business cases tend to fall more in line with Waterfall. Once those outlines are in place, the actual development might trend more toward either Waterfall or Agile, depending on the purpose and nature of the project. Raschke said to remember what is most important. "The question is: What do we need to do to get it to the customer? In that case, it usually ends up as hybrid."


Cisco, Verizon Take Information-Centric Networking for a Real-World Spin

Cisco has developed what it calls Hybrid ICN (hICN), which enables the deployment of ICN within IP rather than as an overlay on or replacement of IP. It preserves all features of ICN communication by encoding ICN names into IP addresses, according to Giovanna Carofiglio, a Cisco Distinguished Engineer. “hICN supports IPv4- or IPv6-RFC compliant packet formats, and guarantees transparent interconnection with standard IP networking equipment, simplifying the insertion of ICN technology in existing IP infrastructure and enabling coexistence with legacy IP traffic,” Carofiglio said. Cisco and Verizon expect that hICN will become a strong technology for 5G environments, in that ICN adoption may dramatically simplify next-generation network architecture by offering a unified content-aware and access-agnostic network substrate for the integration of heterogeneous networks, Carofiglio said. It is the hICN technology that Verizon recently tested in its lab.



Optimising the smart office: A marriage of technology and people

A strong digital culture is clearly a positive thing, but there's room for improvement: the percentages of employees in strong-digital-culture businesses who rate themselves highly on empowerment (47%), innovativeness (39%) and -- in particular -- productivity (22%) might be expected to be higher, for example. Management consultancy McKinsey has recently suggested that productivity benefits from digitisation have yet to materialise at scale for several reasons, including "lag effects due to the need to reach technological and business readiness" and "costs associated with the absorption of management's time and focus on digital transformation". Another KPI Microsoft examined was engagement, or 'flow' -- the ability for workers to focus on the task at hand and deliver a better end result more efficiently. Overall, just 20 percent of respondents felt highly engaged at work, but there was a fourfold difference between engagement levels in businesses with strong versus weak digital cultures.



Quote for the day:


"Rarely have I seen a situation where doing less than the other guy is a good strategy." -- Jimmy Spithill


Daily Tech Digest - March 22, 2018

Five Pillars of Data Governance: Initiative Sponsorship

It’s an ongoing initiative that requires active engagement from executives and business leaders. But unfortunately, the 2018 State of Data Governance Report finds a lack of executive support to be the most common roadblock to implementing DG. This is historical baggage. Traditional DG has been an isolated program housed within IT, and thus constrained within that department’s budget and resources. More significantly, managing DG solely within IT prevented those in the organization with the most knowledge of and investment in the data from participating in the process. This silo created problems ranging from a lack of context in data cataloging to poor data quality and a sub-par understanding of the data’s associated risks. Data Governance 2.0 addresses these issues by opening data governance to the whole organization. Its collaborative approach ensures that those with the most significant stake in an organization’s data are intrinsically involved in discovering, understanding, governing and socializing it to produce the desired outcomes.


How Serverless Computing Reshapes Security

First and foremost, serverless computing, as its name implies, lowers the risks involved with managing servers. While the servers clearly still exist, they are no longer managed by the application owner, and are instead taken care of by the cloud platform operators — for instance, Google, Microsoft, or Amazon. Efficient and secure handling of servers is a core competency for these platforms, and so it's far more likely they will handle it well. The biggest concern you can eliminate is addressing vulnerable server dependencies. Patching regularly is easy enough on a single server but quite hard to achieve at scale. As an industry, we are notoriously bad at tracking vulnerable operating system binaries, leading to one breach after another. Stats from Gartner predict this trend will continue into and past 2020. With a serverless approach, patching servers is the platform's responsibility. Beyond patching, serverless reduces the risk of a denial-of-service attack.


Google parent's free DIY VPN: Alphabet's Outline keeps out web snoops

Outline promises to solve the double-edged sword of VPN services. There are loads of free VPN services, which in theory can protect sensitive information when using a public Wi-Fi network. However, as ZDNet's David Gewirtz has pointed out, you probably shouldn't entrust these with digging an encrypted tunnel between your computer and another machine. An alternative option is to pay around $120 a year for a VPN service, but again this requires trusting the provider and weighing up the jurisdiction it operates in. Outline offers journalists a cheaper way to set up their own VPN server on any cloud provider or on their own hardware, cutting out the need to trust a third party. "Outline gives you control over your privacy by letting you operate your own server. And Outline never logs your web traffic," Jigsaw product manager Santiago Andrigo wrote. "We made it possible to set up Outline on any cloud provider or on your own infrastructure so you can fully own and operate your own VPN and don't have to trust a VPN operator with your data."


Hexagonal Architecture as a Natural Fit for Apache Camel


Let's look at the two extremes: a layered architecture manages the complexity of a large application by decomposing it and structuring it into groups of subtasks of particular abstraction levels called layers. Each layer has a specific role and responsibility within the application and changes made in one layer of the architecture usually don't affect the components of other layers. In practice, this architecture splits an application into horizontal layers, and it is a very common approach for large monolithic web or ESB applications of the JEE world. On the other extreme is Camel, with its expressive DSL and route/flow abstractions. Based on the Pipes and Filters pattern, Camel would divide a large processing task into a sequence of smaller, independent processing steps (Filters) connected by a channel (Pipes). There is no notion of layers that depend on each other, and, in fact, because of its powerful DSL, a simple integration can be done in a few lines and a single layer only. In practice, Camel routes split your application by use case and business flow into vertical flows rather than horizontal layers.
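The Pipes and Filters idea can be sketched outside Camel in a few lines. Here is a hypothetical Python analogue — the filter names and the order payload are invented — showing a route as a composition of small, independent processing steps (a single vertical flow) rather than horizontal layers:

```python
from functools import reduce

# Each filter is a small, independent processing step; the "pipe" is just
# the composition that passes the message from one filter to the next.
def validate(order):
    assert "id" in order and order["qty"] > 0
    return order

def enrich(order):
    return {**order, "total": order["qty"] * order["unit_price"]}

def format_msg(order):
    return f"order {order['id']}: total={order['total']}"

def route(*filters):
    """Compose filters into a single vertical flow, Camel-route style."""
    return lambda msg: reduce(lambda m, f: f(m), filters, msg)

place_order = route(validate, enrich, format_msg)
print(place_order({"id": 7, "qty": 3, "unit_price": 10}))  # order 7: total=30
```

Each filter can be tested and replaced in isolation, which is the property Camel's DSL gives you at integration scale.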


Do Facebook Users Really Care About Online Privacy?

Facebook has had, since 2014, a platform policy that clearly states what developers of third-party apps can and cannot do. With regard to data, third-party apps have to elaborate in their privacy policy on what data they are collecting and how they plan to use that data. These third-party apps must also delete any data received from Facebook. What Facebook also does now is moderate third-party apps. Apps go through a review process where they must justify why the information they request is necessary for the app. Facebook characterizes "detailed information" as anything other than a user's friends, public profile, and email. Approval is only granted if apps can show that the information they request will be directly used. But Facebook's updated platform policy only came into place a year after a "Cambridge University researcher named Aleksandr Kogan had created a personality quiz app, which was installed by around 300,000 people who shared their data as well as some of their friends' data," as revealed by Zuckerberg in a public post.



Wide area Ethernet can fuel digital network transformation

Enter wide area Ethernet (WAE) services. WAE is a technology that has been around a long time but never gained the same level of adoption as other network services such as MPLS or consumer-type broadband. In some ways, Ethernet has always been a solution looking for a problem, as its attributes didn't align cleanly with the challenges most businesses faced. Indeed, the network was considered by many to be a commodity -- the basic plumbing, if you will -- where the information being transported was best-effort in nature. Because of this, network managers and procurement officers just went with what they knew, even though it was often considerably more expensive. Digital transformation (DX) is the problem that Ethernet and WAE have been waiting for. WAE directly addresses the business problems faced by digital organizations. ... Data continues to grow at exponential rates; 90% of all data that exists today, in fact, has been created in the past two years, according to ZK Research. IoT, video, mobile services and other data will only continue to add to the glut.


9 machine learning myths
Machine learning is proving so useful that it's tempting to assume it can solve every problem and applies to every situation. Like any other tool, machine learning is useful in particular areas, especially for problems you’ve always had but knew you could never hire enough people to tackle, or for problems with a clear goal but no obvious method for achieving it. Still, every organization is likely to take advantage of machine learning in one way or another, as 42 percent of executives recently told Accenture they expect AI will be behind all their new innovations by 2021. But you’ll get better results if you look beyond the hype and avoid these common myths by understanding what machine learning can and can’t deliver. ... Think of it as anything that makes machines seem smart. None of these are the kind of general “artificial intelligence” that some people fear could compete with or even attack humanity. Beware the buzzwords and be precise. 


As organizations matured, they could use the cloud control plane tools to create NAC rules. While the interface required training, the concepts were similar: traffic from one set of hosts was allowed or disallowed. However, the cloud security control plane does represent one of the first early challenges in hybrid IT security—a consistent operations control plane. As hybrid IT services become more complex, security professionals require more granular controls between the public cloud and private infrastructure. Take the universal example of the web and application tiers in a three-tier application. Merely creating a firewall rule that allows traffic from the web tier to the application tier proved complex. Early private data center firewalls lacked the context of ephemeral cloud security objects. If the web tier leveraged elastic compute, the public cloud administrator had to ensure that auto-scaled web servers were all created in the same network scope for the static firewall to properly filter traffic.
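The ephemeral-scope problem can be shown in miniature: a static CIDR-based firewall rule silently misses an auto-scaled host launched outside the expected range, while a group-membership rule still matches it. A hedged Python sketch — the addresses, group names, and rule shapes are all made up for illustration:

```python
import ipaddress

# Static rule: allow web-tier traffic only from a fixed network scope.
WEB_TIER_SCOPE = ipaddress.ip_network("10.0.1.0/24")

def static_allow(src_ip):
    """Classic data-center firewall: decision based on address range alone."""
    return ipaddress.ip_address(src_ip) in WEB_TIER_SCOPE

# Cloud-style rule: decision based on security-group membership, not address.
def group_allow(instance):
    return "web" in instance["groups"]

# An auto-scaled web server that happened to launch outside the static scope.
scaled_out = {"ip": "10.0.9.14", "groups": {"web"}}

assert not static_allow(scaled_out["ip"])  # static CIDR rule drops its traffic
assert group_allow(scaled_out)             # group-based rule still matches it
```

This is the context gap the excerpt describes: the static rule is not wrong, it simply has no notion of the ephemeral object ("member of the web tier") that the cloud control plane reasons about.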


What would a regulated-IoT world look like?

Perhaps the most useful contrast to the U.S.’s lack of regulatory attention to IoT security issues is Europe, where the General Data Protection Regulation has provoked howls of outrage from the tech industry, but won praise from privacy rights advocates. GDPR, in essence, places the burden on companies to state clearly and up-front what types of user data will be gathered, and precisely what it will be used for. It also gives users the right to see data that has been collected about them, and to correct inaccuracies. It’s not wildly dissimilar to the most stringent data protection law currently on the books in the U.S. – the Health Insurance Portability and Accountability Act, better known as HIPAA. According to Sadeh, a more broad-based privacy protection law in the U.S., designed to address the threats posed by IoT and other technologies that have badly outstripped existing regulations, could easily resemble HIPAA with greater scope.



Google Is Working on Its Own Blockchain-Related Technology

The technology presents challenges and opportunities for Google. Distributed networks of computers that run digital ledgers can eliminate risks that come with information held centrally by a single company. While Google’s security is strong, it’s one of the largest holders of information in the world. The decentralized approach is also beginning to support new online services that compete with Google. Still, the company is an internet pioneer and has long experience embracing new and open web standards. To build its ledger, Google has looked at technology from the Hyperledger consortium, but it could opt for another type that may be easier to scale to run millions of transactions, one of the people familiar with the situation said. "Any time there’s a paradigm shift like this, there’s an opportunity for new giants to emerge -- but also for incumbents to adopt the new approach," said Elad Gil, a startup investor who worked on early mobile projects at Google more than a decade ago.



Quote for the day:


"Do not lose hold of your dreams or aspirations. For if you do, you may still exist but you have ceased to live." -- Thoreau


Daily Tech Digest - March 21, 2018

AI outpaces lawyers in reviewing legal documents


For the study, the lawyers and the LawGeex AI had to analyse five previously unseen contracts with 153 paragraphs of technical legal language, under controlled conditions designed to mirror the way lawyers review and approve everyday contracts. The highest-performing lawyer matched the LawGeex AI at 94% accuracy, while the lowest-performing lawyer achieved an average of just 67%. The most notable difference in the test between machines and humans lies in the time factor: while it took the LawGeex AI only 26 seconds to complete the task, the lawyers took an average of 92 minutes. The longest time spent by a human to accomplish the test was 156 minutes and the shortest was 51 minutes. Commenting on the study, Gillian K. Hadfield, Professor of Law and Economics at the University of Southern California, said: “This research shows technology can help solve two problems – both making contract management faster and more reliable, and freeing up resources so legal departments can focus on building the quality of their human legal teams.”



Are You Just Keeping the Lights On in Your Datacenter?

The notion that IT struggles to move beyond its traditional role and into a more innovative one is very common. But, as the IDC statistic shows, IT is more often a cost center than a source of innovation and revenue for the company. Why is this situation still so widespread? A core issue is that nearly everything in the datacenter is manual and not automated. Most datacenters have custom configurations that require their own manual maintenance with specialized tools. Incremental progress on any one or two of these helps, but it isn’t enough to substantially change the big picture for the company. Over time, people get used to this status quo and start to think that it is completely normal. They fall into the trap of believing that a huge step forward toward automation and innovation is impossible.



Top 10 open source legal stories that shook 2017

In February 2017, GitHub announced it was revising its terms of service and invited comments on the changes, several of which concerned rights in the user-uploaded content. The earlier GitHub terms included an agreement by the user to allow others to "view and fork" public repositories, as well as an indemnification provision protecting GitHub against third-party claims. The new terms added a license from the user to GitHub to allow it to store and serve content, a default "inbound=outbound" contributor license, and an agreement by the user to comply with third-party licenses covering uploaded content. While keeping the "view and fork" language, the new terms state that further rights can be granted by adopting an open source license. The terms also add a waiver of moral rights with a two-level fallback license, the second license granting GitHub permission to use content without attribution and to make reasonable adaptations "as necessary to render the website and provide the service."


Descriptive Statistics: The Mighty Dwarf of Data Science

Consider a case where a monitoring system is to detect anomalies within the data. Typically, one may turn to the classic means of outlier analysis, like DBSCAN-based approaches or LOF. Nothing is wrong with these; they may perfectly well point toward the directions where the outliers may be present. However, these techniques may require substantial computational resources to complete the task on high volumes of data in a reasonably acceptable amount of time. A much faster alternative may come from treating the given case as a time series analysis problem. Such data, coming from a system operating in ‘healthy’ conditions, would have a typical, acceptable amplitude distribution, and in such a scenario any deviation from the expected shape may be considered a potential threat worth detecting. A very fast descriptive statistic aimed at summarizing the shape of the distribution of the signal is called the ‘kurtosis’.
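As a concrete illustration of how cheap this statistic is, excess kurtosis needs only a couple of passes over the signal. In this Python sketch — the "healthy" and faulty signals are synthetic, generated purely for demonstration — injecting a handful of large-amplitude spikes sends the statistic far from its near-zero Gaussian baseline:

```python
import random

def kurtosis(xs):
    """Excess kurtosis: roughly 0 for a Gaussian, large for spiky distributions."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n   # second central moment
    m4 = sum((x - mean) ** 4 for x in xs) / n   # fourth central moment
    return m4 / (m2 ** 2) - 3.0

random.seed(42)
healthy = [random.gauss(0, 1) for _ in range(10_000)]

faulty = healthy[:]
for i in range(0, len(faulty), 500):            # inject occasional large spikes
    faulty[i] += random.choice([-1, 1]) * 15

print(round(kurtosis(healthy), 2))  # stays near 0 for the healthy signal
print(round(kurtosis(faulty), 2))   # jumps sharply once outliers appear
```

A monitoring system could compute this over a sliding window and flag any window whose kurtosis drifts beyond a calibrated band, at a fraction of the cost of DBSCAN or LOF.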


What Are the Limits of Forensic Data Retention?


Internet Service Providers will retain the IP addresses of customers and the servers that they connect to. Armed with this information, forensic investigators can determine which websites suspects are accessing. However, most ISPs do not keep records of the actual content their subscribers access. There are a couple of reasons for this. First of all, keeping records of all content would be far more demanding on their servers. They simply don’t have the resources, even in the age of big data. Even if they wanted to keep these records, it would be impossible to see what content customers are accessing on most websites. Most websites have encrypted connections, so Internet Service Providers can’t tell what their users are doing on them. For example, since Facebook uses HTTPS connections, Internet Service Providers can’t read customers’ messages or see what content they post on their Facebook feed. Nor can they see what users are searching for on Google.


AI key to do 'more with less' in securing enterprise cloud services

Artificial intelligence (AI), machine learning (ML), and predictive analytics applications may one day prove to be the key to maintaining control and preventing successful hacks, data breaches, and network compromise. These technologies encompass deep learning, algorithms, and Big Data analysis to perform a variety of tasks. The main goal of AI and ML is usually to find noteworthy anomalies in systems and networks, whether it be suspicious traffic, unauthorized insider behavior and threats, or indicators of compromise. Able to evolve over time, the purpose of AI technologies is to learn, detect, and prevent suspicious and dangerous activities, with improvements and refinements the longer such applications and systems are in use. This provides companies with a custom cybersecurity system which tailors itself to their requirements, in comparison to an off-the-shelf, traditional antivirus security solution -- which is no longer enough with so many threats lurking at the perimeter.
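The "learns and refines over time" behavior can be illustrated with the simplest possible detector: a streaming z-score test whose baseline tightens as it observes more traffic. This is a toy Python sketch, not how commercial products work — the class name, threshold, and traffic values are all invented:

```python
class AnomalyDetector:
    """Streaming z-score detector: its baseline improves the longer it runs."""
    def __init__(self, threshold=4.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.threshold = threshold

    def observe(self, x):
        # Welford's online update of the running mean and variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomalous(self, x):
        if self.n < 30:          # not enough history to judge yet
            return False
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(x - self.mean) / std > self.threshold

detector = AnomalyDetector()
for volume in [100 + (i % 7) for i in range(200)]:   # "normal" traffic volumes
    detector.observe(volume)

assert not detector.is_anomalous(104)   # ordinary traffic passes
assert detector.is_anomalous(500)       # a large spike is flagged
```

Real ML-based security tools replace the single statistic with learned models over many features, but the loop is the same: observe, refine the notion of normal, flag deviations.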


What is the leading IT cost-optimisation priority for CIOs?

This bolsters Gartner’s opinion that the most successful organisations are more likely to trust their IT organisation to manage their IT and digital technology spending. Respondents were also questioned about who manages the selection and approval of cost optimisation ideas. Those with visibility of both the IT shared services budget and all digital spending across the organisation reported that, on average, nearly half of their digital technology spending is paid for by the business. A quarter is paid for out of the IT budget, with chargeback to the business. “As you’d expect, CIOs have the most influence over the selection and approval of cost optimisation opportunities within IT shared services,” said Buchanan. “Interestingly, CIOs who focus on digital business opportunities have greater responsibility for cost optimisation than those who don’t. This suggests that CIOs are starting to exert influence over selecting and approving digital business ideas to optimise business costs.”


IoT in Healthcare: Balancing Patient Privacy & Innovation

Medical technology tends to lag behind other technologies, as the cost of mistakes at medical practices and hospitals can be astronomical. As a result, the field can be slow to adopt the latest digital or IoT technologies. Patient privacy is a major issue, so all new technologies must be adopted carefully while adhering to the various data compliance obligations that apply both to companies in general and to healthcare organisations specifically. Managing clinics and hospitals is complex, and it’s expensive. Many healthcare organizations rely on multiple computer and networking systems. Through smart bracelets, administrators can better track patient movement, and they can determine how often patients meet with their doctors. In addition, IoT technology can make it easier to track and analyze patients’ vital signs and other metrics, offering invaluable feedback and resolution not possible with manual measurements.


Organizing for digital industrial leadership

Digital industrial leadership is transforming the industrial world. For BHGE, specifically, data and analytics are fundamentally changing the way work gets done in our business and in the oil and gas industry as we prepare for the next big step-change in productivity. When I think of digital industrial leadership, I think about using data to move from looking in the rear-view mirror to looking into the future. We are beyond making decisions based on order history; we are using data to be predictive and make recommendations for future sales targets. As an example, we are using artificial intelligence on the shop floor to understand what drives disruptive, unscheduled downtime of our welding machines. When information technology meets operations technology, we learn what behaviors or indicators lead up to that unplanned downtime. We can use predictive analytics to do preventive maintenance and improve our productivity.


Credit Risk Prediction Using Artificial Neural Network Algorithm

To predict credit default, several methods have been created and proposed. The choice of method depends on the complexity of the bank or financial institution and the size and type of the loan. The most commonly used method has been discriminant analysis. This method uses a score function that helps in decision making, although some researchers have raised doubts about the validity of discriminant analysis because of its restrictive assumptions: normality and independence among variables [4]. Artificial neural network models have been created to overcome the shortcomings of these inefficient credit default models. The objective of this paper is to study the ability of neural network algorithms to tackle the problem of predicting credit default, which measures the creditworthiness of a loan application over a time period. A feed-forward neural network algorithm is applied to a small dataset of residential mortgage applications of a bank to predict credit default.
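As an illustration of the general approach — not the paper's actual model, features, or data — here is a toy feed-forward network with one sigmoid hidden layer, trained by stochastic gradient descent on synthetic loan features. The default rule and feature names are invented for the sketch:

```python
import math
import random

random.seed(1)
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

# Synthetic applications: features are (loan-to-value, debt-to-income) in [0, 1];
# label 1 = default under an invented rule, used here only for illustration.
data = [(random.random(), random.random()) for _ in range(200)]
labels = [1.0 if ltv + dti > 1.1 else 0.0 for ltv, dti in data]

H = 4  # hidden units
W1 = [[random.gauss(0, 1) for _ in range(H)] for _ in range(2)]
b1 = [0.0] * H
W2 = [random.gauss(0, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sigmoid(x[0] * W1[0][j] + x[1] * W1[1][j] + b1[j]) for j in range(H)]
    return h, sigmoid(sum(h[j] * W2[j] for j in range(H)) + b2)

lr = 0.5
for _ in range(300):  # epochs of stochastic gradient descent
    for x, y in zip(data, labels):
        h, p = forward(x)
        d_out = p - y                      # gradient of log-loss at the output
        for j in range(H):
            d_h = d_out * W2[j] * h[j] * (1 - h[j])  # backprop through hidden unit
            W2[j] -= lr * d_out * h[j]
            W1[0][j] -= lr * d_h * x[0]
            W1[1][j] -= lr * d_h * x[1]
            b1[j] -= lr * d_h
        b2 -= lr * d_out

accuracy = sum((forward(x)[1] > 0.5) == (y == 1.0)
               for x, y in zip(data, labels)) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The network's output is a probability of default, so a lender could threshold it (here at 0.5) or rank applications by it, which is what makes the feed-forward formulation attractive for creditworthiness scoring.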


Quote for the day:


"Never allow someone to be your priority while allowing yourself to be their option." -- Mark Twain


Daily Tech Digest - March 20, 2018

The future of computer security is machine vs machine

Because so much of our computing infrastructure will be protected and controlled by well-informed, cloud-based decision makers, the malware and hackers of the future will be forced to fight the centralized services first and foremost if they ever hope to spread. They will probably subscribe to these same services and look for holes, or subscribe to a malicious service that belongs to multiple services and looks for and sells weaknesses, much like some services do today fighting the accuracy of VirusTotal. This is where the future defense and attack scenarios start looking very machine versus machine. Our future defenses will be more centralized, coordinated, and automated. The hackers will have to do the same thing to stay ahead. If they don’t automate as much as or more than the defensive services do, they won’t be able to do as much badness. Hackers and malware will turn to automation and AI just as much as the defenders. When the defenders block the malicious thing that was being successful a few minutes ago, the malicious automated service will have to quickly respond. Whoever’s AI is better will ultimately win.


Empowering Citizen Data Scientists to Solve the AI Skills Shortage


Citizen data scientists are analysts with above-average skills but without a formal academic background in data science, explained Ashley Kramer, vice president of product management at Alteryx. "I see the citizen data scientists as this emerging group," she said in an interview. "They don't have a degree in data science but they're more advanced than your average analyst. They are people with advanced capabilities like writing scripts within Excel. They are starting to get to that next level of being able to create predictive analytics. They need a little bit of help because they're not programmers." This is the group of what in an earlier era were called power users. Alteryx is designed to help them. "With the data scientist shortage," Kramer said, "we provide a platform that can be used by citizen data scientists in a code-free way, which is really important as they're learning the process." Alteryx offers a "code-friendly" way for budding data scientists to begin creating machine learning models for business requirements such as predicting and preventing equipment failures.


How complexity, multicloud sprawl, and need for maturity hinder hybrid IT


To some degree, we’ve already hit that inflection point where technology is being used in inappropriate ways. A great example of this—and it’s something that just kind of raises the hair on the back of my neck—is when I hear that boards of directors of publicly traded companies are giving mandates to their organization to “go cloud.” The board should be very business-focused and instead they're dictating specific technology, whether it’s the right technology or not. That’s really what this comes down to.  Another example is folks that try and go all in on cloud but aren’t necessarily thinking about what’s the right use of cloud—in all forms: public, private, software as a service (SaaS). What’s the right combination to use for any given application? It’s not a one-size-fits-all answer. We in the enterprise IT space haven't really done enough work to truly understand how best to leverage these new sets of tools. We need to both wrap our head around it but also get in the right frame of mind and thought process around how to take advantage of them in the best way possible.


Enhancing digital infrastructure: Why it matters & the best strategies for your business

Change can be unsettling, and workers can understandably be apprehensive about any differences in role, focus or tasks, especially if they involve a technology element they don’t necessarily understand. For example, if moving to the ‘cloud’, you can’t just ‘buy cloud’ and hope everyone catches on to the concept overnight. No single person or training course can deliver that transformation. There’s also a whole range of concepts for teams to become familiar with in order to work at the speed that cloud can enable. Focus on the move to ‘becoming cloud’ – a gradual process of change – which will transform the culture in a structured and less intimidating way. Help your staff by providing as much clarity as you can; translate top-line goals and priorities into specific metrics and KPIs for employees at all levels. Allow them time and space to experiment with new technology, and let them know it’s OK to get things wrong! Encourage users to communicate and give feedback so the right support is identified.


The evolution of systems requires an evolution of systems engineers

New frameworks, architectures, processes, and a thriving ecosystem of tools have emerged to help us meet those challenges. Some of these are in an embryonic state, but rapid adoption is driving quick maturity. We’ve seen this evolution in compute: it’s only been four years since containers became a mainstream technology, and we are now working with complex application-level abstractions enabled by tools like Kubernetes. A similar evolution is occurring with deployment, serverless, edge-computing technology, security, performance, and system observability. Finally, no changes can exist in a human and organizational vacuum. We have to develop the leadership skills necessary to build truly cross-functional teams and enable the rapid iteration needed to build these systems. We have to continue the work of the DevOps and SRE communities to break down silos, streamline transitions between teams, and increase development velocity.


Surprisingly, These 10 Professional Jobs Are Under Threat From Big Data

When you read or hear news stories about the imminent takeover of robots and algorithms that will eliminate jobs for human workers, many times the first examples given are blue-collar jobs like factory workers and taxi drivers. And you may have mentally congratulated yourself because your “professional” job is safe from the threat of being outsourced to computers. But don’t feel so safe just yet. More and more, sophisticated algorithms and machine learning are proving that jobs previously thought to be the sole purview of humans can be done — as well or better — by machines. Boston Consulting Group has predicted that by 2025 as much as a quarter of jobs currently available will be replaced by either smart software or robots. A study out of Oxford University also suggested that as much as 35 percent of existing jobs in the U.K. could be at risk of automation inside the next 20 years.


IBM Watson Data Kits speed enterprise AI development

More than half of data scientists said they spend most of their time on janitorial tasks, such as cleaning and organizing data, labeling data, and collecting data sets, according to a CrowdFlower report, making it difficult for business leaders to implement AI technology at scale. Streamlining and accelerating the development process for AI engineers and data scientists will help companies more quickly gain insights from their data, and drive greater business value, according to IBM. "Big data is fueling the cognitive era. However, businesses need the right data to truly drive innovation," Kristen Lauria, general manager of Watson media and content, said in the release. "IBM Watson Data Kits can help bridge that gap by providing the machine-readable, pre-trained data companies require to accelerate AI development and lead to a faster time to insight and value. Data is hard, but Watson can make it easier for stakeholders at every level, from CIOs to data scientists."
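The "janitorial" work the CrowdFlower report describes is mostly mundane normalization and de-duplication. A minimal sketch, on made-up records, of the kind of cleanup that eats that time:

```python
# Hypothetical raw records as they might arrive from mixed sources:
raw = [
    {"name": " Acme Corp ", "revenue": "1,200", "region": "EMEA"},
    {"name": "acme corp", "revenue": "1,200", "region": "EMEA"},   # duplicate
    {"name": "Globex", "revenue": None, "region": "AMER"},         # missing value
    {"name": "Initech", "revenue": "900", "region": "amer"},
]

def clean(records):
    seen, out = set(), []
    for r in records:
        if r["revenue"] is None:                      # drop rows missing revenue
            continue
        name = " ".join(r["name"].split()).title()    # normalize whitespace/case
        key = (name, r["region"].upper())
        if key in seen:                               # de-duplicate on (name, region)
            continue
        seen.add(key)
        out.append({"name": name,
                    "revenue": int(r["revenue"].replace(",", "")),
                    "region": r["region"].upper()})
    return out

print(clean(raw))   # two rows survive: Acme Corp and Initech
```

Offerings like the Watson Data Kits pitch amount to shipping data that has already been through this kind of pipeline, plus labeling, at industry scale.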


An Incredible New Type of Brain Implant Can Boost Memory by 15%

By fine tuning the electrical activity of the probes, the scientists aimed to activate key components of the brain's memory network only when it struggled to store memories, but not when it was working fine. The basic concept itself of boosting memorisation and recall through neural stimulation is old ground. Neuroscientists have gradually progressed from using non-invasive Transcranial Magnetic Stimulation techniques to deep brain stimulation in an effort to tickle the right pathways and encourage the brain to store and reconnect with memories. While there have been encouraging successes by precisely targeting areas such as the hippocampus and medial temporal lobes, the results haven't always been consistent. Part of the problem could be the choice of location, but another issue could be the method. Past efforts have used what's called an open loop system, meaning the stimulation wasn't tweaked in response to the brain's activity.
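The open-loop versus closed-loop distinction can be reduced to a toy control rule. In this sketch the "decoder scores" and threshold are entirely hypothetical stand-ins for whatever signal the researchers derive from neural activity; the point is only that the closed-loop system stimulates conditionally on brain state, where the open-loop system stimulates unconditionally.

```python
# Hypothetical: a decoder scores each trial's neural activity in [0, 1];
# low scores mean the brain is encoding poorly and stimulation may help.
THRESHOLD = 0.4

def open_loop(trial_scores):
    # Open loop: stimulate on every trial, regardless of brain state.
    return [True for _ in trial_scores]

def closed_loop(trial_scores):
    # Closed loop: stimulate only when the decoded encoding score is poor.
    return [score < THRESHOLD for score in trial_scores]

scores = [0.8, 0.3, 0.6, 0.2, 0.9]
print(open_loop(scores))    # stimulates all 5 trials
print(closed_loop(scores))  # stimulates only the 2 weak-encoding trials
```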


Hyperconverged infrastructure gets its own Gartner magic quadrant

Hyperconvergence is an IT framework that combines storage, virtualized computing and networking into a single system to reduce data center complexity and increase scalability. Hyperconverged platforms include a hypervisor for virtualized computing, software-defined storage, and virtualized networking. The promise of hyperconverged infrastructure is simplicity and flexibility compared with legacy solutions. The integrated storage systems, servers and networking switches are designed to be managed as a single system, across all instances of a hyperconverged infrastructure. As hyperconvergence has caught on among enterprises, major system vendors have gotten into the action by acquiring startups or bundling their servers with HCI software through OEM arrangements. Gartner’s new magic quadrant specifically focuses on vendors that develop the core hyperconvergence software. The new magic quadrant drops the system hardware requirement that’s part of the HCIS appliance model, Gartner says.


A.I. and speech advances bring virtual assistants to work

“It is that idea of ambient computing, the idea that at any time I could just say ‘Alexa, start my meeting,’ ‘Alexa, how are my sales figures?’ or ‘Alexa, I forgot to shut off the projector in the conference room, please shut that off for me.’ It is very natural, it is very spontaneous,” Ibitski said. “And all of this is happening because of the advancements we have seen in what is called NLU, or natural language understanding. And that is the difference – it is understanding context.”  Collin Davis, general manager of Alexa for Business, said virtual assistants are already helping employees get work done. “What we are finding is a really interesting shift is happening, where voice is offering up almost another dimension of multi-tasking, where workers sitting at their desk can use Alexa almost as a vocal multi-tasker to be able to get information quickly without losing focus,” Davis said. “You could be working on a report and you need to know how many deals closed last quarter without having to reach into your pocket or find an app or switch websites – you just get the information that you need.”



Quote for the day:


"Leadership appears to be the art of getting others to want to do something you are convinced should be done." -- Vance Packard


Daily Tech Digest - March 19, 2018

Linux Foundation unveils open source hypervisor for IoT products

The Linux Foundation recently unveiled ACRN (pronounced "acorn"), a new open source embedded reference hypervisor project that aims to make it easier for enterprise leaders to build an Internet of Things (IoT)-specific hypervisor. The project, further detailed in a press release, could help fast track enterprise IoT projects by giving developers a readily-available option for such an embedded hypervisor. It will also provide a reference framework for building a hypervisor that prioritizes real-time data and workload security in IoT projects, the release said. ACRN is made up of the hypervisor and its device model, the release noted. This is complete with I/O mediators. Firms like Intel, LG Electronics, Aptiv, and more have already contributed to the project. "ACRN's optimization for resource-constrained devices and focus on isolating safety-critical workloads and giving them priority make the project applicable across many IoT use cases," Jim Zemlin, executive director of The Linux Foundation, said in the release.



Java at a crossroads: Why the popular programming language needs to evolve to stay alive

Java is used most often in cloud computing, data science work, web development, and app development, said Karen Panetta, IEEE fellow and dean of graduate engineering at Tufts University. "I still see it evolving, and very popular," Panetta said. While languages such as Python are growing as well, Java is adapting to the increasing number of deep learning and machine learning workloads. "There's becoming a lot of libraries out there that are compatible for deep learning," Panetta said. "I think the fact that we keep talking about cloud computing and all of those things, that Java is still going to be the dominant player." Java also has built in more security options than Python, so it's a good option for Internet of Things (IoT) applications, Panetta said. Java has a foothold everywhere, and large user groups and libraries already written, making it a natural pathway for machine learning, Panetta said. "It's evolving to meet the needs," she added.


FPGA maker Xilinx aims range of software programmable chips at data centers

The first product range in the category is code-named Everest, due to tape out (have its design finished) this year and ship to customers next year, Xilinx announced Monday. Whether it’s an incremental evolution of current FPGAs or something more radical is tough to say since the company is unveiling an architectural model that leaves out many technical details, like precisely what sort of application and real-time processors the chips will use. The features that we do know about are consequential, though. Everest will incorporate a NOC (network-on-a-chip) as a standard feature, and use the CCIX (Cache Coherent Interconnect for Accelerators) interconnect fabric, neither of which appear in current FPGAs. Everest will offer hardware and software programmability, and stands to be one of the first integrated circuits on the market to use 7nm manufacturing process technology (in this case, TSMC’s). The smaller the manufacturing process technology, the greater the transistor density on processors, which leads to cost and performance efficiency.


Ethernet bandwidth costs fall to a six-year low


Cloud provider demand for more throughput increased last year's average bandwidth per switch port connection to almost 17 Gb from 12 Gb in 2016, according to the latest report from Crehan Research Inc., based in San Francisco. "Public, private and hybrid cloud providers are looking to deploy much faster networks within and between data centers in order to handle the myriad of new and existing applications that their customers need," Seamus Crehan, president of Crehan Research, said in a statement. "In turn, the data center switch vendors are responding by offering significantly more bandwidth at little or no additional cost." The net result in 2017 was impressive increases in Ethernet bandwidth, port shipments and revenue in the branded switch market, Crehan said. Revenue rose 10% -- the highest annual growth in four years. 


What CISOs must know about DFARS and NIST to be compliant

Privilege management and application control map to many of the different controls within the guidelines – hardly surprising given the proven effectiveness of the two security controls when combined with the visibility they provide. We know that privilege management allows admin rights to be applied to applications as needed – rather than giving the user too much access. Application control is the part that allows us to whitelist or blacklist an application from running at all. The good thing about these two technologies together is that they offer great “bang for the buck.” Between them, they overlap to address controls in access control, audit and accountability, configuration management, maintenance, and system and information integrity. Compliance is crucial for CISOs because those who fail to comply will likely lose government contracts. Organizations that are able to demonstrate compliance at an early stage may be in a better position to secure additional wins.
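The whitelist/blacklist decision plus per-application privilege elevation can be sketched in a few lines. The policy tables below are hypothetical; real products resolve applications from signed catalogues, publisher certificates or file hashes, not bare executable names.

```python
# Hypothetical policy tables for illustration only.
ALLOWLIST = {"winword.exe", "excel.exe", "outlook.exe"}
BLOCKLIST = {"mimikatz.exe"}
ELEVATED = {"msiexec.exe"}          # apps granted admin rights on demand

def decide(executable, default_deny=True):
    exe = executable.lower()
    if exe in BLOCKLIST:
        return "block"
    if exe in ELEVATED:
        return "run-elevated"       # privilege applied to the app, not the user
    if exe in ALLOWLIST:
        return "run"
    # Unknown binaries fall through to the default policy.
    return "block" if default_deny else "run"

print(decide("WINWORD.EXE"))   # run
print(decide("mimikatz.exe"))  # block
print(decide("unknown.exe"))   # block under default deny
```

The `default_deny` flag captures the design choice the text implies: a whitelist-first posture blocks anything unrecognized, which is what makes the combination effective against the broad sweep of controls listed above.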


IT’s Most Wanted: 16 Traits Of Indispensable IT Pros

Taking fresh looks at old problems is an essential part of the digital transformations that are changing many organizations’ cultures, leading to approaches like DevOps and agile and incorporating emerging technical solutions such as AI and IoT, says Christoph ...  “There is one thing that IT staff cannot afford — and that’s to stand still,” Goldenstern says. “The willingness to learn and keep evolving, making yourself vulnerable in the process, is absolutely essential to staying relevant and being a growth driver in a constantly evolving business.” ... “In order to succeed in IT you need to have the ability to look at a problem, analyze it, and find a way to solve it,” Martini says. “I look for people who understand their strengths and weaknesses. These are the people that are most capable of learning on the job, while still improving the team’s ability overall. Look at raw potential. If you have potential and the drive to improve, the rest will follow.”


Disaster and Contingency Planning Lessons from the ICU

It’s inaccurate to accuse Memorial Hospital of not having a disaster plan. Theirs was 246 pages long. They had a designated disaster coordinator. What they didn’t have was a leader who had looked ahead multiple steps. They also failed to convert the generic pieces into living, breathing human beings whose survival hinged on what moves they made next. No one at Memorial knew that the surrounding levees would break after Katrina, isolating the hospital. That the generators, whose move to a higher floor had always fallen to a lower budget priority than some other need, would be incapacitated by flooding. That the presence of patients of a provider that was leasing the seventh floor would multiply the census of extremely ill patients exponentially. ...  That same physician was subsequently charged with second-degree murder. She was accused of choosing to euthanize the sickest patients without their consent. A comprehensive review of the situation indicates that, at a minimum, people in authority were scrambling to deal with their pieces of the game. No one was watching the whole board.


Android Oreo: 18 advanced tips and tricks

Got a notification you don't want to deal with immediately — but also don't want to forget? Use Oreo's super-handy (but also super-hidden!) snoozing feature: Simply slide a notification slightly to the left or right, then tap the clock icon that appears along its edge. That'll let you send it away for 15 minutes, 30 minutes, one hour, or two hours and then have it reappear as new when the time is right. ... Another new Oreo feature is the system-level ability for launchers to display dots on an app's home screen icon whenever that app has a notification pending — yes, much like the notification badges on iOS. Unlike iOS, though, Android already has an excellent system for viewing and managing notifications, which can make this addition feel rather redundant and distracting. But wait! Here's a little secret: You can disable the dots — if you know where to look. Mosey on back to the Apps & Notifications section of your system settings, then tap the line labeled "Notifications" and turn off the toggle next to "Allow notification dots."


Predictive maintenance: One of the industrial IoT’s big draws

CarForce is mostly focused on selling its product to garages, but Lora said that the potential beneficiaries are numerous. In the garage use case, mechanics can get real-time maintenance data from vehicles they service, which offers both the ability to warn customers of impending problems and to correlate large data sets together to help predict future reliability issues. It's a value-add because the garage can stay a step ahead of mechanical issues – an alert goes off, and the garage can contact the customer to schedule maintenance. Even an awareness that customer X might be coming in for an oil change on a given day can help with planning and scheduling. “If you look at the big data/AI path, step one is just seeing the data,” said Lora. It’s part of what she refers to as the “lilypad” approach to development – building one system to enable a leap to the next lilypad, and so on. CarForce plans to operate on a population level - predicting reliability and failures across big swaths of the automotive landscape.
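The "step one is just seeing the data" stage Lora describes amounts to simple threshold alerts over vehicle telemetry. A minimal sketch, with made-up OBD-style readings and field names, of the kind of alert that lets a garage call the customer before a failure:

```python
# Hypothetical telemetry tuples: (vehicle_id, engine_hours, oil_life_pct)
telemetry = [
    ("car-17", 412, 14),
    ("car-23", 380, 62),
    ("car-41", 500, 8),
]

def maintenance_alerts(readings, oil_threshold=15):
    # Flag vehicles whose remaining oil life is low enough that the garage
    # should proactively schedule service.
    return [vid for vid, _, oil in readings if oil <= oil_threshold]

print(maintenance_alerts(telemetry))   # ['car-17', 'car-41']
```

The later "lilypads" — correlating such readings across a fleet to predict reliability — build on exactly this stream once it exists.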


The benefits of machine learning in network management


The problem with rule-based systems is they require maintenance and frequent updating as new rules are needed. It is often too cumbersome to create rules where numerous changes in the conditions require very different results. In addition, these systems are not very flexible. The rule sets may miss a problem if the rule set in question doesn't exactly match the problem's symptoms. It's much better to build a system that can learn about problems from the network experts who use it -- much like training a person who is new to the field of networking. Then, as new problems and solutions are found, the system would learn the symptoms and the resulting actions to take. Most of the industry agrees the integration of AI is among the benefits of machine learning. For our purposes, think of machine learning and deep learning as examples of neural network technology. A neural network is trained when it is fed a lot of data from the domain in question -- along with the appropriate answer or response. The neural network learns the appropriate response when presented with new data.
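The brittleness of exact-match rules versus a model trained on expert-labelled symptoms can be shown with a toy diagnosis task. The symptom vectors and labels below are invented, and a 1-nearest-neighbour lookup stands in for a trained neural network; the contrast in behaviour is the point.

```python
# Hypothetical symptom vectors: (packet_loss_pct, latency_ms, crc_errors_per_min)
# paired with the network expert's diagnosis -- the "training" the text describes.
history = [
    ((0.1,  5,   0), "healthy"),
    ((8.0, 40, 120), "bad cable"),
    ((0.2, 90,   0), "congested link"),
    ((6.5, 35,  90), "bad cable"),
]

def rule_based(sample):
    loss, latency, crc = sample
    # Brittle rule: only fires when symptoms exactly match its signature.
    if loss > 7 and crc > 100:
        return "bad cable"
    return "unknown"

def learned(sample):
    # 1-nearest-neighbour stand-in for a trained model: generalizes to
    # symptoms that only roughly match what the experts labelled.
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(history, key=lambda h: dist(h[0], sample))[1]

novel = (5.9, 33, 85)              # a cable fault the rule author never anticipated
print(rule_based(novel))           # unknown -- the rule set misses it
print(learned(novel))              # bad cable
```

Adding a newly diagnosed case to `history` immediately changes future answers, which is the "learn from the experts who use it" behaviour the paragraph argues for; a rule engine would need a hand-written new rule instead.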



Quote for the day:


"My past has not defined me, destroyed me, deterred me, or defeated me, it has only strengthened me." -- Steve Maraboli


Daily Tech Digest - March 18, 2018

The Differences Between Machine Learning And Predictive Analytics

Machine learning applications can be highly complex, but one that’s both simple and very useful for business is a machine learning algorithm that compares employee satisfaction ratings to salaries. Instead of plotting a predictive satisfaction curve against salary figures for various employees, as predictive analytics would suggest, the algorithm assimilates huge amounts of random training data upon entry, and the prediction results are affected by any added training data, producing real-time accuracy and more helpful predictions. ... Predictive analytics can be defined as the procedure of condensing huge volumes of data into information that humans can understand and use. Basic descriptive analytic techniques include averages and counts. Descriptive analytics, based on obtaining information from past events, has evolved into predictive analytics, which attempts to predict the future based on historical data. This concept applies complex techniques of classical statistics, like regression and decision trees, to provide credible answers to queries.
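The salary-versus-satisfaction contrast can be made concrete: a predictive-analytics approach fits one regression over the historical batch, while a machine-learning system keeps absorbing new observations so its predictions shift. The data points below are hypothetical, and the "online" learner simply refits on every prediction to keep the sketch short.

```python
# Hypothetical (salary_in_thousands, satisfaction_score) pairs.
history = [(40, 5.0), (55, 6.0), (70, 7.2), (90, 8.1)]

def fit_least_squares(points):
    # Predictive-analytics style: one linear regression over a fixed batch.
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

class OnlineRegressor:
    # Machine-learning style: the model keeps absorbing observations, so
    # predictions reflect training data added after the initial fit.
    def __init__(self):
        self.points = []
    def observe(self, x, y):
        self.points.append((x, y))
    def predict(self, x):
        slope, intercept = fit_least_squares(self.points)
        return slope * x + intercept

model = OnlineRegressor()
for x, y in history:
    model.observe(x, y)
before = model.predict(100)
model.observe(100, 7.0)        # a new observation contradicts the trend...
after = model.predict(100)     # ...and the next prediction moves toward it
print(round(before, 2), round(after, 2))
```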


Creating With Cognitive


“Our IAIC initiatives were really born in response to our Clients. They said to us, ‘We’re intrigued by the concept of Cognitive Automation — but we don’t really know how it can affect our business’. We knew we had to create a safe place where companies could experiment combining process based automation solutions like Robotics Process Automation (RPA) and Business Process Management (BPM) with Autonomic and Cognitive Assets. Our process starts with an Education session to demystify different types of automation and arrive on common definitions. We then begin Ideating on how this technology could impact the overall organization and business processes. Here we take a hard look at User Experience, End to End Process Views and what the “art of the possible” could become in the future. Next, we move to a Strategy session, where we bring data scientists, business analysts, software engineers and developers together to re-imagine what the client — and their customers — really need to meet their expectations and begin applying the defined technologies to actual client use cases.


Data-Savvy Banking Organizations Will Destroy Everyone Else

“To win in today’s market and ensure future viability, it is essential that organizations capture value quickly, change direction at pace, and shape and deliver new products and services. Organizations also need to maximize the use of ‘always on’ intelligence to sense, predict and act on changing customer and market developments,” said Debbie Polishook, group chief executive, Accenture. The good news is that, despite what appears to be an ominous future, over 40% of executives see more opportunities than threats compared to two years ago. The key is to break down silos and leverage data and insights to support both internal and external business needs. Intelligent organizations have five essential ingredients that contribute to a lasting and impactful business process transformation. These essentials provide the foundation for an agile, flexible and responsive organization that can act swiftly on market and consumer changes and be in a better position to succeed.


Where NEXT for Tech Innovation in 2018?


Healthcare is an industry that is ripe for disruption. We will begin to see the power of IoT in healthcare with the emergence of inexpensive, continuous ways to capture and share our data, as well as derive insights that inform and empower patients. Moreover, wearable adoption will create a massive stream of real-time health data beyond the doctor’s office, which will significantly improve diagnosis, compliance and treatment. In short, a person’s trip to the doctor will start to look different – but for the right reasons. Samsung is using IoT and AI to improve efficiency in healthcare. Samsung NEXT has invested in startups in this area, such as Glooko, which helps people with diabetes by uploading the patient’s glucose data to the cloud to make it easier to access and analyse. Another noteworthy investment in this space from Samsung NEXT is HealthifyMe, an Indian company whose mobile app connects AI-enabled human coaches with people seeking diet and exercise advice.


Security Settles on Ethereum in First-of-a-Kind Blockchain Transaction

“It’s quite exciting that you can now leverage any clearing system, and it’s legally enforceable on even a public blockchain,” said Avtar Semha, founder and CEO of Nivaura, whose technology was used last year to issue an ethereum bond. Semha says it’s unclear with the note being issued Friday exactly how much will be saved on the overall cost of the transaction. But he added that in the ethereum bond last year the final cost was reduced from an estimated 40,000 pounds to about 50 pounds, “which is pretty awesome,” he said. Further, law firm Allen and Overy helped ensure the note was compliant, the Germany-based investment services firm Chartered Opus provided issuance services and Marex helped fix and execute the note within a “sandbox” created by the U.K. Financial Conduct Authority (FCA). As revealed for the first time to CoinDesk, on March 14, Nivaura also received full regulatory approval from the FCA that removed some restrictions and allows the company to operate commercially.


IoT and Data Analytics Predictions for 2018

With the increased collection of Big Data and the necessity of advanced analytics, the year 2018 will witness high usage of cloud-based analytics software rather than on-premises software. Reports suggest that more than 50% of businesses will opt for a cloud-first strategy for their initiatives around big data analytics. AI will completely revolutionize the way organizations work today. Enterprises will take full advantage of machine learning to optimize infrastructure behavior, transform business processes, and improve decision-making. Gartner states that AI is just the start of a 75-year technological cycle and that it will drive revenue for 30% of market-leading businesses. According to Gartner, natural language will play a dual role: as a source of input for many business applications and for a variety of data visualizations. Operational transformation is necessary to adopt algorithmic business with DNNs (deep neural networks) in 2018; these will be a standardized component in the toolbox of more than 80% of data scientists.


Beyond Copy-Pasting Methods: Navigating Complexity


Why is agility a good idea? The title of Jeff Sutherland’s book promises Scrum: The Art of Doing Twice the Work in Half the Time. At first sight that looks like a pure efficiency issue. But if we ask experienced agilists why they are doing agile, they usually come up with a list of challenges for which agile works better than other approaches: users only half understand their own requirements. Requirements change because the context changes. You have to build technological substitution into your design. Your solution is connected to many other things, and they all interact and change. But also inside the project you know there will be unpredictable problems and surprises. If we look behind the obvious: what is the common force that makes these challenges behave the way they do? The answer is complexity. Exploring complexity has a big advantage. Once we understand more about the complexity behind the problems we are trying to solve with agile, we in fact clarify the purpose of our agile practice.


Building a new security architecture from the ground up

Overseeing an infrastructure that is operating thousands of servers is a burden on any architecture team. Moving those servers—all or in part—to the cloud takes patience and innovation. The innovation part, Fry said, is key because “most commercial security products are designed and built for specific use cases. Scale and complexity typically are not present,” meaning that architects in those situations need to adapt ready-built products to their networks or develop new tools from scratch, all of which takes time, money, and skill. Further, not all parts of the network can be treated equally; enterprise and customer-facing environments differ from test environments differ from production environments. When dealing with networks like those at Yahoo or Netflix, the need to think “outside the box” and innovate are, “not desirable; it’s a requirement,” said Fry. Though a security architect may be primarily concerned about security features and controls, the business is primarily concerned about availability and uptime.


Enterprises need a new security architecture: Graeme Beardsell

Today, data, applications, and users are outside the firewall and on the cloud, where they traverse the public Internet. To paraphrase, traditional security systems are guarding a largely empty castle. This means that enterprises need new approaches in their concept of security and to build new security architecture. Essentially, security should be designed to take advantage of the shape of the Internet and not try to defy it. The other challenge that is perhaps ‘invisible’ is the large number of vendors and solutions that each organization needs to manage. Analysts now advocate rationalizing multiple solutions by different vendors into suites of solutions by a single vendor to provide greater efficiency in productivity, and also towards solutions that share an integrated platform to facilitate data exchange and analysis.


10 Lessons Learned from 10 Years of Career in Software Testing

There is nothing wrong with getting certified, but it’s not compulsory. A good tester needs to possess testing skills like a sharp eye for detail and analytical and troubleshooting skills, and I believe no certification can prove that you are good at those skills. While writing test cases, none of us would prefer to think explicitly about boundary value analysis and decision tables. What one needs is the application of common sense to knowledge. Who would like a person who points out litter on your balcony and makes you sweep it? No matter that he is helping to make something clean, mostly he won’t be appreciated. That is how the profession is! You might or might not be appreciated for the quality improvement work you are doing, but you need to understand the importance of what you are doing. And on a timely basis, you need to pat yourself on the back for the work you are doing.
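Boundary value analysis, mentioned above, is mechanical enough to sketch: for a valid input range, test just below, at, and just above each edge. The age rule below is a hypothetical system under test, invented for illustration.

```python
def boundary_values(lo, hi):
    # Classic boundary-value analysis: probe either side of each edge
    # of a valid inclusive range [lo, hi].
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def accepts_age(age):
    # Hypothetical rule under test: valid applicant ages are 18..65 inclusive.
    return 18 <= age <= 65

for age in boundary_values(18, 65):
    print(age, accepts_age(age))
# 17 and 66 must be rejected; 18 and 65 must be accepted.
```

Off-by-one mistakes (writing `<` instead of `<=`, say) are exactly what these six probes catch, which is why the technique survives even when testers apply it by instinct rather than by name.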



Quote for the day:



"Character is much easier kept than recovered." -- Thomas Paine