Daily Tech Digest - October 24, 2018


Despite the well-publicized growth in cyber-attacks every year, both in number and complexity, organizations are still struggling to implement effective security policies. It’s no secret that weak passwords are a leading security threat and bad password habits are far too common. Yet organizations are struggling to quantify their own level of password risk, even those that use password managers. Why? They lack proof of their policies’ effectiveness. They’re missing visibility into their employees’ behaviors. And they can’t verify how they compare to others of similar size, industry or location, including competitors. That is why we undertook an effort to analyze the password habits of employees at 43,000 organizations of all sizes and across industries that use the LastPass password manager. Not only does the report reveal real password behaviors in the workplace, but it offers the first true benchmark that CISOs and other IT professionals can use to see how they rank compared to other similar businesses and how to improve their password security.



Is the IoT in space about to take off?
Last month, cloud leader Amazon Web Services (AWS) struck a deal with satellite provider Iridium to “bring internet connectivity to the whole planet.” The deal calls for them to develop a satellite-based network called CloudConnect, designed specifically for IoT applications. Similarly, earlier this month, U.S.-based Orbcomm, which provides satellite IoT and machine-to-machine communications services, announced it will work with Asia Pacific Navigation Telecommunications Satellite (APNTS) to provide its services in China. Also in October, Semtech and Alibaba Cloud agreed to develop an IoT network in China using small satellites in low Earth orbit — reportedly just two of many companies looking to build such networks. The IOTEE Project (Internet of Things Everywhere on Earth), for example, has been funded by the European Union to provide IoT LPWA services from space. It’s unclear whether it’s the right time for these efforts to come to fruition. There is a market available: It turns out that despite their rapid proliferation, conventional terrestrial networks cover only a small percentage of Earth’s surface.


The issue was in the source code of the jQuery File Upload plugin, originally developed by Tschan, so the vulnerability could affect many other projects. According to GitHub, jQuery File Upload is the most starred -- meaning users mark it in order to signal interest and support -- jQuery plugin and also the most forked. Cashdollar said the plugin has been forked more than 7,800 times and could have been built into thousands of other projects, making it difficult to determine how widespread the jQuery plugin vulnerability could be. "Unfortunately, there is no way to accurately determine how many of the projects forked from jQuery File Upload are being properly maintained and applying changes as they happen in the master project," Cashdollar wrote. "Also, there is no way to determine where the forked projects are being used in production environments if they're being used in such a way. Moreover, older versions of the project were also vulnerable to the file upload issue, going back to 2010."


How science can fight insider threats

Detecting insider threats using conventional security monitoring techniques is difficult, if not impossible. ... The emerging field of security analytics uses machine learning technologies to establish baseline patterns of human behavior, and then applies algorithms and statistical analysis to detect meaningful anomalies from those patterns. These anomalies may indicate sabotage, data theft, or misuse of access privileges. This can be accomplished by establishing a contextual linked view and behavior baseline from disparate systems including HR records, accounts, activity, events, access repositories, and security alerts. This baseline is created for the user and their dynamic peer groups. As new activities are consumed, they are compared to the baseline behaviors. If the behavior deviates from the baseline, it is deemed an outlier. Using risk scoring algorithms, outliers can be used to detect and predict abnormal user behavior associated with potential sabotage, data theft or misuse.
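To make the baseline-and-outlier idea concrete, here is a minimal sketch in Python. It scores one user's daily activity against their own history with a simple z-score; the event counts and the 3-sigma threshold are invented for illustration, and real security-analytics platforms model many more signals per user and peer group.

```python
# Minimal sketch of baseline-plus-outlier risk scoring.
# The activity counts and threshold below are illustrative only.
from statistics import mean, stdev

def risk_score(history: list[int], todays_count: int) -> float:
    """How many standard deviations today's activity sits above
    the user's own baseline."""
    mu, sigma = mean(history), stdev(history)
    return (todays_count - mu) / sigma if sigma else 0.0

# 30 days of hypothetical file-access counts for one user
baseline = [12, 9, 14, 11, 10, 13, 12, 8, 11, 10] * 3

score = risk_score(baseline, todays_count=87)
if score > 3.0:  # flag anything more than 3 sigma above baseline
    print(f"outlier: z={score:.1f} -- route to an analyst for review")
```

In a real deployment the same comparison would also run against the user's dynamic peer group, so that behavior unusual for an individual but normal for their team is not flagged.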


Datacentre glitches expose data loss risks

The research found that 29% of respondents had suffered one or two events of data loss because of their datacentre provider letting them down – with 18% saying they had suffered data losses three or more times during the past 12 months. Jon Arnold, managing director at Volta Data Centres, said: “Outages and data loss can be due to a variety of factors, such as network glitches, human error or inadequate maintenance, but whatever the reason, organisations need to be taking a far more robust approach to datacentre due diligence. “Where is the guarantee of 100% uptime? What power resilience is in place? How many different connectivity options are available – and do they run across different networks for greater contingency? These are all questions businesses need to ask when choosing datacentre providers – or face the risk of more downtime.” The survey also showed that 35% of organisations still locate IT assets mainly on-premise, with 29% shifting mainly to the cloud.


Culture the missing link for cybersecurity's weakest link

Hibbs said that if he could take the humans out of the loop then the risk would drop to zero, but obviously that's incompatible with the reality of human communication within and between organisations made of, you know, humans. "I think we'll always be in that state. While we do need to make them more vital team members, we need to change the culture, which is very critical to reduce it, but there'll always be that risk." But focusing on phishing awareness training and the like is "too much of a tactical response", according to Valerie Abend, who heads up Accenture's global cyber regulatory services. "In order for us to get ahead of it, more than just focusing on that phishing aspect and not further risk, the bad guys are just going to keep outsmarting us. We have to be a little bit strategic on where we're focusing raising the level of attention and awareness," Abend said. Awareness-raising and anti-phishing campaigns are important, she said, but organisations need to raise the level of board and senior management involvement in managing the risk.


Google just quietly gave us a killer midrange Android option

Google itself is selling the Pixel 2 for $649, in its lowest configuration, and you can find mint-quality used models in the $400 range. Those prices only seem likely to inch downward as time wears on. But there's more: Just think how this situation will spread starting next year, when the Pixel 2 will be two years old and yet still have a full year of pending updates under its belt. You'll essentially have a menu of price points available for any budget: the current-gen model, with three full years of updates included; the previous-gen model, with two solid years of support still ahead; and the two-year-old version, with a year's worth of foundational improvements still remaining. Google's software focus is thus not only altering the lifespan and value of a flagship phone; it's also completely changing what it means to get a midrange or budget-level phone, thanks to that cascading effect. And even if Google itself doesn't opt to keep selling those older models after a while, the used phone marketplace will provide an intriguing new level of aftermarket value.


8 ways to successfully get AI and analytics into production


“When you build a production analytic or AI system, there are two parts of the problem. One is having the right data and data access, and the other part of the problem is the analytics: actually running the software to analyze the data. Analytics applications require a lot of coordination, and with the increasingly widespread containerization of applications, it’s essential to have a way to coordinate processes running in containers. Kubernetes, an open-source orchestration system for managing deployment of containerized applications, is emerging as a leading solution. But to avoid being limited as to which applications can be containerized, you need a data platform with the capability to persist data (state) from containerized applications as a variety of data structures. This powerful combination of Kubernetes and an appropriate data platform offer a big advantage for production systems.”
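As a concrete illustration of the "persist state from containers" point, here is a minimal sketch using the official Kubernetes Python client to request persistent storage that a containerized analytics job could mount. The claim name, namespace, and size are placeholders, and the cluster is assumed to expose a default StorageClass.

```python
# Sketch: request persistent storage for a containerized analytics job.
# Names and sizes are placeholders; assumes a default StorageClass.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() in a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="analytics-state"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        resources=client.V1ResourceRequirements(
            requests={"storage": "10Gi"}
        ),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

A pod spec would then mount the claim as a volume, letting the analytics container restart or be rescheduled without losing its data.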


GreyEnergy threat group detected attacking high-value targets


Cherepanov and Lipovsky said the similarities between GreyEnergy and BlackEnergy -- overlap in malware frameworks and code, overlap in targets and regions of activity, the timing of GreyEnergy beginning activity and both groups using active Tor relays for command and control servers -- all indicate that GreyEnergy is the successor to BlackEnergy. However, although experts praised the research by ESET, not all agreed that the evidence supported the connection between the groups or any conclusions that GreyEnergy is specifically targeting ICS infrastructure. Robert Lee, founder and CEO of Dragos Inc., noted on Twitter that the GreyEnergy "tool is a general backdoor and doesn't contain ICS capabilities but neither did BlackEnergy3." "I think it's premature to make assessments on adversary intent, with only three identified victims the focus may be larger than ICS and assessing how the adversary might use the access would be low confidence at best," Lee wrote on Twitter.


CIOs and the cloud: The future of European enterprise software


“The cloud helps when providing compliance in terms of GDPR and governance,” he said. “If we didn’t use the cloud, I’m not even sure how we’d tackle those requirements. Because we use the cloud, we’ve had to work out where all our data resides and that means we’re in a great place in terms of security and legislation.” “We know where our information sits and we can then just apply policies as we need to. Speaking to other CIOs, I don’t think other businesses in other sectors are always in that position. That’s a living nightmare.” That view resonates with Martyn Wallace, chief digital officer for the Scottish Local Government Digital Office. Like Dowden, Wallace believes too many executives fear going all-in with the cloud and believe information is only safe in an internal data centre. Naysayers should recognise the power of working with a technology specialist like Amazon, Google or Microsoft, who have the weight to ensure data stays safe and secure.



Quote for the day:


"The most common way people give up their power is by thinking they don't have any." -- Alice Walker


Daily Tech Digest - October 23, 2018

A New Era of Adobe Document Cloud and Acrobat DC

PDF started out as a noun, but quickly became a verb. The adjective “portable,” an essential descriptor in the pre-internet era of 1993, doesn’t begin to capture all the rich capabilities of today’s PDF. The PDF is still portable of course, but it’s also editable, reviewable, reliable and sign-able. And it’s universal: in the last year alone, some 200 billion PDFs were opened in Adobe products. And we’re accelerating the pace of innovation in PDF and the role it will play for future generations. Today, we’re announcing major advancements in Adobe Document Cloud and Acrobat DC, redefining what’s possible with PDF. This launch includes new PDF share and review services, dramatic enhancements to touch-enabled editing on tablets, and a redesigned way to send documents for signature with Adobe Sign, all built in to Acrobat DC. It’s all about making it easier to create, share and interact with PDFs, wherever you are. New Adobe Sensei-powered functionality automates repetitive tasks and saves time, whether you’re scanning a document, filling out a form or signing an agreement.


Establishing an AI code of ethics will be harder than people think

In an attempt to highlight how divergent people’s principles can be, researchers at MIT created a platform called the Moral Machine to crowd-source human opinion on the moral decisions that should be followed by self-driving cars. They asked millions of people from around the world to weigh in on variations of the classic "trolley problem" by choosing who a car should try to prioritize in an accident. The results show huge variation across different cultures. Establishing ethical standards also doesn’t necessarily change behavior. In June, for example, after Google agreed to discontinue its work on Project Maven with the Pentagon, it established a fresh set of ethical principles to guide its involvement in future AI projects. Only months later, many Google employees feel those principles have been set aside in the company’s bid for a $10 billion Department of Defense contract. A recent study out of North Carolina State University also found that asking software engineers to read a code of ethics does nothing to change their behavior.


Quantum computing: A cheat sheet


Using quantum computation, mathematically complex tasks that are at present typically handled by supercomputers — protein folding, for example — can theoretically be performed by quantum computers at a lower energy cost than transistor-based supercomputers. While current quantum machines are essentially proof-of-concept devices, the algorithms that would be used on production-ready machines are currently being tested to ensure that the results are predictable and reproducible. At the current stage of development, a given problem can be solved by both quantum and traditional (binary) computers. As the manufacturing processes used to build quantum computers are refined, it is anticipated that they will become faster at computational tasks than traditional, binary computers. Further, quantum supremacy is the threshold at which quantum computers are theorized to be capable of solving problems that traditional computers would not (practically) be able to solve. Practically speaking, quantum supremacy would provide a superpolynomial speed increase over the best known (or possible) algorithm designed for traditional computers.
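To make "superpolynomial" concrete, the textbook example is integer factoring (this framing is standard complexity theory, not from the article). The best known classical algorithm, the general number field sieve, is superpolynomial in the size of the input, while Shor's quantum algorithm is polynomial:

```latex
% Best known classical factoring of an integer N (general number field sieve):
T_{\mathrm{GNFS}}(N) = \exp\!\left(\left(\sqrt[3]{\tfrac{64}{9}} + o(1)\right)
    (\ln N)^{1/3} (\ln \ln N)^{2/3}\right)

% Shor's quantum algorithm, for an n-bit integer (n = \log_2 N):
T_{\mathrm{Shor}}(n) = O(n^3) \quad \text{-- polynomial in the input size}
```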


The Untapped Potential in Unstructured Text


Unstructured text is the largest human-generated data source, and it grows exponentially every day. The free-form text we type on our keyboards or mobile devices is a significant means by which humans communicate our thoughts and document our efforts. Yet many companies don’t tap into the potential of their unstructured text data, whether it be internal reports, customer interactions, service logs or case files. Decision makers are missing opportunities to take meaningful action around existing and emerging issues. ... Natural language processing (NLP) is a branch of artificial intelligence (AI) that helps computers understand, interpret and manipulate human language. In general terms, NLP tasks break down language into shorter, elemental pieces, and try to understand relationships among those pieces to explore how they work together to create meaning. The combination of NLP, machine learning and human subject matter expertise holds the potential to revolutionize how we approach new and existing problems.
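The "break language into elemental pieces" step starts with tokenization. The sketch below uses only the Python standard library and an invented service-log line; production NLP pipelines layer parsing, entity recognition, and statistical models on top of this.

```python
# Tokenize free-form text and surface recurring terms -- the first,
# most elemental step of an NLP pipeline. Standard library only.
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    """Lowercase and split text into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

log_entry = "Customer reported the same login error twice; error persists."
tokens = tokenize(log_entry)
print(tokens)
print(Counter(tokens).most_common(3))  # crude signal: repeated terms
```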


How AI Is Shaking Up Banking and Wall Street

San Francisco–based Blend, for one, provides its online mortgage-application software to 114 lenders, including lending giant Wells Fargo, shaving at least a week off the approval process. Could it have prevented the mortgage meltdown? Maybe not entirely, but it might have lessened the severity as machines flagged warning signs sooner. “Bad decisions around data can be found instantaneously and can be fixed,” says Blend CEO and cofounder Nima Ghamsari. While banks are not yet relying on A.I. for approval decisions, lending executives are already observing a secondary benefit of the robotic process: making home loans accessible to a broader swath of America. Consumers in what Blend defines as its lowest income bracket—a demographic that historically has shied away from applying in person—are three times as likely as other groups to fill out the company’s mobile application. Says Mary Mack, Wells Fargo’s consumer banking head: “It takes the fear out.”


The Hidden Risks of Unreliable Data in IIoT

One of the key goals of Industry 4.0 is business optimization, whether it’s from predictive maintenance, asset optimization, or other capabilities that drive operational efficiencies. Each of these capabilities is driven by data, and their success is dependent on having the right data at the right time fed into the appropriate models and predictive algorithms. Too often data analysts find that they are working with data that is incomplete or unreliable. They have to use additional techniques to fill in the missing information with predictions. While techniques such as machine learning or data simulations are being promoted as an elixir to bad data, they do not fix the original problem of the bad data source. Additionally, these solutions are often too complex, and cannot be applied to certain use cases. For example, there are no “data fill” techniques that can be applied to camera video streams or patient medical data. Any data quality management effort should start with collecting data in a trusted environment.
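As a small illustration of a "data fill" technique, and of its limits, here is time-based interpolation over a gappy sensor series with pandas. The readings are invented; as the article stresses, this papers over missing numeric telemetry but cannot repair an untrustworthy source, a video stream, or patient records.

```python
# Fill gaps in a numeric sensor series by time-based interpolation.
# Readings are invented; this is a band-aid, not a fix for bad sources.
import pandas as pd

readings = pd.Series(
    [21.0, None, None, 22.5, 23.1],
    index=pd.date_range("2018-10-22 08:00", periods=5, freq="min"),
    name="temp_c",
)

print(readings.interpolate(method="time"))  # fills 21.5 and 22.0
```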


AI can’t replace doctors. But it can make them better.


It’s not that we don’t have the data; it’s just that it’s messy. Reams of data clog the physician’s in-box. It comes in many forms and from disparate directions: objective information such as lab results and vital signs, subjective concerns that come in the form of phone messages or e-mails from patients. It’s all fragmented, and we spend a great deal of our time as physicians trying to make sense of it. Technology companies and fledgling startups want to open the data spigot even further by letting their direct-to-consumer devices—phone, watch, blood-pressure cuff, blood-sugar meter—send continuous streams of numbers directly to us. We struggle to keep up with it, and the rates of burnout among doctors continue to rise. How can AI fix this? Let’s start with diagnosis. While the clinical manifestations of asthma are easy to spot, the disease is much more complex at a molecular and cellular level. The genes, proteins, enzymes, and other drivers of asthma are highly diverse, even if their environmental triggers overlap.


4 steps for solution architects to overcome RPA challenges


As important as it is for solution architects to talk about the technology, it’s equally crucial that they listen to concerns too. Especially when discussing automation – or even just mentioning the word robotics – architects must realize it’s natural for some groups to have anxiety about being replaced or their positions eliminated. This belief often stems from a lack of education about the technology. In times like these, solution architects need to understand where clients are coming from and leverage the questions as opportunities to communicate the real benefit of RPA, which is to make employees’ lives easier by removing tedious tasks like data entry that most workers don’t enjoy. Managers also need to know that this level of automation will likely result in greater trust in the accuracy of the data, which is something that can’t be easily assessed when humans are doing the data extraction and entry. Overcoming executives’ fears about job security is vital to a successful implementation, because with better client education comes more comfort with RPA.


Feds Charge Russian With Midterm Election Interference

It's unclear how prosecutors zeroed in on Khusyaynova, who presumably remains in Russia. The U.S. and Russia do not have an extradition treaty, which means she can likely avoid arrest indefinitely, provided she remains there. The complaint against Khusyaynova contains a surprising amount of detail about how Project Lakhta was funded and its finances managed. The level of detail suggests that investigators managed to work with a source who was part of or closely affiliated with the project. The complaint describes financial documents, emails and paperwork that allegedly lay bare the source of the project's funding and its aims. The Justice Department alleges Project Lakhta was funded and ultimately controlled by a trio of companies operating under the name of Concord, run by Yevgeniy Viktorovich Prigozhin, described in the complaint as being "a Russian oligarch who is closely identified with Russian President Vladimir Putin."


The OSI model explained: How to understand the 7 layer network model

For IT professionals, the seven layers refer to the Open Systems Interconnection (OSI) model, a conceptual framework that describes the functions of a networking or telecommunication system. The model uses layers to help give a visual description of what is going on with a particular networking system. This can help network managers narrow down problems (Is it a physical issue or something with the application?), as well as computer programmers (when developing an application, which other layers does it need to work with?). Tech vendors selling new products will often refer to the OSI model to help customers understand which layer their products work with or whether it works “across the stack”. Conceived in the 1970s when computer networking was taking off, two separate models were merged in 1983 and published in 1984 to create the OSI model that most people are familiar with today. Most descriptions of the OSI model go from top to bottom, with the numbers going from Layer 7 down to Layer 1.
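For quick reference, here are the seven layers as they are usually listed, top down; the example protocols are the standard textbook ones, not drawn from the article.

```python
# The seven OSI layers, numbered from Layer 7 down to Layer 1,
# with the usual textbook examples for each.
OSI_LAYERS = {
    7: ("Application",  "HTTP, SMTP, FTP"),
    6: ("Presentation", "TLS, character encodings, compression"),
    5: ("Session",      "session setup and teardown, RPC"),
    4: ("Transport",    "TCP, UDP"),
    3: ("Network",      "IP, routing"),
    2: ("Data Link",    "Ethernet frames, MAC addresses"),
    1: ("Physical",     "cables, radio, voltages"),
}

for num in sorted(OSI_LAYERS, reverse=True):
    name, examples = OSI_LAYERS[num]
    print(f"Layer {num}: {name:<12} e.g. {examples}")
```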



Quote for the day:


"Have courage. It clears the way for things that need to be." -- Laura Fitton


Daily Tech Digest - October 22, 2018

Saudi SoftBank relationship and the Tesla miracle, are techs set to boom or crash?
Human nature is a funny thing, and often has little to do with rational decision making. In 2008, the global economy descended into crisis; for a while it felt as if capitalism itself was tottering. The Queen of England famously asked: “Why didn’t anyone see it coming?” Actually, many did. But as a whole, economists and politicians who warned of a major crisis in the making fell under the media radar or were dismissed as doomsayers. Yet, even among these ‘Cassandras’, few anticipated the full extent of the crisis to follow. There may be a good reason for this. A new book, Crisis of Beliefs: Investor Psychology and Financial Fragility, by the economists Nicola Gennaioli and Andrei Shleifer, argues that one of the reasons why the 2008 crisis was so severe is that people changed. It seems that human psychology may at least partially explain the crash of 2008: investors’ beliefs about the level of debt or leverage that was sustainable, for example, changed. And that’s the tricky thing about predicting stock markets. Human nature, especially when it is aggregated and subjected to forces such as groupthink, is notoriously difficult to understand, let alone predict.


Firms need stronger metrics and skills to outpace cyber threats

The use of security metrics and the formation of security teams should be viewed as complementary activities, though for many organizations some upskilling will be necessary, Robinson explained. "Foundational skills such as network security, endpoint security and threat awareness still form the bedrock of a strong team,” Robinson said. “But as the cloud and mobility have become ingrained into IT operations, other skills have taken on equal or greater importance.” In order to acquire the security skills organizations require, many are primarily looking to train current employees or expand their use of third-party security expertise. New hires and new partnerships are usually secondary considerations, Robinson explained. When it comes to the use of external resources, 78 percent of companies rely on outside partners for some or all of their security needs. Many firms rely on more than one partner, another indicator of the complexity of cybersecurity, Robinson explained.


FDA Calls for 'Cybersecurity Bill of Materials' for Devices

"Because of the rapidly evolving nature of cyber threats, we're updating our [premarket] guidance to make sure it reflects the current threat landscape so that manufacturers can be in the best position to proactively address cybersecurity concerns when they are designing and developing their devices," says FDA Commissioner Scott Gottlieb, M.D. "This is part of the total product lifecycle approach to device safety, in which manufacturers must adequately address device cybersecurity from the design phase through the device's time on the market to help ensure patients are protected from cybersecurity threats." The draft guidance provides updated recommendations on cybersecurity considerations for device design, labeling and documentation that should be included in premarket submissions for agency approval of medical devices that have cybersecurity risk, FDA notes. The agency will conduct a public workshop for industry stakeholders on Jan. 29-30, 2019, to discuss the newly released draft guidance before it's finalized.


To beat Bloomberg, Symphony is letting banks’ bots talk with each other

“If you can create a whole ecosystem that connects every individual without dropping anyone, you can create a network much greater than what Bloomberg has done,” Gurle said in an interview on the sidelines of Symphony’s Innovate conference in New York recently. “The key is openness.” Gurle compares Symphony to America’s interstate highway system. In that analogy, the banks are cities and towns, and use their own cars to travel on a network Symphony has built. The advantage is that banks can use their own proprietary systems and still interface with other systems. Using Symphony, banks are deploying chatbots that “talk” amongst themselves to make and settle trades. Bots at RBC and AllianceBernstein, for example, can execute trades with each other over the Symphony platform, while BlackRock and BNP Paribas use them to settle mismatched foreign-exchange swaps.


How to make automation part of your microservices security


A modern application stack has four layers: infrastructure, data, networking and application code. At each of these layers, containers and microservices introduce a new way to deliver apps. As a result, container orchestration tools like Kubernetes are central to microservices management. While many security tools that work for standard applications produce effective results when applied to a microservices application, two aspects of microservices require additional attention and protection: application security and container security. Fortunately, there are plenty of advanced automation tools that support the fast and agile requirements of microservices security. Microservices application security is important because it involves multiple services rolled into one app. Those multiple services all work together to deliver a unified experience, and that means it's essential to perform dynamic testing on the services at the application level. In a microservices system, networking occurs between the services, as well as at the instance level.
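As a toy example of dynamic testing at the application level, the sketch below probes a service endpoint with malformed payloads inside a CI run and asserts that it fails safely. The endpoint URL and payloads are placeholders; real pipelines would use a dedicated dynamic-analysis tool rather than hand-rolled checks.

```python
# Toy dynamic test: send malformed payloads to a microservice and
# require a controlled 4xx rejection, never a 5xx crash.
# The endpoint and payloads are placeholders.
import requests

ENDPOINT = "http://orders.internal.example/api/v1/orders"

MALFORMED = [
    "{'not': 'json'",    # broken JSON syntax
    '{"qty": -1}',       # out-of-range value
    '{"qty": 1e309}',    # float overflow
]

for body in MALFORMED:
    resp = requests.post(
        ENDPOINT, data=body,
        headers={"Content-Type": "application/json"},
        timeout=5,
    )
    assert 400 <= resp.status_code < 500, (
        f"unsafe failure {resp.status_code} for payload {body!r}"
    )
```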


Samsung Starts Mass Production of Chips Using Its 7nm EUV Process Tech

Samsung’s 7LPP manufacturing technology offers impressive advantages over the company’s 10LPE specifically for mobile SoCs. Meanwhile, in a bid to make the process attractive to a broad range of potential customers, the foundry offers a comprehensive set of design-enablement tools, interface IP (controllers and PHY), reference flows, and advanced packaging solutions. The final PDK is months away, but many customers may start development of their 7LPP SoCs even with the existing set of solutions. At this point 7LPP is supported by numerous Samsung Advanced Foundry Ecosystem (SAFE) partners, including Ansys, Arm, Cadence, Mentor, SEMCO, Synopsys, and VeriSilicon. Among other things, Samsung and the said companies offer such interface IP solutions as HBM2/2E, GDDR6, DDR5, USB 3.1, PCIe 5.0, and 112G SerDes. Therefore, developers of chips or SoCs due in 2021 and onwards, which will rely on PCIe Gen 5 and DDR5, can start designing their chips right now.


9 Principles of Service Design


It’s important to realize services are not tangible goods. An interface is not a service. A product is not a service. Shostack states, “People confuse services with products and with good manners. But a service is not a physical object and cannot be possessed. When we buy the use of a hotel room, we take nothing away with us but the experience of the night’s stay. When we fly, we are transported by an airplane but we don’t own it. Although a consultant’s product may appear as a bound report, what the consumer bought was mental capability and knowledge, not paper and ink. A service is not a servant; it need not be rendered by a person. Even when people are the chosen means of execution, they are only part of the process.” This makes it quite difficult to design for services. Often, the design of a service is overlooked by organizations and decisions related to the service supporting a product are not routinely considered in relation to how they impact the overall design of an experience. This results, most often, in poor service design and a poor experience.


Why Managed Threat Hunting?

Increasingly, threat hunting is a practice that enterprises want to understand and implement. But it is not always feasible to do so in-house, given the demand for resources and skills. That's where managed threat hunting enters, says CrowdStrike's Jennifer Ayers. Ayers, VP, OverWatch and Security Response at CrowdStrike, says the in-house/managed services decision is becoming a common, pragmatic discussion. "Companies want to be able to build out all this stuff, but in reality, if you only have $100, do you want to focus that $100 on building out a threat hunting organization that might only find evil once or twice a year in your particular environment, or do you want to use that funding to shore up your defense and response to those types of attacks?" In an interview on managed threat hunting, Ayers discusses her perspective on threat hunting; in-house vs. outsourced threat hunting; and the latest threats and how to defend against them.


Public cloud management tools lacking, research finds


Network engineer Brian Keys took a look at network resiliency and why it's so difficult for enterprises to have a network that's highly available. For one thing, nobody wants to pay for the technology necessary to achieve that goal. Additionally, finding architects with the experience to design a highly available network isn't easy. Still, Keys said, enterprises can take steps to improve their network's reliability. The use of uninterruptible power supplies is a good approach. So are redundant links for branch office connectivity. But knowing which techniques are necessary and which ones are just nice to have requires careful study. "A competent network designer should be able to tell with a high degree of certainty just how resilient the network is and in which ways," Keys said. "Probably the toughest part is to explain to upper management the pros and cons of the new proposal and get their buy-in."


The Hub of All Things: Are you collecting personal data the wrong way?

It’s a radical shift from the way organisations collect and access personal data, but Holtby thinks it is better not only for consumers but for organisations too. He explained: “Most companies treat the personal data of their users in a way that is, at best, hamstrung and at worst completely dysfunctional. “I would argue in the future most companies are going to want to have a pretty clear understanding of who their users are and who their customers are. They want to know as much as they can about those people. At best even the very biggest companies, today, have a very limited understanding of who their users are. “The quintessential ‘I know who my user is’ kind of company, I would argue at the moment, is Google. Many think of Google as being the company that has the most data about its users. If you are being charitable to Google you could say it knows everything it would possibly want to know about its users, but in reality, all they have is Google’s data.



Quote for the day:


"Leaders must be good listeners. It_s rule number one, and it_s the most powerful thing they can do to build trusted relationships." -- Lee Ellis


Daily Tech Digest - October 21, 2018

Do you have a lot of receipts, business cards, or other printed documents that you want to digitize and store? If so, one tool up to this job is Microsoft's free mobile Office Lens app. With Office Lens for iOS or Android, you can use your device's camera to snap a photo of a note, card, or other document. You can capture the image as a whiteboard, a document, a business card, or a photo. Then you can edit and revise it by cropping it, flipping it, drawing on it, and adding text to it. When you're done, you can store the image as a PDF file, a Word document, a PowerPoint slide, or a OneNote file. You can also save the image to your mobile gallery or to Microsoft OneDrive; in fact, the latest version of OneDrive for iOS directly integrates Office Lens. Let's look at how to use Office Lens to capture your printed documents. First, download and install the Office Lens app on your iPhone, iPad, or Android device. Open the app and give it the necessary permission to access your photos and camera.


Continuous Integration at Intel for the Mesa Graphics Library

Mesa CI is a set of configuration files, a job scheduler and a job implementation that can run on Jenkins. Written mostly in Python, it is driven by the principle that "the most important design consideration for the Mesa CI is to minimize configuration in Jenkins". The Mesa CI can theoretically run on top of any CI infrastructure, not just Jenkins, according to the documentation. It’s currently used for developer testing, release verification, pre-silicon (hardware) testing in simulators for Intel drivers, performance testing and validation of conformance test suites. The typical developer testing turnaround time is 30 minutes even though a commit to the master branch kicks off millions of tests. A custom database provides immediate access to test history, and the system also generates performance trend lines for common benchmarks.


Integrating factor of Big data and Artificial intelligence in Business


Now, companies and businesses have the chance to explore the potential of AI with seemingly inexhaustible data as opposed to what was previously obtainable, thus unraveling all the intricate aspects of the process. As opposed to depending on sample data, experts are now able to utilize unquantifiable amounts of data. This advancement has propelled enterprises to a point where they can deliver content free of irrelevant data while offering more suggestive and extrapolative data that can be interpreted with the aid of “analytical sandboxes” or big data “centers of excellence”.  ... The landscape of artificial intelligence has experienced an explosive advancement with the accessibility of big data, thus triggering disorderly transformations. The widespread explosion of data, in addition to the advancement in the capacity to store and evaluate staggering amounts of data with efficiency and pace, is directly responsible for propelling the relevance of AI. This transcends the conventional role of analyzing data. More than ever before, AI is becoming an invaluable tool for accurate assessment and decision making.


Why Digital Banking Should Include A Human Component

While effective implementation of digital strategy is critical for banks and credit unions, human interaction cannot be ignored. Technology can be used to augment the human experience and empower both customers and employees. A common example is customer service: while chatbots and AI can be deployed to address most of the use cases, we must ensure that there are options for humans to intervene when needed, as well as human touchpoints throughout the customer journey to build trust and rapport. Bots need to be trained to learn how to empathize, and understand regional and generational differences. After all, such technology should reflect a brand’s identity and can positively (or negatively) impact customer perception. At the end of the day, technology is just a means to an end. The winning formula is not about more or less shiny new toys – but rather, leveraging appropriate technology to meet the needs of customers. The future of finance is also not about having a pretty user interface or making small incremental changes.


How Close Are We to Kubrick's AI-Controlled Vision of the Future?


HAL learned from observing its environment, watching and analyzing the words, facial expressions and movements of the human astronauts on the spaceship. It was responsible for performing rote functions such as maintaining the spaceship, but as a "thinking" computer, HAL also was capable of responding conversationally to the astronauts, Murphy explained. However, when the mission goes awry and the astronauts decide to shut HAL down, the AI discovers their plot by lip-reading. HAL arrives at a new conclusion that wasn't part of its original programming, deciding to save itself by systematically killing off the people onboard. The prospect of AI doing more harm than good may not be that farfetched. Experts suggest that weaponized AI could play a big part in future global conflicts, and the late physicist Stephen Hawking suggested that humanity might soon find AI to be the biggest threat to our survival.


Global Fintech Warning To Traditional Banks -- The Threat Is 'Real And Growing'

The UK fintech scene has been boosted by the local financial watchdog adopting EU so-called open banking rules early, forcing lenders to open up access for fintechs to the data and accounts of any clients who authorize it. Earlier this year the UK government created a crypto-assets task force, updated fintech regulation and built a UK-Australia so-called fintech bridge to help firms expand internationally. The rise in fintech firms and banking startups was sparked by the 2008 global financial crisis, which caused banks to cut back on spending and withdraw from some markets altogether — leaving a vacuum fintech companies stepped into. By using technology to make the finding, registering and lending to new customers quicker and easier these fintech companies have forced the traditional banking industry, which is famously slow to adapt, to react.


Confessions of a UX Designer


Some trends will stay in our profession. Those are usually theories based on sound foundational principles — the principles of our profession from decades of research and application. They’re like 501 Levis and a solid print t-shirt. They’ll never go out of style. But, most trends will fall by the wayside. Following them will often lead you and your project astray. As a general rule of thumb: Stay off the bandwagon, stray from the crowd and step to the beat of your own music — no matter how measured or far away. And as for those trending concepts that do stick around or seem to hold some validity — make sure you aren’t adopting them just because everyone else is. Don’t build a mobile app for a user base who would benefit more from a desktop application. Don’t adopt an idea based on an article you read citing a weak study or misrepresenting a study. In short, critically and strategically evaluate the concepts you add to your repertoire.


Embracing Conflict to Fuel Digital Innovation

When organizations try to determine the economic value of their data (EvD), there arises a natural conflict between 1) keeping all the data because of its potential monetization value versus 2) the potential storage and data management costs, not to mention potential fines and liabilities associated with data security and privacy breaches of that data, which highlights the following conflicts: Maximizing Value – Data assets have considerable potential economic or financial value they can add in terms of new revenue opportunities, process efficiencies, cost reductions, risk mitigation, etc. Monetizing these data sources is the key to unlocking the potential in the big data era; and Minimizing Risk – Many organizations do not fully quantify the costs and risks associated with the corporate data. Denial of access to data, such as we recently saw with the global WannaCry cyberattack, is just one example of the risk inherent in underappreciating reliance on data. Data has both present and future value – and only once that value is fully understood can the risk be mitigated.
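One way to frame the tradeoff the passage describes is as a simple expected-value inequality (this formulation is illustrative, not the author's):

```latex
% Keep and monetize a data asset only while its expected net value is positive:
\mathrm{EvD}_{\mathrm{net}} = V_{\mathrm{monetization}}
    - C_{\mathrm{storage+management}}
    - \Pr(\mathrm{breach}) \cdot L_{\mathrm{fines+liability}} > 0
```

The hard part in practice is estimating the last term, which is exactly the risk the passage says most organizations fail to quantify.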


Agile Implementation from a Manager's Perspective


What is this strange role of an Agile Coach? On the board of the organization, a group of people appeared who were to carry the torch of agile education. They were to stimulate the agile development of software in teams, remove obstacles, talk to teams and their managers, and encourage them to think differently about the development of software and discourage the use of old methods. They started talking with my people, asking different questions over a cup of coffee or whatnot. They were sitting together with them in open spaces and observing what was happening. What were they talking about with my people? Or even worse, who were they informing, and about what? They sneaked from one meeting to another like a mysterious Agent Smith. What was going on? Why change something that works? I did not see the need to introduce Scrum, as my team was reaching the goals set for them without it. Better is an enemy of good ... do not touch it because you will get burnt.


AI, cybersecurity shape the CIO agenda for 2019 as IT budgets rise

Dynamism—the ability to embrace change and adopt technology—is the biggest predictor of digital transformation success, Mike Harris, executive vice president of research at Gartner, said during the opening keynote address at the Gartner Symposium/ITxpo. However, privacy is a top barrier to becoming dynamic. "If you don't successfully master privacy, your entire digital transformation is at risk," Harris said during the keynote. Businesses are increasingly scaling their digital efforts, the survey found: 33% of CIOs worldwide said they had evolved their digital endeavors to scale, up from 17% the year before. The major driver for scale is increasing consumer engagement through digital channels, the survey found. "The ability to support greater scale is being invested in and developed in three key areas: Volume, scope and agility. All aim at encouraging consumers to interact with the organization," Rowsell-Jones said in the release. 



Quote for the day:


"If you can't swallow your pride, you can't lead. Even the highest mountain had animals that step on it." - Jack Weatherford


Daily Tech Digest - October 20, 2018


Habits, it seems, get in the way of change despite our best intentions. “Habits are triggered without awareness — they are repeated actions that worked in the past in a given context or in a similar experience,” she notes. Wood’s research shows that concentrating on changing unwanted behaviors, and then creating new ones — not focusing on motivation — is the key to making change. She cites various efforts aimed at changing smoking habits in the U.S. from 1952 to 1999. Smoking decreased not when smokers were made aware of the health risks, but when buying and smoking cigarettes was made more difficult and less rewarding. Thus, higher taxes, smoking bans in public places, and limits on point-of-purchase ads — which add friction to smoking — were a more effective deterrent than warning labels on cigarette packages and public service advertising about smoking’s negative effects. A similar strategy of changing the context is possible in the workplace: Make old actions more difficult; make new, desired actions easier and more rewarding.


7 Ways A Collaboration System Could Wreck Your IT Security


Before an IT group blithely answers the call for a collaboration system – by which we mean groupware applications such as Slack, Microsoft Teams, and Webex Teams – it's important to consider the security risks these systems may bring. That's because the same traits that make these, and similar, applications so useful for team communications also make them vulnerable to a number of different security issues. From their flexibility for working with third-party applications, to the ease with which team members can sign in and share data, low transactional friction can easily translate to low barriers for hackers to clear. When selecting and deploying collaboration tools, an IT staff should be on the lookout for a number of first-line issues and be prepared to deal with them in system architecture, add-ons, or deployment. The key is to make sure that the benefits of collaboration outweigh the risks that can enter the enterprise alongside the software.


Apache Kafka: Ten Best Practices to Optimize Your Deployment


A running Apache ZooKeeper cluster is a key dependency for running Kafka. But when using ZooKeeper alongside Kafka, there are some important best practices to keep in mind. The number of ZooKeeper nodes should be maxed at five. One node is suitable for a dev environment, and three nodes are enough for most production Kafka clusters. While a large Kafka deployment may call for five ZooKeeper nodes to reduce latency, the load placed on nodes must be taken into consideration. With seven or more nodes synced and handling requests, the load becomes immense and performance might take a noticeable hit. Also note that recent versions of Kafka place a much lower load on ZooKeeper than earlier versions, which used ZooKeeper to store consumer offsets. Finally, as is true with Kafka’s hardware needs, provide ZooKeeper with the strongest network bandwidth possible. Using the best disks, storing logs separately, isolating the ZooKeeper process, and disabling swaps will also reduce latency.


The Evolution of Mobile Malware


Mobile malware isn’t just an opportunistic tactic for cybercriminals. Kaspersky Lab is also seeing its use as part of targeted, prolonged campaigns that can affect many victims. One of the most notable discoveries this year was Skygofree. It is one of the most advanced mobile implants that Kaspersky Lab has ever seen. It has been active since 2014, and was designed for targeted cyber-surveillance. It is spread through web pages, mimicking leading mobile network operators. This was high-end mobile malware that is very difficult to identify and block, and the developers behind Skygofree have clearly used this to their advantage: creating and evolving an implant that can spy extensively on targets without arousing suspicion. ... In recent times, rooting malware has been the biggest threat to Android users. These Trojans are difficult to detect, boast an array of capabilities, and have been very popular among cybercriminals. Once an attacker has root access, the door is open to do almost anything.


What is the CMO's Technology Strategy for 2019 and Beyond?

Even the CMOs that don’t have the technological background are becoming more tech savvy. Integrate CMO Vaughan said he considers himself and his fellow marketers technology investors, trying to manage a portfolio of tech to provide efficiency, effectiveness and unique capabilities for the company. “We view technology as an enabler of our strategy and an important part of advancing our marketing capabilities,” Vaughan said. “We have tried to be very disciplined about not buying tech for tech sake, which is not always easy to do today with so many options. We start with the strategy, what we are trying to accomplish and build a roadmap, including ROI and an adoption plan and model for each technology we evaluate.” Vaughan said CMOs should know what is available and at their disposal to differentiate and accelerate their strategy. “This does not mean you have to be a technology expert,” he said.


Privacy, Data, and the Consumer: What US Thinks About Sharing Data

Preventing data from being lost or stolen is the most obvious “table stake” for consumers. Just as important is the question of whether marketers should have it in the first place. This links clearly to the likes of GDPR in Europe, where the bar has been raised for all organizations around justification of the data they hold. But if we have the right data, for the right reasons, if we keep it safe and if we can make it more transparent how we’re using that data to provide a more respectful, personalized, fairer and rewarding service to the consumer, the trust will grow. Equally, we need to trust the consumer, again by providing transparent access to the data we hold, clarity around how we use it and the ability for them to control their data. Overall, the research shows that while consumers are rightly concerned about data privacy, they are also aware that data is an essential part of today’s economy, with 57% on average, globally, agreeing or strongly agreeing. Factor in the neutrals and around two-thirds of consumers are accepting or neutral around data use in today’s data-driven, data-enabled world.


NHS standards framework aims to set the bar for quality and efficiency


Although most of the standards in the framework aren’t necessarily new, they are “intended to be a clear articulation of what matters the most in our standards agenda, and is accompanied by a renewed commitment to their implementation,” said NHS Digital CEO Sarah Wilkinson in the framework’s foreword. Speaking at the UK Health Show on 25 September, Wilkinson said the potential for use of data in the NHS is huge, but the health service needs to get to grips with standards to reap the benefits. Most of the standards in the framework, which is currently in beta form and out for consultation, are based on international ones, though some are specialised for the NHS. This includes using the NHS number as a primary identifier – a standard which has been in place for a long time, but has had mixed results in uptake. The framework said the standard “is live now and should be adhered to in full immediately”.


Open Banking has arrived, whether you like it or not

Australia has introduced Open Banking rules that will force the banks to share data with trusted Third-Party Providers (TPPs) by June 2019; Mexico has introduced a Fintech Law; South Korea and Singapore have enforced rules around financial data sharing between banks and third parties; and the USA has seen several banks innovating around open financial structures, although there is no law enforcing them to do this, yet. What intrigues me about the market movements is that some large financial players are taking a lead in this space, such as Citibank and Deutsche Bank’s open API markets, whilst some are resisting the change. I have heard several reports in the UK that the large banks have made data sharing incredibly difficult for the customer, by making the permissioning process very onerous and time-consuming. Equally, the implementation of European rules under PSD2 has seen several Fintech firms cry foul, as each bank creates its own interpretation, and therefore API interface, of the law.


How Data Changed the World


Running a city is always a challenging task. With Big Data, however, comes new opportunities alongside new challenges. Instead of having to rely on surveys and manually tracking how people move throughout an area, cities can instead rely on sensor-derived data, providing far greater resolution and a pool of data to draw from orders of magnitude larger than ever before available. Many of these advances may seem a bit mundane at first; developing improved traffic routes, for example, is unlikely to garner many headlines. However, these changes lead to concrete improvements, saving travelers time and improving overall quality of life. Furthermore, Big Data-derived improvements can inform city planners when deciding which direction their cities will take in the future. Before launching large and expensive projects, city managers will be able to look at information gleaned from Big Data to determine what the long-term effects will be, potentially changing cities in fundamental ways.


Give REST a Rest with RSocket


An often-cited reason to use REST is that it’s easy to debug because it’s “human readable”. Not being easy to read is a tooling issue. JSON text is only human readable because there are tools that allow you to read it – otherwise it’s just bytes on a wire. Furthermore, half the time the data being sent around is either compressed or encrypted — both of which aren’t human readable. Besides, how much of this can a person “debug” by reading? If you have a service that averages a tiny 10 requests per second with a 1 kilobyte JSON payload, that is equivalent to 860 megabytes of data a day, or 250 copies of War and Peace every day. There is no one who can read that, so you’re just wasting money. Then, there is the case where you need to send binary data around, or you want to use a binary format instead of JSON. To do that, you must Base64 encode the data. This means that you essentially serialize the data twice — again, not an efficient way to use modern hardware.
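The back-of-the-envelope figures above are easy to verify, along with the roughly one-third size penalty Base64 imposes when binary data is forced into JSON (the 3 KB payload below is invented):

```python
# Check the article's traffic arithmetic and the Base64 size penalty.
import base64
import os

# 10 requests/second x 1 KB each, over one day:
bytes_per_day = 10 * 1_000 * 86_400
print(f"{bytes_per_day / 1e6:.0f} MB/day")  # 864 MB/day -- the "~860 MB" above

# Base64 emits 4 output characters for every 3 input bytes:
payload = os.urandom(3_000)  # invented binary blob
encoded = base64.b64encode(payload)
print(len(encoded) / len(payload))  # 1.333...
```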



Quote for the day:


"Managers maintain an efficient status quo while leaders attack the status quo to create something new." -- Orrin Woodward