Daily Tech Digest - February 28, 2019

Risk Based Security, the private operator of competing database VulnDB, aired their grievances with the public CVE/NVD system in their 2018 Vulnerability Trends report, released Wednesday, with charged conclusions including "there is fertile grounds for attorneys and regulators to argue negligence if CVE/NVD is the only source of vulnerability intelligence being used by your organization," and "organizations are getting late and at times unreliable vulnerability information from these two sources, along with significant gaps in coverage." This criticism is neither imaginative nor unexpected from a privately-owned competitor attempting to justify their product. In fairness to Risk Based Security, there is a known time delay in CVSS scoring, though they overstate the severity of the problem, as an empirical research report finds that "there is no reason to suspect that information for severe vulnerabilities would tend to arrive later (or earlier) than information for mundane vulnerabilities."

Will Digital Banking and Cashless Economies Lead to Chaos?

Digital banking systems may be ‘convenient,’ but there is little doubt that they often fail, with the consequences of a failure being significant. On June 1, 2018, shoppers in the United Kingdom were left stranded, unable to make purchases with their Visa cards. The outage, which lasted for several hours, caused significant disruption and exemplified the problems of monopolized reliance on digital infrastructure. In another example, TSB, a leading British retail and commercial bank, recently faced scrutiny for its mishandling of the migration of its digital infrastructure, which left thousands of customers unable to access their online and mobile banking accounts for up to five days. According to the Financial Conduct Authority, financial institutions in the United Kingdom have reported a 138 percent increase in technology outages and an 18 percent increase in “cyber incidents” this year to date. ... As transactions move online, the amount of data available about one’s finances and purchasing habits increases. Does the current digital infrastructure have appropriate safeguards to protect against data breaches?

Most AI development is now ultimately directed towards a basic goal: building AI models that can aptly substitute for direct human effort. This need arises in recognition of the inadequacies of human labor, which is prone to inaccuracy, inefficiency and other failures. For example, artificial intelligence has been pointed to as having the potential for more accurate medical practice; a surgical procedure guided by such a system could be more precise than most humans can currently manage. Hence, we can say that the benefits of artificial intelligence to our world are precisely the opposites of the inadequacies of human effort. However, even though work is ongoing to build out the usefulness of this technology, truly significant achievements are yet to come. AI is all around us, but oftentimes we don’t notice it. For instance, Facebook uses AI technology for its image recognition.

The FTC Probably Doesn't Need A New 'Big Tech' Task Force. It Just Needs To Do Its Job

While there are certainly a lot of solid complaints to be made about "big tech" giants like Facebook and Google (especially on the privacy front), it's also pretty obvious that a lot of the recent criticisms of "big tech" aren't being made in good faith. Claims of "censorship" of conservatives, for example, usually don't hold up under scrutiny, and are often driven by folks who wouldn't be facing these problems if they didn't behave like legendary assholes on the internet in the first place. Similarly, a lot of the recent criticism of big tech is coming from telecom giants eager to saddle Silicon Valley giants with unnecessary regulation in a bid to hoover up a bigger slice of the online advertising pie. On the one hand, telecom giants like AT&T and Verizon just got done convincing the FCC to effectively neuter itself, leaving any remaining oversight in the lap of an FTC they know won't (and often can't) hold them accountable.

How To-Do is integrating with more and more of the Microsoft ecosystem

To-Do integration is much simpler than these older ways of connecting tasks to Outlook. Sign in to both Outlook (or Outlook.com) and To-Do with the same account, and your tasks and the lists you organise them into will just sync between the two tools. (If it's a Microsoft account, it has to use outlook.com, not a Yahoo or Gmail email address.) You can create tasks and mark them as complete in either app, and drag tasks from one list to another in either app. Even the emoji you can use in list names to customise the To-Do icons appear in Outlook. Under the covers, To-Do is rather like a specialised viewer for Outlook and Exchange tasks, although it doesn't support all the Outlook task features. You can only have one due date, rather than separate start and end dates; task statuses like in-progress or 25 percent complete, and details like mileage, won't show up in To-Do; and you can't set task work hours, different priority levels or assign an Outlook category.

A government perspective: Tech Trends 2019

Many public organizations are finding that each individual advancement in technology—for example, blockchain, or digital reality, or serverless architecture—is powerful, but that the real power emerges when they combine. Finding jobs that new technologies can do is a first-level challenge. Finding ways to integrate a constellation of new technologies into a new operational paradigm is the next-level challenge that’s unfolding right now. Public-sector organizations have much to learn from each other. They can draw useful lessons from their counterparts in private enterprise, and indeed from other nations. Each agency is on a path toward greater digital adoption, but they’re at different places on that journey. What do they have in common? A commitment to mission-driven service.

It’s time to start some serious research into the ethics of AI

There was a general view among panellists that the need for more AI ethics research should not be read as a need for more regulation. Elisabeth Ling, managing director of researcher products and research metrics at Elsevier, said that among members of the European Commission’s high-level expert group for AI - of which she is a member - the ethics debate is, “hard and hot.” However, “There seems to be a consensus that jumping to regulation would not be the right thing to do,” she said. “We already have quite strong laws in place in Europe.” It is important to distinguish between regulating algorithms and regulating the way they are used, said Nick Jennings, vice provost for research and enterprise at Imperial College London. In the former case, “I can’t think of a sensible way in which that would make sense,” he said. But, “when [algorithms have] been trained and have data inside them and they can make decisions and they’re being used in a given application, then it is a different business.”

Why the industrial IoT is more important than consumer devices

8 surprising IoT trends to watch for in 2019
“The edge is basically any place — a wind farm, a factory — where data is generated, analyzed, and largely stored locally,” Nelson said. “Wait? Isn’t that just a data center? Sort of. The difference is the Internet of Things.” His point is that most of the vast amounts of data that is machine-generated doesn’t need to go very far. “The people who want it and use it are generally in the same building,” he noted, quoting Gartner’s prediction that more than 50 percent of data will be generated and processed outside traditional data centers — on the edge — although “snapshots and summaries might go to the cloud for deep analytics.” But Nelson wasn’t sure about what kind of edge architectures would prevail. The edge might function like an interim way station for the cloud, he noted, or we could see the emergence of “Zone” networking — edges within edges — that can conduct their own analytics and perform other tasks on a smaller, more efficient scale.

VMware offers pure open-source Kubernetes, no chaser

Unless you've been hiding under a rock in the IT world, you know Kubernetes, the container orchestration program of choice, is hotter than hot. Everyone's using it, adding on to it, offering it as a service, the list goes on and on. But VMware wants you to know that, if all you want is Kubernetes without all the fancy trimmings, well, it can give you that, too, with VMware Essential PKS. Essential PKS includes upstream Kubernetes; reference architectures to inform design decisions; and expert support to proactively guide you through upgrades or maintenance and help you troubleshoot it if you need a hand. That's all. That's it. If that sounds familiar, well it should. Last November, VMware acquired Heptio. This company, which was founded by two Kubernetes creators, Joe Beda and Craig McLuckie, used essentially this business model. Indeed, you could argue that VMware Essential PKS is just a new coat of paint on Heptio's previous offerings.

Monitoring and Managing Workflows Across Collaborating Microservices

In its essence, orchestration for me means that one service can command another to do something. That's it. That's not tighter coupling; it is just coupled the other way round. Take the order example. It might be a good idea that the checkout service just emits an order placed event but does not know who processes it. The order service listens to that order placed event. The receiver knows about the event and decides to do something about it; the coupling is on the receiving side. It is different with the payment, because it would be quite unnatural for the payment service to know what the payment is for. But it would need that knowledge in order to react to the right events, like order placed or order created. This also means it has to be changed whenever you want to receive payments for new products or services. Many projects work around this unfavorable coupling by issuing payment required events, but these are not really events, as the sender wants somebody else to do something; they are commands in disguise.
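The two coupling directions described above can be sketched in a few lines of Python. The bus, service names and payload here are hypothetical stand-ins for real messaging middleware, used only to show where the knowledge sits: an event's sender does not know its receivers, while a command's sender does.

```python
# Toy sketch (hypothetical names) of choreography vs. orchestration.

class Bus:
    """A minimal in-memory message bus standing in for real middleware."""
    def __init__(self):
        self.handlers = {}

    def subscribe(self, topic, handler):
        self.handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self.handlers.get(topic, []):
            handler(payload)

bus = Bus()
log = []

# Choreography: the checkout service emits an event and is done.
# It has no idea who, if anyone, will process it.
def checkout(order_id):
    bus.publish("order_placed", {"order": order_id})

# The order service decided *itself* to listen -- the coupling
# is on the receiving side.
def order_service(event):
    log.append(f"order {event['order']} registered")
    collect_payment(event["order"], amount=42)  # explicit command

# Orchestration: the order service commands the payment service,
# so the payment service never needs to know what the payment is for.
def collect_payment(order_id, amount):
    log.append(f"payment of {amount} collected for order {order_id}")

bus.subscribe("order_placed", order_service)
checkout("A-1")
```

In this sketch, adding a new product line means changing only the service that issues the payment command, not the payment service itself, which is exactly the coupling trade-off the paragraph describes.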

Quote for the day:

"He who cannot be a good follower cannot be a good leader." -- Aristotle

Daily Tech Digest - February 27, 2019

Debunking five myths about process automation

Software solutions can be costly, nearly clearing out entire IT budgets in one fell swoop and running up tabs for maintenance and service fees down the road. Often the change in productivity is not nearly enough to compensate for the forfeited capital, leaving management with a disappointing return on investment. Thankfully, not all systems carry a hefty price tag. Cloud-based apps have disrupted the pricing scale with rates so low even small non-profit organizations can automate workflows without breaking the budget. SaaS models and cloud storage enable low, flexible monthly rates and only a small one-time start-up fee. ... Automation systems offering no-code platforms solve multiple problems at once. Since data is securely stored on the cloud, server space isn’t required. The tedious responsibilities of creating new versions, fixing bugs, and maintaining the software fall on the service provider and don’t incur additional expenses for the user; it’s all neatly packed into the monthly rate.

How to move to a disruptive network technology with minimal disruption

Start with the open source and open specification projects, suggests Amy Wheelus, network cloud vice president at AT&T. For the cloud infrastructure, the go-to open source project is OpenStack, with many operators and different use cases, including at the edge. For the service orchestration layer, ONAP is the largest project in open source, she notes. "At AT&T we have launched our mobile 5G network using several open source software components, OpenStack, Airship and ONAP." Weyrick recommends "canarying" traffic before relying on it in production. "Bringing up a new, unused private subnet on existing production servers alongside existing interfaces and transitioning less-critical traffic, such as operational metrics, is one method," he says. "This allows you to get experience deploying and operating the various components of the SDN, prove operational reliability and gain confidence as you increase the percentage of traffic being transited by the new stack."

Wearable technology in the workplace and data protection law

Wearable technology is not always quite as extreme, with many employees reaping the benefits of fitness bands and smart watches. Wearable technology can also be used to help keep employees safe. For example, Oxfordshire County Council recently announced that waste recycling teams will be fitted with body cameras to deter physical and verbal abuse from the public. Whatever the technology, there will always be arguments for and against the introduction of workplace accessories, with the importance of wellbeing, safety and productivity, balanced against the adverse costs, legitimate privacy concerns, risks of discrimination and potential staff morale issues. However, given the breadth of personal data the technologies are likely to obtain, and the real risk of over-collection or that the data is used for an illegitimate purpose, the biggest adversary for wearable technology in the workplace is likely to be data protection law.

What is BDD and What Does it Mean for Testers?

The BDD approach favors system testability. I dare to say that this scheme works better with microservice architecture than with monolithic systems because the former allows adding and working on all layers independently per feature. This scheme facilitates testing as we think about the test before any line of code, thus providing greater testability. This reminds me of the Scrum Master course I took by Peregrinus, where the instructor mentioned that in Agile, the important thing is to know how to "cut the cake." Think of a multi-layered cake. Each layer adds a new flavor that is to be combined with the other layers. If the cake is cut horizontally, some people may only get the crust, some a chocolate middle layer, another a layer of vanilla, or a mix of the two, etc. In this scenario, no one would get to taste the cake as it was fully meant to be enjoyed and no one could say they actually tasted the cake.
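The "think about the test before any line of code" idea above is usually expressed in a Given/When/Then structure. Here is a minimal sketch in plain Python; the `Cart` class and the scenario are hypothetical, invented only to show how a behaviour-level test describes one vertical slice of a feature before the implementation exists.

```python
# A behaviour written first, Given/When/Then style; the implementation
# below it is then written just to make the behaviour pass.

class Cart:
    """Hypothetical shopping cart used to illustrate a BDD-style test."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def test_customer_sees_running_total():
    # Given an empty shopping cart
    cart = Cart()
    # When the customer adds two items
    cart.add("book", 10)
    cart.add("pen", 2)
    # Then the cart shows the combined total
    assert cart.total() == 12

test_customer_sees_running_total()
```

Note that the scenario cuts "vertically" through the feature, from user action to observable outcome, rather than testing one horizontal layer in isolation, which is the cake-cutting point made above.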

The danger of having a cloud-native IT policy

The core benefit is simplicity. Because you’re using only the native interfaces from a single cloud provider, there is no heterogeneous complexity to worry about for your security system, database, compute platforms, and so on; they are uniform and well-integrated because they sprung from the same R&D group at a single provider. As a result, the cloud services work well together. A cloud-native policy limits the cloud services you can use, in return for faster, easier deployment. The promise is better performance, better integration, and a single throat to choke when things go wrong. The downside is pretty obvious: Cloud-native means lockin, and while some lockin is unavoidable, the more you’re using cloud-native interfaces and APIs, the more you’re married to that specific provider. They got ya. In an era where IT is trying to get off of expensive, proprietary enterprise databases, platforms, and software systems, the lockin aspect of cloud-native computing may be an understandably tough sell in most IT departments.

5G use cases for the enterprise: Better productivity and new business models

With 5G potentially allowing much faster download speeds and lower latency for users, he argued that a smartphone will have the same — if not better — potential for productivity as a PC. "With 5G, if you want to run something that requires a lot of processing power that's traditionally only on PCs, you'll be able to run that using the edge capability of the operator on 5G. So all of a sudden, your smartphone starts to become a more powerful platform for productivity," Amon said. Qualcomm's enthusiasm for 5G was unsurprisingly echoed by Huawei. "With a super-fast connection and with low latency, we could put a lot of things, heavy things in the cloud — in our hand," said Li Changzhu, VP of the handset product line at Huawei. The Chinese telecommunications company has used MWC to showcase its brand new Huawei Mate X foldable 5G smartphone. With a large fold-out screen and 5G connectivity, Huawei is positioning it at least in part as an enterprise productivity device.

Why AI and ML are not cybersecurity solutions--yet

Strong AI, capable of learning and solving virtually any set of diverse problems akin to an average human, does not exist yet, and it is unlikely to emerge within the next decade. Frequently, when someone says AI, they mean Machine Learning. The latter can be very helpful for what we call intelligent automation - a reduction of human labor without loss of quality or reliability of the optimized process. However, the more complicated a process is, the more expensive and time-consuming it is to deploy a tenable ML technology to automate it. Often, ML systems merely assist professionals by taking care of routine and trivial tasks, empowering people to concentrate on more sophisticated ones. ... Although application security best practice has been discussed for years, there are still regular horror stories in the media, often due to a failure in basic security measures. Why are the basics still not being followed by significant numbers of businesses?

Microsoft boosts HoloLens performance, seeks corporate users

Speaking during the product’s launch, Microsoft Chief Executive Officer Satya Nadella said Microsoft had a responsibility to be a “trusted partner” for companies using its products, such as HoloLens, and that businesses and institutions shouldn’t be dependent on the tech giant. “The defining technologies of our times, AI and mixed reality, cannot be the province of a few people, a few companies or even few countries,” he said. “They must be democratized so everyone can benefit.” Unlike virtual reality goggles, which block out a user’s surroundings, the augmented-reality HoloLens overlays holograms on a user’s existing environment, letting them see things like digital instructions on complex equipment. Microsoft is focusing on corporate customers with HoloLens, and is trying to make the devices more useful right out of the box with prepared applications, rather than require months to write customized programs, says spokesman Greg Sullivan.

Security is battling to keep pace with cloud adoption

The survey found that enterprises are inadvertently introducing complexity into their environments by deploying multiple systems on-premise as well as across multiple private and public clouds. That complexity is compounded by a lack of integrated tools and training that are needed to manage and secure hybrid cloud environments. Respondents also cited a lack of integration across tools, and shortage of qualified personnel or insufficient training for using the tools, as key roadblocks to achieving cross-environment security management. While 59% of respondents said they use two or more different firewalls in their environment and 67% said they are also using two or more public cloud platforms, only 28% said they are using tools that can work across multiple environments to manage network security. 

Western Digital launches SSDs for different enterprise use cases

The SN630 is a read-intensive drive rated at two drive writes per day, which means it has the endurance to write the full capacity of the drive twice per day. So a 1TB version can write up to 2TB per day. But these drives are smaller capacity, as WD traded capacity for endurance. The SN720 is a boot device optimized for edge servers and hyperscale cloud with a lot more write performance. Random write is 10x the speed of the SN630 and is optimized for fast sequential writes. Both use NVMe, which is predicted to replace the ageing SATA interface. SATA was first developed around the turn of the century as a replacement for the IDE interface and has its legacy in hard disk technology. It is the single biggest bottleneck in SSD performance. NVMe uses the much faster PCI Express protocol, which is much more parallel and has better error recovery. Rather than squeeze any more life out of SATA, the industry is moving to NVMe in a big way at the expense of SATA. IDC predicts SATA product sales peaked in 2017 at around $5 billion and are headed to $1 billion by 2022.
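The drive-writes-per-day (DWPD) rating above is just arithmetic: daily write budget is capacity times DWPD, and multiplying by the warranty period gives total bytes written. The sketch below assumes a 5-year warranty, which is a common figure for enterprise SSDs but not stated in the article.

```python
# Back-of-the-envelope endurance arithmetic for a DWPD-rated SSD.
# Assumption (not from the article): a 5-year warranty period.

def daily_write_budget_tb(capacity_tb, dwpd):
    """How many TB the drive is rated to absorb per day."""
    return capacity_tb * dwpd

def total_tbw(capacity_tb, dwpd, warranty_years=5):
    """Total terabytes written (TBW) over the warranty period."""
    return capacity_tb * dwpd * 365 * warranty_years

# A 1TB drive at 2 DWPD, as in the SN630 example above:
print(daily_write_budget_tb(1, 2))  # -> 2 (TB per day)
print(total_tbw(1, 2))              # -> 3650 (TB over 5 years)
```

This is why trading capacity for endurance works: at a fixed TBW budget of flash, a smaller drive can sustain a higher DWPD rating.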

Quote for the day:

"It's not the size of the dog in the fight, it's the size of the fight in the dog." -- Mark Twain

Daily Tech Digest - February 26, 2019

Are Cloud Portability and Interoperability Even Possible?

Interoperability, or the ability for computers to share information with each other, is the basis of cloud portability. Data and applications might be created with a specific operating system or runtime in mind. That makes it tricky to move that information to a different environment without any problems. Even if the functions of a system are the same, the foundation it’s based upon is crucial for application stability. Because it’s fundamental for cloud portability to work, cloud providers need to support interoperability between each other’s systems. Unfortunately, this rarely happens, for two major reasons: native features and the lack of standards. Just like any other kind of service, cloud providers present native features in their services. These are features that the provider specializes in, or that no other provider offers. They go beyond the basics of cloud computing and are designed to give developers additional resources for their applications.

How Cloud Computing Is Changing Schools And WorkPlaces
Technology can replace additional employee costs, minimize geographical differences and help project a professional image. Technology enhancements may require employee training to ensure that new devices are used correctly and integrated seamlessly into everyday business processes. Tools such as Evernote and Dropbox let people access files and share them with anyone, from anywhere, without a physical presence. Integrating technology into the workplace to increase productivity is important for staying competitive. Over the past five years, technology has rapidly changed and developed in every conceivable field. Smartphones are now able to act as standalone computing devices that can take pictures, browse the internet, send or receive emails and text messages and, of course, make phone calls. Instead of having to wait a week for files to be sent by mail, information can be transferred instantly via email or file-sharing programs. Technology has made the world much smaller, especially in the business context. People from different cultures interact regularly.

Europe is prepared to rule over 5G cybersecurity

The theme of this year’s show is “intelligent connectivity”; the notion that the incoming 5G networks will not only create links between people and (many, many more) things but understand the connections they’re making at a greater depth and resolution than has been possible before, leveraging the big data generated by many more connections to power automated decision-making in near real time, with low latency another touted 5G benefit (as well as many more connections per cell). Futuristic scenarios being floated include connected cars neatly pulling to the sides of the road ahead of an ambulance rushing a patient to hospital — or indeed medical operations being aided and even directed remotely in real-time via 5G networks supporting high resolution real-time video streaming. But for every touted benefit there are easily envisaged risks to network technology that’s being designed to connect everything, all of the time.

IT’s Vital Role in Multi-cloud, Hybrid IT Procurement

Dillingham suggests that organizations consider all types of deployments in terms of costs. Large, existing investments in data center infrastructure will continue to serve a vital interest, yet many types of cloud deployments will also thrive. And all workloads will need cost optimization, security, compliance, auditability, and customization. He also recommends businesses seek out consultants to avoid traps and pitfalls, which will help better manage their expectations and goals. Outside expertise is extremely valuable not only with customers in the same industry, but also across industries. “The best insights will come from knowing what it looks like to triage application portfolios, what migrations you want across cloud infrastructures, and the proper set up of comprehensive governance, control processes, and education structures,” explains Dillingham. Gardner added that systems integrators, in addition to some vendors, are going to help organizations make the transition from traditional IT procurement to everything-as-a service.

Are Frameworks Good or Bad, or Both?

A library is defined by Van Buul as a body of code, consisting of classes and functions, which is used by an application, but without being part of that application. An application interacts with the library by doing function or method calls into the library. He defines a framework as a special kind of library where interaction is the other way around, an application now implements interfaces in the framework, or use annotations from it. During execution the framework invokes application code; using a library it’s the other way around. Creating an application without using a framework is for Van Buul somewhat of a mirage, claiming that even if you just use the Java platform, you are also using a framework. He points out that the Java platform is an abstraction over the operating system and the machine and that the platform is invoking the application code. He also notes that most business applications have a web-based interface and use an abstraction layer to create entry points into the application — meaning a framework is used.
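The inversion Van Buul describes can be shown in a few lines. This is an illustrative sketch, not taken from his talk: the library function is called by the application, while the toy framework registers application code and invokes it itself.

```python
# Library style: the application calls into the code.
def slugify(title):
    """An ordinary 'library' function; the app drives the call."""
    return title.lower().replace(" ", "-")

print(slugify("Hello World"))  # -> hello-world

# Framework style: the application registers code; the framework calls it.
class MiniFramework:
    """A toy web-framework shell illustrating inversion of control."""
    def __init__(self):
        self.routes = {}

    def route(self, path):
        # The app supplies handlers via this decorator...
        def register(handler):
            self.routes[path] = handler
            return handler
        return register

    def dispatch(self, path):
        # ...and the framework, not the app, invokes them at runtime.
        return self.routes[path]()

app = MiniFramework()

@app.route("/hello")
def hello():
    return "Hello World"

print(app.dispatch("/hello"))  # -> Hello World
```

The "don't call us, we'll call you" shape of the second half is what distinguishes a framework: the control flow belongs to the framework, and the application only fills in the blanks.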

The end of Blu-ray

Why? It's all because of streaming. The numbers speak for themselves. According to the Digital Entertainment Group 2018 home entertainment report, we spent more than ever on video last year, $23.3 billion, up 11.5 percent from 2017. Of that, subscription streaming -- led by Netflix, Amazon Prime Video, and Hulu -- took the lion's share, with a 30 percent year-over-year rise to $12.9 billion. We also bought and rented another $4.55 billion worth of online movies and TV shows. Blu-ray? Even with the growing popularity of 4K Blu-ray, movie and TV show sales only came to $4.03 billion. That's a 14.6-percent drop from 2017. At the same time, sales of far less expensive streaming devices from companies like Roku are growing. By NPD's count, streaming player sales grew from 33.3 million in November 2015 to 67.8 million in November 2018. A recent Deloitte study shows that in the past 10 years the number of households subscribing to paid streaming-video services has grown by nearly 500 percent.

Shopping for AI talent? Beware of unicorns

When executives woke up to the potential of big data, it was also at the same time we were dealing with financial collapse. Balancing between the data economy and keeping regulators at bay created a new unicorn, the chief data officer. Digital has had its own unicorn story. Adopting technology to automate, augment, and scale businesses is hard. Usher in the chief digital officer. Today, AI talent challenges inside enterprises are causing firms to assess their approach to data science while also recognizing deep data deficiencies and the impact on DevOps. The new unicorn? The AI engineer. ... Too many skills and experiences are expected from a single person. In an emerging market, to expect savants are available or findable is asking a lot. Even the digital disruptors — Amazon, Facebook, Google, or Tesla — know these roles are mythical. An early MVP for each wasn't a market-viable product; it was a proof of concept (POC). There was enough there to tell a story of what could come and what could be achieved.

Dramatic changes ahead for workplace, workforce and technology by 2025

The world of work is changing, rapidly. You don’t have to cast your mind back that far to when the World Wide Web became publicly available on 6 August 1991 — but who could have predicted the change and transformation it would herald? The internet’s eruption has catalysed the rapid change of both work and society, the business and the consumer. In this constantly morphing world we find ourselves in, the workforce, workplace and the technologies that support them will be so different by 2025 ‘that enterprises need to provide global access and ensure continuous uptime now,’ according to research carried out by OneLogin. To remain agile and relevant, enterprises must start addressing global digital transformation strategies, including unified access management. Who says? Well, the majority of 100 CIOs from companies with at least 5,000 employees.

There's a disconnect between business and security teams

39% say they want security status reports related to major business and IT initiatives. In other words, they want to understand cyber risk as it relates to end-to-end business processes, not details about Windows PCs, DNS servers, or software vulnerabilities. Cybersecurity teams need to do a better job translating geeky data into business metrics. 36% say they want to know about the status and response associated with IT audits. This isn’t a new requirement, but business people want more than intermittent reviews; they want frequent updates that help guide timely risk mitigation decisions. To satisfy this need, CISOs must strive for continuous risk management analysis. 36% say they want reports related to vulnerabilities in their environment correlated with other data. Yes, business people care about vulnerable assets, but they don’t want to see reports detailing software vulnerabilities across thousands of systems. Rather, they want to understand if mission-critical assets are vulnerable to known exploits in the wild, so they can prioritize mitigation actions such as patching systems, segmenting traffic, and restricting access.

7 ways the new California privacy law will impact all organizations

Many, if not most, of us have received emails or letters in the past from large companies saying that they had experienced an "unauthorized breach" and that "your data may have been accessed and stolen." The company further says not to worry: they are providing you with one or two years' worth of free credit monitoring, and you're welcome! Now, CA residents can immediately bring an action against the company and be awarded damages without needing to prove actual damages. And let’s not forget that this law will be a huge opportunity for attorneys filing class action lawsuits. AB 375 raises the bar for much higher security for companies collecting or in possession of California resident data. The law also will force companies to be more aware of the consumer data they are collecting and manage that data more granularly. And preparing for the new California law (as well as the just-released GDPR) will be more complicated as other states look at adopting their own privacy laws.

Quote for the day:

"Make your mistakes, take your chances, look silly, but keep on going. Don’t freeze up." -- Thomas Wolfe"

Daily Tech Digest - February 25, 2019

Harnessing RegTech to build payments reputation

The scope and complexity of regulatory reporting is significant and increasing year on year. This poses challenges for financial institutions, not only around deciphering new legislation within strict deadlines, but also 'translating' it into their process frameworks. Every legislative update results in changes to the regulatory rules imposed on financial institutions. Banks and financial institutions spend billions of dollars every year to stay compliant. Banks and PPI operators find it difficult to keep pace with the dramatic rise in regulatory requirements and compliance obligations. According to one estimate, financial institutions could end up spending 10 percent of their revenues on compliance within the next few years. An inability to contain these costs can put a severe dent in banks' profitability. Experience suggests that the inability to keep pace with the speed and scale of regulatory requirements can lead to poor governance of sensitive customer data. Often, that data is heterogeneous and not integrated.

Does Blockchain Matter Yet In Intellectual Property For Business?

For patent law, blockchain has probably been most noteworthy because of the influx of applications filed in the U.S. Patent and Trademark Office ("USPTO") to protect various uses of blockchain technology. But blockchain's strength as an unchangeable, distributed ledger is ideally suited to compiling information and lists. While its tamper-proof record can provide solid evidence of what invention may have been created, and when, the only way to get enforceable patent rights is by filing an application with the USPTO, surviving the application (or "prosecution") process, and having a federal patent issued in your name. That is the only way to get a legal monopoly. You can invent the "Next Great Thing," but if you don't patent it, it will be free for anyone to copy once it is out in public and you have missed the statutory window to get a patent. Blockchain is a great way to track all of the information regarding inventions, inventor names, ownership rights and other formalities.

Expect microservices implementation to break down IT norms

Microservices broaden the developer's responsibilities to include everything from network communications to business interactions. "J2EE developers got away with a lot," Morgenthal said. "With an opinionated platform, you'd design to the spec and deploy your file, and ops would be responsible for that entire resilient infrastructure to run it." To implement microservices, developers must understand the language, the container and how to automate that container's creation. They should perform some tests and know how network communications work and which ports to open, Morgenthal said. Database and storage demands also fall under the developer's purview, such as how to ensure separation of service and files. But before they design a single microservice, developers must grasp how the architecture all fits together. "Don't overload microservices with capabilities," Morgenthal said. "Do one thing well." Microservices should fit into appropriate business domains and be organized by business logic. Component changes should not break any dependency models upon which applications work.
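Morgenthal's "do one thing well" advice can be made concrete with a minimal sketch (not from the article): a single-purpose service that exposes only a health endpoint, built with Python's standard library. The handler class, endpoint path, and port-selection details are illustrative assumptions.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A deliberately tiny service: it does one thing (report health) and nothing else.
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), HealthHandler)  # port 0: OS picks a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.loads(resp.read())
print(payload)  # {'status': 'ok'}

server.shutdown()
```

Binding to port 0 and reading the assigned port back mirrors the point about developers now owning network details such as which ports to open.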

Modernizing the Data Warehouse

The primary DW interest is to have a full set of validated data about the business that can be used for many use cases. While a simple need statement, it touches on many difficult problems in technology, organizational dynamics, and work practices. The adjective "validated" is itself a major, long-term challenge. How do we know the data is valid? Is it valid for all reports and queries? Will it remain valid when new data is added? How can you find and correct errors? Can the technology store the breadth of data desired, parse it all, cyclically interrogate it for model-based computing or machine learning, and support ad hoc querying of multiple levels of data over multiple time frames? These are the activities needed, and they could once be done by large numbers of trained people, as with actuarial work before the widespread use of computers. For centuries, the major computational aids were logarithm tables, the slide rule, and mechanical calculators, starting in the 1600s. The first wave of data warehousing arose when computing technology shrank in size and cost from the early behemoths into feasible business tools.

Foldable phones could finally push office workers away from the PC

Part of the answer has to do with the resilience of the PC, which Bill Gates and Steve Jobs famously discussed during the D5 conference in 2007. During an interview, the pair described a world with general-purpose computing devices as well as specialized, service-dependent post-PC devices. "This general purpose device is going to continue to be with us and morph with us, whether it's a tablet or a notebook or, you know, a big curved desktop that you have at your house or whatever it might be," said Jobs. We may eventually enter the fully post-PC world that Perlow predicted in 2012, where "the majority of business professionals will be using extremely inexpensive thin notebooks, tablets and thin clients (sub $500) which will utilize any number of software technologies that run within the browser or will use next-generation Web-based APIs ...," but we're not there yet. Perlow gives us another part of the answer in a 2011 ZDNet article.

Cloud IDE review: AWS Cloud9 vs. Eclipse Che vs. Eclipse Theia

Eclipse Che is an open source developer workspace server and cloud IDE designed for teams and organizations. Che version 7, currently in beta, uses Eclipse Theia as the basis of its IDE; older versions of Che use a GWT-based IDE. Che workspaces run in containers on Docker, OpenShift, or Kubernetes. You can run Che in the public cloud or a private cloud, or install it on any operating system; it has been tested on Ubuntu and other Linux distributions, macOS, and Windows. You can also run Che in a self-service workspace hosted at https://che.openshift.io/, for which you'll need to have or create a free OpenShift or Red Hat login. In addition, Eclipse Che forms the core of Red Hat CodeReady Workspaces, the new development environment for OpenShift. Beyond being supported by Red Hat, CodeReady Workspaces has pre-built stacks with supported Red Hat technologies and includes Red Hat SSO to handle authentication and security between developer teams.

US may cut off countries that use Huawei in 5G networks

“As a supplier, Huawei cannot decide how much Huawei equipment will be deployed in the UK. Also we cannot make decision on customer’s behalf whether they should choose Huawei or not,” said Ding. “However, I can tell you the multi-operator video call we made yesterday was based on live networks.” “Also, in the past few years, we have had extensive and in-depth collaboration and innovations with operators on deployment and standards of 5G. Third, I firmly believe a 5G market without Huawei is just like the English Premier League without Manchester United.” In remarks made on social media on 21 February, Pompeo’s boss, Donald Trump, said he wanted 5G, and “even 6G” networks operational in the US as soon as possible. He urged US networking suppliers to step up their game in the race to innovate around 5G, an area where Huawei is particularly strong. Implying that he concedes Huawei does have a substantial technological lead, 

Remote Assist is just one of a handful of Microsoft-developed, Dynamics 365-branded HoloLens apps targeting this market. Others include Dynamics 365 Layout, a 3D layout app also introduced last year; Dynamics 365 Product Visualize for showcasing products in 3D, which Microsoft announced last week; and a coming Dynamics 365 Guides application for 3D training at scale. Last week, Microsoft officials also announced that a Dynamics 365 Remote Assist app will be coming to Android devices and Dynamics 365 Product Visualize app will be coming to iOS devices. What officials didn't explain at that time is how users on those mobile phones will be able to see holograms at high fidelity. This is where Azure comes into play. Microsoft is introducing two new Azure services to enable more intelligent cloud-intelligent edge scenarios. They'll initially be available as previews. Currently, holographic applications can share spatial anchors between themselves, enabling users to render a hologram at the same place in the real world across multiple devices.

Get ready for the age of sensor panic

The consternation over airplane cameras and security system microphones represents a new phenomenon I call "sensor panic." And I believe this is just the beginning. Both companies responded to the criticism with the same basic claim: The sensors were installed for some future purpose, but to date they have been inactive. The companies also seemed surprised that anyone would freak out over the existence of sensors that weren't being used for anything. But after what seem like daily reports of Facebook privacy transgressions, Russian hacking, Chinese industrial espionage, Android malware and all manner of leaks, hacks and privacy-invading blunders, we've entered a new era of public distrust of all things technological. For example, can Singapore Airlines and Google be trusted not to use their cameras and microphones, given that they couldn't be trusted even to disclose their existence? (And by the way, it's been just a couple of weeks since it was revealed that Singapore Airlines may have been secretly recording user activity on the company's iPhone app.)

There is an ongoing and significant risk to DNS infrastructure

Cryptographically signing DNS records prevents unauthorized third parties from modifying DNS entries without the private DNSSEC signing key, which is usually in the possession of the legitimate domain owner only. ICANN officials said DNSSEC would have prevented the recent DNS hijacking attacks that have made headlines in the past two months. At the start of the year, US cyber-security firm FireEye revealed a months-long campaign carried out by Iranian threat actors who hacked into web hosting and domain registrar accounts to change the DNS records of email domains belonging to private companies and government entities. These attacks -- called DNS hijacking -- allowed the crooks to redirect legitimate traffic to their own malicious servers, where they performed man-in-the-middle attacks to intercept login credentials, and then forwarded the traffic back to the legitimate email servers.
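As a toy illustration only: real DNSSEC uses public-key signatures (RRSIG records validated against a DNSKEY/DS chain), which the Python standard library cannot produce, so the sketch below substitutes an HMAC to show the core idea that a record cannot be altered without the signing key. The key and record values are invented.

```python
import hashlib
import hmac

# Toy stand-in for DNSSEC signing. Real DNSSEC uses asymmetric signatures, not a
# shared secret; HMAC is used here only to illustrate tamper-evidence.
signing_key = b"held-only-by-the-legitimate-zone-owner"  # hypothetical key

def sign_record(record: str) -> str:
    return hmac.new(signing_key, record.encode(), hashlib.sha256).hexdigest()

def verify_record(record: str, signature: str) -> bool:
    return hmac.compare_digest(sign_record(record), signature)

record = "mail.example.com. 3600 IN A 192.0.2.10"
sig = sign_record(record)

print(verify_record(record, sig))                                       # True
print(verify_record("mail.example.com. 3600 IN A 198.51.100.99", sig))  # False
```

The second check models a hijacked record: the A record's address was swapped, so the stored signature no longer verifies.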

Quote for the day:

"Every right that we enjoy has a corresponding duty to not interfere with the enjoyment of the same right by others." - Orrin Woodward

Daily Tech Digest - February 24, 2019

AI and OCR: How optical character recognition is being revitalised

OCR tools are undergoing a quiet revolution as ambitious software providers combine them with AI. As a consequence, data-capturing software is simultaneously capturing information and comprehending the content. In practice this means that AI tools can check for mistakes independently of a human user, providing streamlined fault management. But how do these tools work? The answer differs slightly depending on which AI platform you're using. One detailed case study of how AI can enhance OCR can be seen in Infrrd's work with a global investment firm. Infrrd IDC, a hybrid AI and OCR tool, was used to help manage financial reports: copying financial reports in various languages and translating them into English. To do this, Infrrd used a combination of machine learning and computer vision algorithms, which analysed document layout during pre-processing to pinpoint what information was to be recorded.

VMware’s ongoing reinvention

According to analysts, it’s been clear for some time that the server virtualization market is approaching a saturation point. Gartner reported that license revenues for x86 virtualization declined for the first time ever in the first quarter of 2016, with most enterprises reporting data-center virtualization levels of 75% or higher. And by 2017, Gartner declared the server-virtualization market so mature that it stopped doing its annual server-virtualization Magic Quadrant reports altogether. Meanwhile, the threat to VMware goes beyond companies having virtualized pretty much every workload that can be virtualized. In a bid to reduce capital expenditures and increase business agility, organizations are trying to downsize their data centers and shift existing workloads to the cloud, either on SaaS platforms or cloud infrastructure from AWS or Azure. And as companies decide to go cloud-native for all new applications, they are turning to cutting-edge approaches like containerization, micro-services and serverless computing, which don’t require a traditional VM.

How Enterprise Architects Can Contribute to Innovation

First of all, enterprise architects must take a future-oriented perspective. Innovation requires more exploration and risk-taking than architects are typically used to. In many organizations, their chief responsibility is to keep track of the complexities of the current situation, with the purpose of finding opportunities for local improvements, risk reduction or cost savings. Established architecture methods and practices are often aimed at staying in control. You also see this in the terminology used. Often, architects want to design a ‘future-proof’ architecture or system that will easily accommodate any potential future requirements. However, in a volatile environment there is no such thing as a future-proof solution. The only thing you can do is to design something for change, ensuring that change itself is, and stays, as easy as possible. There are trade-offs to consider here. If you want to maximize cost efficiency, sharing resources across your enterprise may be a good idea. 

5G-Ready Network Today Requires a Secure, Automated Cloud Architecture

5G will shatter the current 4G speed limitation, increasing throughput by up to 1,000 times, enabling 8K video applications or allowing rural subscribers to enjoy the same Internet experience as their urban counterparts. 5G will also drastically lower network application latency from hundreds of milliseconds to just a few -- we're talking single digits -- giving rise to near real-time machine-to-machine interaction currently found only in science fiction movies. Surgeries will be performed from the other side of the globe, while fatal car crashes could be virtually eliminated. Fully autonomous robotic factories could request maintenance before any failures occur, while a fleet of drones could apply pesticides to crops with surgical precision. Just as Albert Einstein pushed humanity's understanding of physics to new heights, 5G will push mankind to achieve new speeds as latency drops, and drive the number of connected devices beyond anything previously imagined.

Scaling RPA: before automating processes, improve them
According to Christopher, it appears that in the rush to adopt RPA, enterprises may not be taking an integrated approach to automation and are failing to comprehensively examine processes before they automate them. “I’ve heard lots of enterprises actually admit this,” said Christopher. “Before one of our recent roundtables on intelligent automation, I was going around the room talking to different business leaders and lots of them admitted how when they were starting out they’d look at processes and say ‘let’s just swap in some automation’, then they realise they’re just left with a different form of a worker doing the same job.” It seems enterprises are leading with a solution before identifying the problem. “Automation should be seen as an opportunity to drive dramatic process improvement,” added Christopher. This, of course, is no mean feat. Let’s say, for example, you’re a multi-national producer, and you want to improve your order to cash process. From order entry all the way through to the delivery of goods and receipt of payments, it’s a huge project. 

The Biggest Threat To Banks Isn't Fintech Or Big Tech--It's The Government

What bankers should be worried about is the government -- not fintech and Big Tech firms. Specifically, politicians who have no idea: 1) how the banking system works, and 2) what the difference between a Main Street bank and a Wall Street bank is. It's more than just potential regulatory changes that threaten banks, however (not that what some of these politicians want to propose won't be painful). The problem is that it's taken the banking industry roughly 10 years to rebuild its standing with consumers (not counting the one west coast bank that seems to do everything in its power to keep its reputation in the tank). For 10 years I've said that banks wouldn't be in the clear until a new villain came along (you probably don't remember that it was banks, on the heels of the financial crisis, who saved British Petroleum from being the most hated villain after the Gulf oil spill). With the data abuses by Facebook (the British government calls the company "digital gangsters"), and news that Amazon paid no taxes -- again -- Big Tech is becoming the new villain.

Top Fintech Trends Revamping Financial Technology

Platforms as a service, or PaaS, is one of the biggest trends to look out for in the fintech space, allowing solutions to go beyond the cloud computing arena. Companies can extend solutions out of the box and add smart customization to satisfy diverse industry needs. Functions as diverse as sending and receiving payments, advanced payment services, infrastructure building and enhancement, and unconventional new user experiences are the future of collaboration in the fintech industry. According to data sourced in late 2018 from the World Bank, India houses the second largest unbanked population in the world. This is a clear indication that the government needs to look at non-traditional channels that have the ability to drive change and impact the economy favourably. Indian fintech companies have been instrumental in creating lean cloud-based solutions to reach the masses.

Warner questions health care groups on cybersecurity

Sen. Mark Warner (D-Va.) sent a letter to several major health care groups on Thursday asking what they have done to prevent cyberattacks and how the federal government can help them address cyber issues. “The increased use of technology in health care certainly has the potential to improve the quality of patient care, expand access to care (including by extending the range of services through telehealth), and reduce wasteful spending,” Warner wrote in the letter, according to a release. “However, the increased use of technology has also left the health care industry more vulnerable to attack.” Warner, the vice chair of the Senate Intelligence Committee and co-chair of the Senate Cybersecurity Caucus, cited a Government Accountability Office report that found that more than 113 million health care records were stolen in 2015 through cyberattacks. The letter was sent to organizations like the American Hospital Association, the American Medical Association, the National Rural Health Association and the Healthcare Leadership Council.

FinTech-As-A-Service Eyes Global Payments Simplification Makeover

Only a decade and a half ago, companies still had their own data centers. Now, cloud computing has made virtual clusters of computers and storage available to those same firms, and delivery of those services is concentrated among companies like Amazon, Google and Microsoft. “Now,” said Shtilman, “nobody thinks about building this stuff out on their own. They go to these large platforms, and these platforms give you what you need, globally, in any data center — in Singapore, in Europe, in China. We believe that, five years down the road, this is what is going to happen in FinTech,” through a flexible and multi-faceted platform geared toward helping companies — small and large — offer services across geographies. At present, with multi-currency support across 65 holding currencies and 170 payout currencies, Rapyd’s fund collection offerings include cards, cash (which the CEO noted is “still king” in many countries), bank transfers and local eWallets. Fund disbursements include push-to-card and local eWallet options.

Debugging Microservices Running in Containers: Tooling Review at KubeCon NA

The Rookout team describe their breakpoint functionality as "non-breaking breakpoints", as the corresponding application execution does not actually pause or halt as it would with a traditional active debugger. They also state that "no restarts, redeployment or extra coding is required" in order to set these breakpoints, and this can lead to very quick hypothesis testing and bug detection. As a result of a Rookout breakpoint being hit within an application's execution flow, an engineer can view stack traces and global variable values, as well as specify individual variable "watches". InfoQ learned that in the case of Java debugging, the underlying mechanism that provides the breakpoint functionality is based on java.lang.instrument, which allows Java programming language agents to instrument programs running on the JVM. The instrumentation of an application is accomplished by adding a Rookout dependency to the codebase e.g. via Maven or Gradle.
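Rookout's actual mechanism is Java-specific (`java.lang.instrument`), but the idea translates. The conceptual analogue below, an assumption for illustration rather than Rookout's implementation, uses Python's `sys.settrace` to capture a snapshot of local variables each time a chosen line executes, without ever halting the program.

```python
import sys

# "Non-breaking breakpoint" sketch: record locals when a target line runs; never pause.
snapshots = []

def make_tracer(code, lineno):
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code and frame.f_lineno == lineno:
            snapshots.append(dict(frame.f_locals))  # copy the locals; don't halt
        return tracer
    return tracer

def compute(n):
    total = 0
    for i in range(n):
        total += i      # <- the "non-breaking breakpoint" targets this line
    return total

target = compute.__code__.co_firstlineno + 3  # offset of the "total += i" line
sys.settrace(make_tracer(compute.__code__, target))
result = compute(4)
sys.settrace(None)

print(result)           # 6
print(len(snapshots))   # 4 -- one snapshot per loop iteration, no pauses
```

Each snapshot shows the state just before the target line runs (the values of `i` and `total` per iteration), which is the kind of stack-and-variable view the article describes, minus the production-grade agent machinery.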

Quote for the day:

"Money can't buy happiness, but it can make you awfully comfortable while you're being miserable." --  Clare Boothe Luce

Daily Tech Digest - February 23, 2019

Why Not All FinTech Providers Are FinTech Firms

“There’s often a disconnect between what a business shows and what’s going on underneath, especially when it comes to technology. It’s almost like an iceberg: The client sees a small, nimble solution, but underneath looms a technology monolith that’s difficult to turn and slow to get around,” Thomas said. He also weighed in on the role financial technology vendors have to play in the future. “As technology companies, we have to acknowledge that we’re playing a leading role in how the industry develops. How can we expect banks to transform if they’re relying on technology dodos instead of agile, forward-thinking FinTechs?” The central tenet here, for all traditional financial services firms, including technology vendors, is recognizing the need not just for change, but a shift to a more agile approach where they can deliver products with the same speed as their FinTech competitors. “It’s no longer good enough to talk the FinTech talk; you have to be able to walk the walk,” said Thomas.

Misconfiguration Leads to Major Health Data Breach

The misconfigured database at UW Medicine was the result of a coding error when data was being moved onto a new server, a UW Medicine spokeswoman tells Information Security Media Group. The organization is not offering free credit or ID monitoring services because the exposed files contained no Social Security numbers, patient financial information or medical records, the spokeswoman says. The files contained protected health information that UW Medicine is legally required to track in order to comply with, for example, Washington state reporting requirements, the statement says. The exposed information included patients' names, medical record numbers, and a description and purpose of the information shared for regulatory reporting purposes. "The database is used to keep track of the times UW Medicine shares patient health information that meets certain legal criteria," the statement says. The most common reasons involve situations where UW Medicine is required by Washington state law to share patient information with public health authorities, law enforcement and Child Protective Services, the organization notes.

On the future of blockchain and its impact on banking

Quoting his own understanding of blockchain, Balakrishnan says, "Trade finance is the only justified use case which will give a RoI, where people will allow to be impacted, as it being a genuine problem across." ... As far as payment companies are concerned, the barrier to adopting blockchain is their legacy systems, which would have to undergo major change to shift to a newer platform; a visionary, though, would pave the way for it. The lack of standardization across organizations will ensure that adoption and change happen in the banks first, as an efficiency mechanism, and then play out in other segments. Another avenue would be consortium lending, as Balakrishnan explains: "Multiple banks can come together and look at consortium lending, with assets being clear, reducing frauds, a typical NPA story, a TPA account where money should flow through--can we build a mechanism where all of us could access it and the primary lending institution can play the role of a conveyor and the rest of the stuff is available to us."

BlackBerry acquires Cylance to cement security capability

"Today, BlackBerry took a giant step forward toward our goal of being the world's largest and most trusted AI [artificial intelligence]-cyber security company," said John Chen, executive chairman and CEO of BlackBerry. "Securing endpoints and the data that flows between them is absolutely critical in today's hyper-connected world. By adding Cylance's technology to our arsenal of cyber security solutions, we will help enterprises intelligently connect, protect and build secure endpoints that users can trust." Cylance's machine learning and AI technology is a strategic addition to BlackBerry's end-to-end secure communications portfolio. In particular, Cylance's embeddable AI technology is expected to accelerate the development of BlackBerry Spark, the secure communications platform for the internet of things. Designed for ultra security and industry-specific safety certifications, such as ISO 26262 in vehicles, BlackBerry Spark taps into the company's existing security portfolio of technology that includes FIPS-validated, app-level, AES 256-bit encryption to ensure data is always protected.

Most popular programming language frameworks and tools for machine learning

More than 1,300 people, mainly working in the tech, finance and healthcare sectors, revealed which machine-learning technologies they use at their firms in a new O'Reilly survey. The list is a mix of software frameworks and libraries for data-science favorite Python, big data platforms, and cloud-based services that handle each stage of the machine-learning pipeline. Most firms are still at the evaluation stage when it comes to using machine learning, or AI as the report refers to it, and the most common tools being implemented were those for 'model visualization' and 'automated model search and hyperparameter tuning'. Unsurprisingly, the most common form of ML being used was supervised learning, where a machine-learning model is trained using large amounts of labelled data. For instance, a computer-vision model tasked with spotting people in video might be trained on images annotated to indicate whether they contain a person.
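As a minimal, invented illustration of supervised learning on labelled data (far simpler than any real computer-vision model, but the same principle: learn the input-to-label mapping from annotated examples):

```python
# A 1-nearest-neighbour classifier "trained" on labelled examples, then used to
# predict a label for unseen input. Features here are toy 2D stand-ins for the
# kind of annotated data a vision model would consume.
labelled_data = [  # (features, label) pairs
    ((1.0, 1.0), "person"),
    ((1.2, 0.8), "person"),
    ((8.0, 9.0), "no_person"),
    ((9.0, 8.5), "no_person"),
]

def predict(features):
    def sq_dist(example):
        (x, y), _ = example
        return (x - features[0]) ** 2 + (y - features[1]) ** 2
    # Predict the label of the closest labelled example.
    return min(labelled_data, key=sq_dist)[1]

print(predict((1.1, 0.9)))  # person
print(predict((8.5, 9.1)))  # no_person
```

The "training" here is just memorising the labelled set; more capable models fit parameters instead, but both rely on the annotated labels the survey respondents describe.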

Calculating Quantum Computing's Future

The most popular approach to quantum computing uses superconducting electronic circuits, piggybacking on the foundations of the semiconductor industry. Whereas ordinary computers encode information as silicon-inscribed bits, either "zeros" or "ones," quantum computers use quantum bits, or "qubits" (pronounced cue-bits). These particles, weirdly, inhabit multiple states at once. To keep them in flux, they must be kept isolated and cold. Very, very cold. "What you're looking at is the world's most expensive refrigerator," says Bob Sutor, head of quantum strategy at IBM, while gesturing at a 20-qubit quantum computer the company unveiled in January. Despite its small size, Rigetti, founded by a physicist who previously built quantum computers at IBM, believes it can challenge the titans. The company sells a quantum computing cloud service to researchers who are racing to be the first to achieve "quantum advantage," when a quantum computer outperforms a traditional one.
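The "multiple states at once" behaviour can be illustrated with a toy, classical simulation of measuring a qubit held in an equal superposition; the amplitudes and trial count below are arbitrary choices for the sketch, not anything from the article.

```python
import random

# A qubit in the equal-superposition |+> state has amplitude 1/sqrt(2) on both
# |0> and |1>. Measurement collapses it to 0 or 1 with probability equal to the
# squared amplitude (the Born rule), here 50/50.
amp0 = amp1 = 2 ** -0.5
p_zero = abs(amp0) ** 2   # probability of measuring 0

random.seed(42)           # deterministic for the example
counts = {0: 0, 1: 0}
for _ in range(10_000):
    outcome = 0 if random.random() < p_zero else 1  # measurement collapses the state
    counts[outcome] += 1

print(round(p_zero, 3))        # 0.5
print(counts[0] + counts[1])   # 10000
```

A classical program can only sample the outcome distribution one measurement at a time; a real quantum computer manipulates the amplitudes themselves, which is where the hoped-for advantage comes from.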

Big Data, AI & IoT, Part Three: What's Stopping Us?

This series of articles has looked at the promise of Big Data, AI, and IoT, and how they all make up one ecosystem. So after looking at the benefits of these technologies in specific environments, it is worth a review of the obstacles faced before they will realize their full potential. Business leaders and media outlets alike have begun to clamor around the promise of AI, Big Data and IoT as if they are a magic bullet that will solve the world’s problems. But no technology exists in a vacuum, and the potential impact of these technologies is currently mitigated by barriers such as standardization, a lack of understanding, and unrealistic expectations at the top of many organizations. Wading through these issues is a challenge, but businesses, enterprises and governments are starting to realize that cooperation and steady progress will bring a quicker win than rushing in head first. Any new technology faces a host of issues in development and rollout, but looking at the current IoT landscape can shed some light on the challenges facing adopters of these particular emerging technologies.

Criminals, Nation-States Keep Hijacking BGP and DNS

DNS is also being abused for cyber espionage. In November 2018, Crowdstrike said it had spotted such a campaign targeting government domains in Lebanon and the United Arab Emirates. "We are naming it DNSpionage due to the fact that it supports DNS tunneling as a covert channel to communicate with the attackers' infrastructure," Crowdstrike said. In January, FireEye documented a global DNS hijacking campaign "that has affected dozens of domains belonging to government, telecommunications and internet infrastructure entities across the Middle East and North Africa, Europe and North America," possibly sponsored by Iran. As security blogger Brian Krebs has reported, one problem with attacks that utilize DNS is that few companies monitor for malicious DNS changes. Woodward says that's a problem with BGP hijacking as well. While large, well-resourced organizations may quickly spot any such hijacking, service providers in small countries may not.
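A minimal sketch (hostnames and addresses invented) of the kind of DNS-change monitoring Krebs says few companies do: resolve a name and flag any departure from a known-good baseline. A simulated resolver stands in for a live lookup so the sketch is self-contained; production monitoring would also watch NS/MX records, registrar data, and TLS certificates.

```python
import socket

def check_dns(hostname, expected_ips, resolve=socket.gethostbyname_ex):
    """Return a sorted list of unexpected IPs the hostname currently resolves to."""
    _, _, current_ips = resolve(hostname)
    return sorted(set(current_ips) - set(expected_ips))

# Simulated resolver: the second address models a hijacked record.
def fake_resolver(hostname):
    return (hostname, [], ["192.0.2.10", "203.0.113.66"])

alerts = check_dns("mail.example.com", ["192.0.2.10"], resolve=fake_resolver)
print(alerts)  # ['203.0.113.66']
```

Run periodically against a trusted baseline, even a check this crude would surface the redirections described above far sooner than waiting for users to notice.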

Post-Breach HIPAA Enforcement: A Call for 'Safe Harbors'

Among its other breach-related suggestions, CHIME also recommends "amending [HIPAA] language around the responsibilities of business associates by adding that for breaches that start with them they must bear responsibility." That includes notification of media and breach reporting to HHS. Under the current HIPAA rules, covered entities are responsible for notification of breaches by their business associates. The AHA offers similar safe harbor suggestions. "Despite complying with HIPAA rules and implementing best practices, hospitals and healthcare providers will continue to be the targets of sophisticated cyberattacks, and some attacks will inevitably succeed," AHA writes. Whether exploiting previously unknown vulnerabilities or taking advantage of an organization with limited resources, attackers will continue to be successful, AHA notes. "The AHA believes that victims of attacks should be given support and resources, and enforcement efforts should rightly focus on investigating and prosecuting the attackers," AHA writes.

Uber Open-Sources Ludwig Code-Free Deep-Learning Toolkit

Ludwig is built on top of Google's TensorFlow deep-learning library. There are other "wrappers" of TensorFlow that provide friendly interfaces, such as Keras or Gluon. However, these still require users to define their neural networks by writing code (usually Python). Ludwig pre-packages a large number of popular deep-learning patterns, which can be combined and configured using a YAML file. A large class of deep-learning solutions for vision and speech problems follow an "encoder/decoder" pattern. In this pattern, the input is converted from raw data into a tensor representation, which is then fed into one or more layers of a neural network. The layer types depend on the input data. For example, image data is often fed into a convolutional neural network (CNN), while text data is fed into a recurrent neural network (RNN). Likewise, the output of the network is converted from tensors back into output data, often passing through RNN layers (if the output is text) or some other common layer type.
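A configuration in that encoder/decoder style might look roughly like the sketch below. The field layout follows Ludwig's documented YAML conventions, but the feature names are hypothetical and details may differ between Ludwig versions, so treat it as illustrative rather than a verified config.

```yaml
# Hypothetical Ludwig-style model definition: an image encoder feeding a
# category decoder, with no model code written by hand.
input_features:
  - name: image_path
    type: image
    encoder: stacked_cnn   # CNN encoder for image inputs

output_features:
  - name: class
    type: category         # softmax decoder over class labels
```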

Quote for the day:

"The test we must set for ourselves is not to march alone but to march in such a way that others will wish to join us." -- Hubert Humphrey