Daily Tech Digest - March 03, 2019


You need to methodically identify the possible risks that could face your start-up. You might want to think outside the box for this one, because anything could, and will, happen, so you had better have an answer for every risk you identify. Next, assess the likelihood that each risk will materialize and understand how to respond; you might rank them in a list from most likely to almost impossible. Unless you live in Tornado Alley, you may consider a natural disaster a long shot. Arlene Dickinson, the Canadian Dragon’s Den maven, may disagree. She went to work one day after a brutal storm to find that the basement of her tony Toronto office had water damage. No, this wasn’t a leaky roof: the entire basement, where she stored client records, computers, tax documents and more, was filled to the ceiling with smelly, dirty water. Everything was ruined, though she did have insurance. Once you can see the types of risks you face, you will have to come up with your own way to mitigate them.


No single central entity stores and processes payments or manages the admission of new units into the database, thus ensuring freedom of access. This is the key difference between DAG and its predecessors. In centralized systems, only one party was allowed to add transactions to the ledger, while in blockchains, only a select few - the miners - are allowed to do it. In DAG, everybody is allowed to write to the ledger. DAG also improves speed and throughput: instead of one single chain of blocks, data can be added to any number of parallel, interconnected “lanes.” One might think it would be challenging to keep the ledger consistent while everybody writes to it at the same time, and that is right. This is what consensus algorithms are for, and it is currently an area of active research involving some of the brightest minds. Simplified, the intuition behind Obyte’s consensus algorithm is as follows: when a user adds a new transaction, it is placed on the ledger together with the addresses of twelve witnesses.
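The parent-referencing structure described above can be sketched in a few lines of Python. The `Unit` and `DagLedger` names and the twelve hard-coded witness addresses are illustrative only, not Obyte's actual data model:

```python
# Illustrative sketch of a DAG ledger: each new unit references earlier
# "parent" units (the current tips) instead of being appended to a single chain.
import hashlib

class Unit:
    def __init__(self, payload, parents, witnesses):
        self.payload = payload        # the transaction data
        self.parents = parents        # hashes of earlier units (the DAG edges)
        self.witnesses = witnesses    # witness addresses attached to the unit
        self.hash = hashlib.sha256(
            (payload + "".join(parents) + "".join(witnesses)).encode()
        ).hexdigest()

class DagLedger:
    def __init__(self, witnesses):
        self.units = {}
        self.tips = set()             # units nobody references yet
        self.witnesses = witnesses

    def add(self, payload):
        unit = Unit(payload, sorted(self.tips), self.witnesses)
        for parent in unit.parents:   # referenced tips stop being tips
            self.tips.discard(parent)
        self.units[unit.hash] = unit
        self.tips.add(unit.hash)
        return unit

ledger = DagLedger(witnesses=[f"WITNESS_{i}" for i in range(12)])
a = ledger.add("alice pays bob 5")
b = ledger.add("bob pays carol 2")    # b references a as its parent
```

Because every writer extends whatever tips it currently sees, many "lanes" can grow in parallel; the consensus algorithm's job is then to agree on a total order over the resulting graph.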


Dongle Danger: Operating Systems Don't Defend Memory
The weaknesses, collectively called Thunderclap, highlight a new class of threats posed by malicious peripherals. The research has been in the works since 2016, and Apple is one of several vendors that have issued software updates as a result. The work focused on the Thunderbolt 3 data transfer standard over USB Type-C connectors. Although operating systems are supposed to only allow a peripheral to have direct memory access for the resources it needs, researchers found that this defense isn't implemented effectively to prevent data theft. The research also covered PCI Express, or PCIe, an older set of device connection and data transfer protocols. Stealing data this way would require physical access to a device. "The combination of power, video and peripheral-device DMA over Thunderbolt 3 ports facilitates the creation of malicious charging stations or displays that function correctly but simultaneously take control of connected machines," the researchers write in a blog post.



The approach has the potential not just to diversify tech but to help “techify” everything else, said Megan Smith, former CTO for the Obama administration: “We could really work on ... the hardest problems together in this collaborative way.” Faculty at the new college will work with other MIT departments to cross-pollinate ideas. Classes will also be designed so that technical skills, social sciences, and the humanities are bound up together within each course rather than learned separately. “It’s not just thinking about how you learn computation,” Melissa Nobles, the dean of MIT’s School of Humanities, Arts, and Social Sciences, told MIT Technology Review after the main-stage event, “but it’s also students having an awareness of the larger political, social context in which we’re all living.” This has also been my driving mission with MIT Technology Review’s AI newsletter, The Algorithm: to dismantle our outdated notions that technology is for the tech people and social problems are for the humanities people; that there is such a thing as a “math person,” which is certainly not the same as a “people person.”



Psychologists and researchers have developed a systematic approach for discovering a sustainable solution to any problem. This technique, commonly referred to as the problem-solving cycle, starts with identifying the problem. After all, there could be multiple issues within one situation, and you could be focusing on the wrong one. Separate the symptoms from the cause. After defining the problem, form a strategy. This will vary depending on the situation and your preferences, but develop wide-ranging ideas while taking into consideration your resources. Are the solutions feasible? Come up with multiple ideas to have options. Organize your information: What do you know -- or not know -- about the problem? By collecting as much information as possible, you increase your chances of achieving a positive outcome. Once you settle on a solution, monitor its progress. The solution you developed should be measurable so you can assess whether it is achieving its goal. If not, you may need to implement an alternative strategy.


Kimball noted that when they started the company, they weren't yet sure where CockroachDB would fit into the ecosystem, or what kinds of companies would be willing and able to move to a new RDBMS. He went on to add, however, that in 2018 they began to answer those questions and ended with an impressive first year of revenue: "It turns out that much of the Fortune 2000 is struggling with often board-level mandates to embrace the benefits of the public cloud. That modernization process opens the door to consideration of alternatives to Oracle, especially databases better suited to exploiting the opportunities inherent in the cloud. Where CockroachDB has a big strategic advantage over the likes of AWS Aurora or Google Cloud Spanner is that we offer a bridge from the reality of existing on-premise deployments to the desired outcome of using the public cloud wherever it makes sense. CockroachDB can be run on-premise, hybrid, and across arbitrary cloud vendors."


“First, data is a rich source of insights and discovery about any domain, to understand deeper the things that we already know about the domain and to discover new things that we did not know about it. Second, data is the fuel (the essential input) for interesting algorithms and models that can be used to help predict the future, to optimize outcomes, to reveal emerging trends, and to detect anomalies, sometimes before they happen. Third, data is sensory input to our natural human activities of pattern detection and pattern recognition that become the basis for nearly all human decisions and actions as we move forward through our world. Fourth, data are measurements that encode knowledge – as such, data delivers a wonderful very human challenge to us to decode that knowledge and consequently to become smarter and wiser about people, processes, events, and all things. Finally, data ignites innovation, transformation, and value creation in all organizations and businesses through pattern exploration and pattern exploitation within the digital signals that flow all around us.”


Dow Jones Data Exposed on Public Server
Bob Diachenko, an independent security researcher, discovered that an Amazon Web Services-hosted Elasticsearch database exposed the records, TechCrunch first reported. The exposed data, which has since been secured, is Dow Jones' Watchlist database, which companies use as part of their risk and compliance efforts. Dow Jones says in a statement that "an authorized third party" was to blame for the exposure, but it did not name the company. Dow Jones declined to provide further details on the incident. Security researchers say the incident highlights the need for adequate vendor risk management. A recent Verizon report found that one of every two data breaches stems from third-party risks. Too many organizations focus on protecting their own IT infrastructure, ignoring the security of data handed over to a third party, security experts say. "This becomes a major issue because you are as vulnerable as your vendor managing your data," says Edwin Lim, director of professional services - APJ, at Trustwave, a Singtel company.



By creating a seemingly innocent application that holds a malicious exploit script, potential attackers can dupe users when the app asks for permission to access the external storage. A typical user is likely to approve the request, enabling the attacker to manipulate the data written on that storage and used by multiple applications. App development guidelines urge developers not to have their apps store sensitive code in the external storage, though our researchers found that many apps, including Google Translate, did not heed this advice. However, while security-related guidelines are great, frankly, it’s naïve to expect every developer in the world to have security top of mind when they write their code, let alone to have enough expertise to get it right. Google patched their applications that were affected by this particular vulnerability, as responsible companies do, but it goes to show that identifying just one entry point is enough to keep attackers in business.
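As a rough illustration of why writable shared storage is dangerous, here is a hypothetical app-side mitigation sketch: anything read back from external storage is treated as untrusted and checked against a digest the app kept in its private storage. The function names and payload are invented for the example and are not from the research described above:

```python
# Hypothetical mitigation: verify shared-storage data against a checksum
# recorded in app-private storage before using it.
import hashlib

def record_checksum(data: bytes) -> str:
    """Compute the digest to keep in private, app-only storage."""
    return hashlib.sha256(data).hexdigest()

def load_if_untampered(data: bytes, expected_digest: str) -> bytes:
    """Refuse to use shared-storage data whose digest no longer matches."""
    if hashlib.sha256(data).hexdigest() != expected_digest:
        raise ValueError("shared-storage file was modified; refusing to load")
    return data

original = b"config fetched and cached by the app"
digest = record_checksum(original)        # stored privately at write time
safe = load_if_untampered(original, digest)
```

If another app with external-storage permission modifies the cached file, the digest check fails and the tampered data is never loaded.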



The only thing constant is change. No matter the size of an enterprise, over the last few decades businesses across the world have witnessed tremendous changes in the ways they operate and run. One of the major factors influencing this change is, undoubtedly, the technological explosion in all aspects of our lives. From the gigantic machinery that man started out with, with its heavy knobs and loud motors, to sleek tablets and microchips with tremendous computing and processing power, the application of science and technology in business has brought about drastic changes, mostly for the better. Since technology stepped into business, a flood of software, programs, applications, and interfaces designed exclusively for businesses to collaborate with teams, manage data, and derive insights from sales has been reaching the market. Today, we have a new class of businesses called ‘digital businesses’ that rely heavily on the Internet for their everyday functioning.



Quote for the day:



"A leader must have the courage to act against an expert’s advice." -- James Callaghan


Daily Tech Digest - March 02, 2019


In the wildest dreams of enthusiasts, these devices will be a gateway to something called the decentralized web, or “Web 3.0.” In this future version of the internet, blockchains and similar technologies would support decentralized applications—“dapps”—that look and feel like the mobile apps we use today but run on public, peer-to-peer networks instead of the private servers of big tech companies. It’s widely thought that a major impediment to mainstream adoption of cryptocurrency and dapps is that these technologies are too difficult to use for people who are not especially tech savvy. Better user experiences, starting with cryptographic key management, could change that. But getting there is not straightforward, given that key security is paramount: you lose your keys, you lose your assets. This also explains why Ethereum creator Vitalik Buterin seems so excited about one particular feature of HTC’s Exodus 1, called social key recovery. Essentially, users can choose a small group of contacts and give them parts of their keys.



Q&A with Dominic Harvey, director at CWJobs: Plugging the tech skills gap

The tech sector is notorious for experiencing high staff turnover. Tech workers regularly switch jobs to climb the career ladder or have the chance to get their hands on the latest equipment, in turn taking their training and specialist skills with them. This can be seen as problematic for companies across the board, but the spreading out of great talent and important skills throughout the sector means an overall boost to the UK’s tech capability. Bigger companies should also seriously consider tech apprenticeship schemes as a long-term solution to addressing skill shortages, with this presenting the opportunity to train young people in the sector and the company from the ground up. So rather than viewing this trend as negative, businesses need to accept this is part of the industry and use it as an opportunity to equip the next generation of talent. I would encourage tech firms to focus, not just on investing in the latest equipment, but in creating a culture that sees employees leave on good terms, with the potential for them to return with an even greater skill set than when they left.


New chemistry-based data storage would blow Moore’s Law out of the water

Ultra-miniaturization, using chemistry and its molecules and atoms, has been on the scientific community’s radar for a while. However, it’s been rocky—temperature has been a problem, among other things. One big issue, which may be about to be solved, is related to controlling flowing electrons. The flowing current, acting like a wave, gets interfered with—a bit like a water wave. The trouble is called quantum interference, and it is an area in which the researchers claim to be making progress. Researchers want to get a handle on “not only measuring quantum phenomena in single molecules, but also controlling them,” says Nongjian "NJ" Tao, director of ASU's Biodesign Center for Bioelectronics and Biosensors, in the article. He says that by understanding charge-transport properties better, they’ll be able to develop new, ultra-tiny electronic devices. If successful, data storage equipment and the general processing of information could end up operating through high-speed, high-power molecular switches. Transistors and rectifiers could also become molecular scale. Miniaturization-limiting silicon could be replaced.


Create an IT support process to take on any outage


Issue management platforms can initiate post-mortems as a follow-up action to a major or critical issue. The tool supplies a detailed log of the incident response timeline and actions/results for review. Post-mortems focus on root causes rather than proximate causes. Proximate causes are the reasons or triggers that started the issue. A root cause is the central fault that, if corrected, could prevent all such incidents. For example, an application throws an error because its volume runs out of storage. The application error is the proximate cause of the issue, but the root cause is a lack of monitoring of logical unit number (LUN) usage and remaining capacity. The post-mortem evaluation might result in new storage monitoring that triggers an alert when the LUN hits 85% full. With that fix in place, administrators can add storage before an application error ever occurs. Similarly, a post-mortem could inform a decision to upgrade systems or software.
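The root-cause fix described above can be sketched as a simple capacity check; the threshold, LUN name and capacity figures are illustrative:

```python
# Sketch of the post-mortem outcome: alert when a LUN crosses 85%
# utilization so storage can be added before the application errors out.
ALERT_THRESHOLD = 0.85

def check_lun(name, used_gb, capacity_gb, threshold=ALERT_THRESHOLD):
    """Return an alert string once utilization reaches the threshold."""
    utilization = used_gb / capacity_gb
    if utilization >= threshold:
        return f"ALERT: {name} at {utilization:.0%} - add storage now"
    return f"OK: {name} at {utilization:.0%}"

print(check_lun("lun-app01", used_gb=870, capacity_gb=1000))
# fires at 87%, well before the volume actually fills
```

A real deployment would feed this check from the storage array's monitoring API and route the alert into the issue management platform, but the principle is the same: catch the root cause (capacity exhaustion) before it produces the proximate cause (the application error).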


Equifax CTO: 3 keys to leading culture change

Tech leaders need to recognize, and embrace, that the biggest barrier to technology plans isn’t whether the technology will work; it’s the organizational culture. Ten or 15 years ago, the danger was in the details of knowing whether or not Vendor A could interoperate with Vendor B. Today, we don’t have those same technology barriers; instead, people and cultures have become our largest challenges, regardless of the size of the company or industry. Encouraging teams to think and act differently requires an invitation to have them join you in a different way of working. Whether or not they accept that invitation is up to them, but it’s up to you to make decisions based on their response. Change must occur: people either want to change or you need to change the people. Look at your team and your people, your own cadence, your own style, and your own words. We have to be intentional about the way we work, the way we talk, the way we interact and the types of people we hire.


Node.js JavaScript vs PHP: Which programming language is winning over developers?


The growing popularity of Node.js JavaScript was captured by the 2018 Node.js User Survey, which also shed light on how the language is being used. "Node.js continues to see its popularity grow on every continent and in a very broad set of use cases due to its flexibility and utility for a wide variety of use cases," it stated, with web apps being the most popular use case, followed by enterprise apps. Of course, the problem with the surveys above is that they canvass developers who work primarily with JavaScript, and who as a result may be more likely to choose Node.js at the backend. It's also not necessarily the case that firms wholly use a single back-end language. Organizations may use Node.js JavaScript for some sites and services and PHP on servers supporting other sites. That dual use is borne out by the surveys, with almost one third of developers saying they used PHP alongside Node.js in the 2018 Node.js user survey, while respondents to the Vue.js survey also reported using a variety of languages at the backend.


ROX Is the New ROI: Prioritizing Customer Experience


PwC’s 2019 Global Consumer Insights Survey — the results of which will be published soon — highlights the need to focus on both ROI and ROX. For example, we asked more than 21,000 consumers in 27 territories around the world what they thought was the most influential type of advertising. About 35 percent said traditional TV ads, the highest percentage among all of the choices. This might seem like good news for the world’s biggest companies, because that’s where they still spend the bulk of their advertising dollars. But dig a little deeper, and you’ll see the desire for experience staring us right in the face. ... There are other ways you can balance your focus on ROX versus ROI, including with your physical retail space. Retailers, banks, and auto dealerships invest enormous amounts of resources and time in their stores, branches, and showrooms. Yet, in today’s harried world, where so much product research is done online by consumers, creating the best customer experience is often more about getting patrons in and out as quickly and efficiently as possible.


Data: if it’s the next oil, is it renewable or toxic?

It all boils down to privacy. Data has the potential to support the discovery of new medical treatments. It could transform healthcare for the better — and it is hard to find anyone who would not be in favour of that. But at what price? Regulators seem to have decided that in some cases the price is too high. ... The EU’s GDPR and other privacy regulations being rolled out across the world in countries like Canada, Japan and Brazil are an attempt to ensure we get the benefits of data without the penalty of lack of privacy. But GDPR does not always work. How often do you throw your hands up in frustration because you have to read and agree/disagree with privacy policies and opt-in requests, just to get a tiny piece of information? It sometimes takes longer to read the disclaimers and other compliance inspired literature, than get the actual information you need. According to Sarah Burnett, Executive Vice President and Distinguished Analyst at Everest Group: “Organisations are confusing their ability to share data internally between departments.”


Teen becomes first millionaire through HackerOne bug bounties


According to bug bounty pioneer and CEO of Luta Security, Katie Moussouris, although targeted bug bounties have a role to play in cyber security, they are not a “silver bullet”, and run the risk of wiping out talent pipelines if poorly implemented, by providing incentives for people with cyber security skills to work outside organisations in pursuit of bounties. Lopez said he was proud to see his work recognised and valued. “To me, this achievement represents that companies and the people that trust them are becoming more secure than they were before, and that is incredible. This is what motivates me to continue to push myself and inspires me to get my hacking to the next level,” he said. Lopez is a top-ranked hacker on HackerOne’s leaderboard, out of more than 330,000 hackers competing for the top spot. His specialty is finding insecure direct object reference (IDOR) vulnerabilities.
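An insecure direct object reference of the kind mentioned above can be illustrated in a few lines; the invoice store, user names and function names are hypothetical, not from any real report:

```python
# Minimal sketch of an IDOR flaw and its fix. An IDOR occurs when an
# endpoint looks up a record directly by a client-supplied id without
# checking that the caller is authorized to see that record.
invoices = {
    101: {"owner": "alice", "amount": 250},
    102: {"owner": "bob",   "amount": 990},
}

def get_invoice_insecure(invoice_id, current_user):
    # IDOR: any authenticated user can read any invoice by guessing ids
    return invoices[invoice_id]

def get_invoice_secure(invoice_id, current_user):
    invoice = invoices[invoice_id]
    if invoice["owner"] != current_user:   # ownership check closes the hole
        raise PermissionError("not your invoice")
    return invoice
```

In the insecure variant, `get_invoice_insecure(102, "alice")` happily returns Bob's invoice; the secure variant raises `PermissionError` instead, which is the behavior a bug hunter probes for.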


The big picture: Is IoT in the enterprise about making money or saving money?

Basically, we’ve got about a third of companies hoping to save some money, another third looking for new revenue from increased production, monetizing data or creating product-as-a-service offerings, and the last third expecting a little of both. That could mean that the IoT offers something for everyone, solving whatever problems a company might face, which is how Seth Robinson, senior director for technology analysis at CompTIA, pitched the results in a statement: “This recognition that IoT is not simply a tool for cost savings, but a potential source of new revenue, is mirrored by our finding that for a majority of companies, funding for IoT projects often comes from places other than the IT department. This demonstrates not only the importance of IoT to future strategy, but the company-wide impact IoT tends to have.” In other words, the IoT fights crime and cures cancer! It’s a deodorant that doubles as a floor wax!



Quote for the day:


"Many people read History books but it takes just a few people to LEAD the cause that will shape the course of HISTORY." -- Fela Durotoye


Daily Tech Digest - February 28, 2019

Risk Based Security, the private operator of competing database VulnDB, aired its grievances with the public CVE/NVD system in its 2018 Vulnerability Trends report, released Wednesday, with charged conclusions including "there is fertile grounds for attorneys and regulators to argue negligence if CVE/NVD is the only source of vulnerability intelligence being used by your organization," and "organizations are getting late and at times unreliable vulnerability information from these two sources, along with significant gaps in coverage." This criticism is neither imaginative nor unexpected from a privately owned competitor attempting to justify its product. In fairness to Risk Based Security, there is a known time delay in CVSS scoring, though the report overstates the severity of the problem, as an empirical research report finds that "there is no reason to suspect that information for severe vulnerabilities would tend to arrive later (or earlier) than information for mundane vulnerabilities."



Will Digital Banking and Cashless Economies Lead to Chaos?


Digital banking systems may be ‘convenient,’ but there is little doubt that they often fail, with the consequences of a failure being significant. On June 1, 2018, shoppers in the United Kingdom were left stranded, unable to make purchases with their Visa cards. The outage, which lasted for several hours, caused significant disruption and exemplified the problems of monopolized reliance on digital infrastructure. In another example, TSB, a leading British retail and commercial bank, recently faced scrutiny for its mishandling of the migration of its digital infrastructure, which left thousands of customers unable to access their online and mobile banking accounts for up to five days. According to the Financial Conduct Authority, financial institutions in the United Kingdom have reported a 138 percent increase in technology outages and an 18 percent increase in “cyber incidents” this year to date. ... As transactions move online, the amount of data available about one’s finances and purchasing habits increases. Does the current digital infrastructure have appropriate safeguards to protect against data breaches?



Most AI developers are now ultimately directed towards a basic goal: they are charged with the responsibility of building AI models that can aptly substitute for direct human effort. This need comes in recognition of the inadequacies of human labor, which is characterized by inaccuracy, inefficiency and other failures. For example, artificial intelligence has been cited as having the potential for more accurate medical practice. Thus, you can expect a more accurate surgical procedure using this framework than is currently available from most humans. Hence, we can say that the opposites of the inadequacies of human effort are precisely the benefits of artificial intelligence to our world. However, even though work is ongoing to significantly extend the usefulness of this technology, truly significant achievements are yet to come. AI is all around us, but oftentimes we don’t notice it. For instance, Facebook uses AI technology for its image recognition.



The FTC Probably Doesn't Need A New 'Big Tech' Task Force. It Just Needs To Do Its Job

While there are certainly plenty of solid complaints to be made about "big tech" giants like Facebook and Google (especially on the privacy front), it's also pretty obvious that a lot of the recent criticisms of "big tech" aren't being made in good faith. Claims of "censorship" of conservatives, for example, usually don't hold up under scrutiny, and are often driven by folks who wouldn't be facing these problems if they didn't behave like legendary assholes on the internet in the first place. Similarly, a lot of the recent criticism of big tech is coming from telecom giants eager to saddle Silicon Valley giants with unnecessary regulation in a bid to hoover up a bigger slice of the online advertising pie. On the one hand, telecom giants like AT&T and Verizon just got done convincing the FCC to effectively neuter itself, leaving any remaining oversight in the lap of an FTC they know won't (and often can't) hold them accountable.


How To-Do is integrating with more and more of the Microsoft ecosystem


To-Do integration is much simpler than these older ways of connecting tasks to Outlook. Sign in to both Outlook (or Outlook.com) and To-Do with the same account, and tasks and the lists you organise them into will just sync between the two tools. (If it's a Microsoft account, it has to use outlook.com, not a Yahoo or Gmail email address.) You can create tasks and mark them as complete in either app, and drag tasks from one list to another in either app. Even the emoji you can use in list names to customise the To-Do icons appear in Outlook. Under the covers, To-Do is rather like a specialised viewer for Outlook and Exchange tasks, although it doesn't support all the Outlook task features. You can only have one due date, rather than separate start and end dates; task statuses like in-progress or 25 percent complete, and details like mileage, won't show up in To-Do; and you can't set task work hours, choose different priority levels or assign an Outlook category.


A government perspective: Tech Trends 2019

Many public organizations are finding that each individual advancement in technology—for example, blockchain, or digital reality, or serverless architecture—is powerful, but that the real power emerges when they combine. Finding jobs that new technologies can do is a first-level challenge. Finding ways to integrate a constellation of new technologies into a new operational paradigm is the next-level challenge that’s unfolding right now. Public-sector organizations have much to learn from each other. They can draw useful lessons from their counterparts in private enterprise, and indeed from other nations. Each agency is on a path toward greater digital adoption, but they’re at different places on that journey. What do they have in common? A commitment to mission-driven service.


It’s time to start some serious research into the ethics of AI


There was a general view among panellists that the need for more AI ethics research should not be read as a need for more regulation. Elisabeth Ling, managing director of researcher products and research metrics at Elsevier, said that among members of the European Commission’s high-level expert group for AI - of which she is a member - the ethics debate is, “hard and hot.” However, “There seems to be a consensus that jumping to regulation would not be the right thing to do,” she said. “We already have quite strong laws in place in Europe.” It is important to distinguish between regulating algorithms and regulating the way they are used, said Nick Jennings, vice provost for research and enterprise at Imperial College London. In the former case, “I can’t think of a sensible way in which that would make sense,” he said. But, “when [algorithms have] been trained and have data inside them and they can make decisions and they’re being used in a given application, then it is a different business.”


Why the industrial IoT is more important than consumer devices

“The edge is basically any place — a wind farm, a factory — where data is generated, analyzed, and largely stored locally,” Nelson said. “Wait? Isn’t that just a data center? Sort of. The difference is the Internet of Things.” His point is that most of the vast amounts of data that is machine-generated doesn’t need to go very far. “The people who want it and use it are generally in the same building,” he noted, quoting Gartner’s prediction that more than 50 percent of data will be generated and processed outside traditional data centers — on the edge — although “snapshots and summaries might go to the cloud for deep analytics.” But Nelson wasn’t sure about what kind of edge architectures would prevail. The edge might function like an interim way station for the cloud, he noted, or we could see the emergence of “Zone” networking — edges within edges — that can conduct their own analytics and perform other tasks on a smaller, more efficient scale.
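Nelson's summaries-to-the-cloud pattern can be sketched as a local aggregation step; the sensor readings and snapshot fields below are illustrative:

```python
# Edge pattern sketch: raw readings are generated, analyzed, and stored
# locally, while only a compact summary ("snapshot") is shipped to the cloud.
def summarize(readings):
    """Reduce locally generated sensor data to a small cloud-bound snapshot."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [20.1, 20.4, 19.8, 21.0, 20.7]   # stays on the edge
snapshot = summarize(raw)              # only this goes upstream for deep analytics
```

The bandwidth saving is the point: five (or five million) readings stay local, and a fixed-size summary travels, which is why so much machine-generated data never needs to leave the building.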


VMware offers pure open-source Kubernetes, no chaser


Unless you've been hiding under a rock in the IT world, you know Kubernetes, the container orchestration program of choice, is hotter than hot. Everyone's using it, adding on to it, offering it as a service; the list goes on and on. But VMware wants you to know that, if all you want is Kubernetes without all the fancy trimmings, well, it can give you that, too, with VMware Essential PKS. Essential PKS includes upstream Kubernetes; reference architectures to inform design decisions; and expert support to proactively guide you through upgrades or maintenance and help you troubleshoot it if you need a hand. That's all. That's it. If that sounds familiar, well, it should. Last November, VMware acquired Heptio. This company, which was founded by two Kubernetes creators, Joe Beda and Craig McLuckie, used essentially this business model. Indeed, you could argue that VMware Essential PKS is just a new coat of paint on Heptio's previous offerings.


Monitoring and Managing Workflows Across Collaborating Microservices


In its essence, orchestration for me means that one service can command another to do something. That's it. That's not tighter coupling; it is just coupled the other way round. Take the order example. It might be a good idea that the checkout service just emits an order placed event but does not know who processes it. The order service listens for that order placed event. The receiver knows about the event and decides to do something about it; the coupling is on the receiving side. It is different with the payment, because it would be quite unnatural for the payment service to know what the payment is for. But it would need that knowledge in order to react to the right events, like order placed or order created. This also means it would have to be changed whenever you want to receive payments for new products or services. Many projects work around this unfavorable coupling by issuing payment required events, but these are not true events, as the sender wants somebody else to do something.
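The distinction can be sketched as follows; the in-memory event bus, service names and order payload are invented for the example. With choreography, the publisher does not know its subscribers; with orchestration, the caller commands a specific service:

```python
# Choreography vs orchestration sketch using a toy in-memory event bus.
handlers = {}   # event type -> list of subscriber callbacks

def subscribe(event_type, handler):
    handlers.setdefault(event_type, []).append(handler)

def publish(event_type, data):
    # Choreography: checkout emits "order_placed" without knowing who listens;
    # the coupling lives entirely on the receiving side.
    for handler in handlers.get(event_type, []):
        handler(data)

def payment_service_collect(order):
    # Orchestration: the order service commands the payment service directly,
    # so the payment service needn't know about every order-related event.
    return f"payment collected for order {order['id']}"

processed = []
subscribe("order_placed", lambda order: processed.append(order["id"]))
publish("order_placed", {"id": 42})          # event: sender is ignorant of receivers
result = payment_service_collect({"id": 42}) # command: sender picks the receiver
```

Adding a new subscriber to `order_placed` requires no change to the publisher, whereas the payment command is an explicit, directed request, which is exactly the "coupled the other way round" point above.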



Quote for the day:


"He who cannot be a good follower cannot be a good leader." -- Aristotle


Daily Tech Digest - February 27, 2019

Debunking five myths about process automation

Debunking five myths about process automation image
Software solutions can be costly, nearly clearing out entire IT budgets in one fell swoop and running up tabs for maintenance and service fees down the road. Often the change in productivity is not nearly enough to compensate for the forfeited capital, leaving management with a disappointing return on investment. Thankfully, not all systems carry a hefty price tag. Cloud-based apps have disrupted the pricing scale with rates so low even small non-profit organizations can automate workflows without breaking the budget. SaaS models and cloud storage enable low, flexible monthly rates and only a small one-time start-up fee. ... Automation systems offering no-code platforms solve multiple problems at once. Since data is securely stored in the cloud, server space isn't required. The tedious responsibilities of creating new versions, fixing bugs, and maintaining the software fall on the service provider and don't incur additional expenses for the user; it's all neatly packed into the monthly rate.


How to move to a disruptive network technology with minimal disruption

rotating square frames around a bridge path / future / change / transformation
Start with the open source and open specification projects, suggests Amy Wheelus, network cloud vice president at AT&T. For the cloud infrastructure, the go-to open source project is OpenStack, with many operators and different use cases, including at the edge. For the service orchestration layer, ONAP is the largest project in open source, she notes. "At AT&T we have launched our mobile 5G network using several open source software components, OpenStack, Airship and ONAP." Weyrick recommends "canarying" traffic before relying on it in production. "Bringing up a new, unused private subnet on existing production servers alongside existing interfaces and transitioning less-critical traffic, such as operational metrics, is one method," he says. "This allows you to get experience deploying and operating the various components of the SDN, prove operational reliability and gain confidence as you increase the percentage of traffic being transited by the new stack."


Wearable technology in the workplace and data protection law


Wearable technology is not always quite as extreme, with many employees reaping the benefits of fitness bands and smart watches. Wearable technology can also be used to help keep employees safe. For example, Oxfordshire County Council recently announced that waste recycling teams will be fitted with body cameras to deter physical and verbal abuse from the public. Whatever the technology, there will always be arguments for and against the introduction of workplace accessories, with the importance of wellbeing, safety and productivity, balanced against the adverse costs, legitimate privacy concerns, risks of discrimination and potential staff morale issues. However, given the breadth of personal data the technologies are likely to obtain, and the real risk of over-collection or that the data is used for an illegitimate purpose, the biggest adversary for wearable technology in the workplace is likely to be data protection law.


What is BDD and What Does it Mean for Testers?

The BDD approach favors system testability. I dare say this scheme works better with a microservice architecture than with monolithic systems, because the former allows adding and working on all layers independently, per feature. This scheme facilitates testing because we think about the test before writing any line of code, thus providing greater testability. It reminds me of the Scrum Master course I took by Peregrinus, where the instructor mentioned that in Agile, the important thing is to know how to "cut the cake." Think of a multi-layered cake. Each layer adds a new flavor that is to be combined with the other layers. If the cake is cut horizontally, some people may only get the crust, some a chocolate middle layer, another a layer of vanilla, or a mix of the two, etc. In this scenario, no one would get to taste the cake as it was fully meant to be enjoyed, and no one could say they actually tasted the cake.
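The "test before any line of code" idea can be illustrated with a Given/When/Then scenario written as a plain test. This is a generic sketch with a hypothetical `Cart` feature, not code from the article; BDD tools typically express the same structure in a dedicated scenario language, but the shape is the same.

```python
# A BDD-style scenario: behavior is specified first as Given/When/Then,
# and the production code (here, Cart) is written to make it pass.
class Cart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def test_adding_an_item_updates_the_total():
    # Given an empty cart
    cart = Cart()
    # When the customer adds a book priced at 10
    cart.add("book", 10)
    # Then the total reflects the new item
    assert cart.total() == 10

test_adding_an_item_updates_the_total()
```

Because each scenario describes one vertical slice of behavior, it cuts the cake top to bottom rather than testing one horizontal layer in isolation.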


The danger of having a cloud-native IT policy

The danger of having a cloud-native IT policy
The core benefit is simplicity. Because you're using only the native interfaces from a single cloud provider, there is no heterogeneous complexity to worry about for your security system, database, compute platforms, and so on; they are uniform and well-integrated because they sprang from the same R&D group at a single provider. As a result, the cloud services work well together. A cloud-native policy limits the cloud services you can use in return for faster, easier deployment. The promise is better performance, better integration, and a single throat to choke when things go wrong. The downside is pretty obvious: cloud-native means lock-in, and while some lock-in is unavoidable, the more you're using cloud-native interfaces and APIs, the more you're married to that specific provider. They got ya. In an era when IT is trying to get off expensive, proprietary enterprise databases, platforms, and software systems, the lock-in aspect of cloud-native computing may be an understandably tough sell in most IT departments.


5G use cases for the enterprise: Better productivity and new business models


With 5G potentially allowing much faster download speeds and lower latency for users, he argued that a smartphone will have the same — if not better — potential for productivity as a PC. "With 5G, if you want to run something that requires a lot of processing power that's traditionally only on PCs, you'll be able to run that using the edge capability of the operator on 5G. So all of a sudden, your smartphone starts to become a more powerful platform for productivity," Amon said. Qualcomm's enthusiasm for 5G was unsurprisingly echoed by Huawei. "With a super-fast connection and with low latency, we could put a lot of things, heavy things in the cloud — in our hand," said Li Changzhu, VP of the handset product line at Huawei. The Chinese telecommunications company has used MWC to showcase its brand new Huawei Mate X foldable 5G smartphone. With a large fold-out screen and 5G connectivity, Huawei is positioning it at least in part as an enterprise productivity device.


Why AI and ML are not cybersecurity solutions--yet

istock-905574028ai.jpg
Strong AI, capable of learning and solving virtually any set of diverse problems the way an average human does, does not exist yet, and it is unlikely to emerge within the next decade. Frequently, when someone says AI, they mean machine learning. The latter can be very helpful for what we call intelligent automation: a reduction of human labor without loss of quality or reliability in the optimized process. However, the more complicated a process is, the more expensive and time-consuming it is to deploy a tenable ML technology to automate it. Often, ML systems merely assist professionals by taking care of routine and trivial tasks, empowering people to concentrate on more sophisticated work. ... Although application security best practice has been discussed for years, there are still regular horror stories in the media, often due to a failure in basic security measures. Why are the basics still not being followed by significant numbers of businesses?


Microsoft boosts HoloLens performance, seeks corporate users


Speaking during the product’s launch, Microsoft Chief Executive Officer Satya Nadella said Microsoft had a responsibility to be a “trusted partner” for companies using its products, such as HoloLens, and that businesses and institutions shouldn’t be dependent on the tech giant. “The defining technologies of our times, AI and mixed reality, cannot be the province of a few people, a few companies or even few countries,” he said. “They must be democratized so everyone can benefit.” Unlike virtual reality goggles, which block out a user’s surroundings, the augmented-reality HoloLens overlays holograms on a user’s existing environment, letting them see things like digital instructions on complex equipment. Microsoft is focusing on corporate customers with HoloLens, and is trying to make the devices more useful right out of the box with prepared applications, rather than require months to write customized programs, says spokesman Greg Sullivan.


Security is battling to keep pace with cloud adoption


The survey found that enterprises are inadvertently introducing complexity into their environments by deploying multiple systems on-premise as well as across multiple private and public clouds. That complexity is compounded by a lack of integrated tools and training that are needed to manage and secure hybrid cloud environments. Respondents also cited a lack of integration across tools, and shortage of qualified personnel or insufficient training for using the tools, as key roadblocks to achieving cross-environment security management. While 59% of respondents said they use two or more different firewalls in their environment and 67% said they are also using two or more public cloud platforms, only 28% said they are using tools that can work across multiple environments to manage network security. 


Western Digital launches SSDs for different enterprise use cases

Western Digital launches SSDs for different enterprise use cases
The SN630 is a read-intensive drive rated at two drive writes per day, meaning it has the endurance to write the full capacity of the drive twice per day; a 1TB version can absorb up to 2TB of writes per day. But these drives are smaller capacity, as WD traded capacity for endurance. The SN720 is a boot device optimized for edge servers and hyperscale cloud, with a lot more write performance: random write is 10x the speed of the SN630, and it is optimized for fast sequential writes. Both use NVMe, which is predicted to replace the ageing SATA interface. SATA was first developed around the turn of the century as a replacement for the IDE interface and has its legacy in hard disk technology; it is the single biggest bottleneck in SSD performance. NVMe runs over the much faster PCI Express bus, which offers far more parallelism and better error recovery. Rather than squeeze any more life out of SATA, the industry is moving to NVMe in a big way at SATA's expense. IDC predicts SATA product sales peaked in 2017 at around $5 billion and are headed to $1 billion by 2022.
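The drive-writes-per-day (DWPD) arithmetic above is simple enough to sketch; the helper below is an illustrative calculation, not vendor code, and the example capacities are assumptions.

```python
def daily_write_budget_tb(capacity_tb, dwpd):
    """Drive writes per day (DWPD) is the number of full-capacity
    writes a drive is rated to sustain each day, so the daily write
    budget is simply capacity multiplied by the DWPD rating."""
    return capacity_tb * dwpd

# A 1 TB drive rated at 2 DWPD can absorb 2 TB of writes per day,
# matching the SN630 figure cited in the article.
print(daily_write_budget_tb(1, 2))  # 2
```

The same formula shows the capacity/endurance trade-off: halving usable capacity at a fixed total-bytes-written rating doubles the DWPD figure a vendor can quote.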



Quote for the day:


"It's not the size of the dog in the fight, it's the size of the fight in the dog." -- Mark Twain


Daily Tech Digest - February 26, 2019

Are Cloud Portability and Interoperabilty Even Possible?

Are Cloud Portability and Interoperabilty Even Possible?
Interoperability, or the ability for computers to share information with each other, is the basis of cloud portability. Data and applications might be created with a specific operating system or runtime in mind. That makes it tricky to move that information to a different environment without any problems. Even if the functions of a system are the same, the foundation it's based upon is crucial for application stability. Because it's fundamental for cloud portability to work, cloud providers need to support interoperability between each other's systems. Unfortunately, this rarely happens, for two major reasons: native features and the lack of standards. Just like any other kind of service, cloud providers present native features in their services. These are features that the provider specializes in, or that no other provider offers. They go beyond the basics of cloud computing and are designed to give developers additional resources for their applications.


How Cloud Computing Is Changing Schools And WorkPlaces
Technology can replace additional employee costs, minimize geographical differences and help project a professional image. Technology enhancements may require employee training to ensure that new devices are used correctly and seamlessly integrated into everyday business processes. Tools such as Evernote and Dropbox let people access and share files with anyone, from anywhere, without needing a physical presence. Integrating technology into the workplace to increase productivity is important for staying competitive. Over the past five years, technology has rapidly changed and developed in every conceivable field. Smartphones are now standalone computing devices that can take pictures, browse the internet, send and receive e-mails and text messages, and of course make phone calls. Instead of having to wait a week for files to be sent by mail, information can be transferred instantly via email or file-sharing programs. Technology has made the world much smaller, especially in the business context; people from different cultures interact regularly.


Europe is prepared to rule over 5G cybersecurity

The theme of this year's show is "intelligent connectivity": the notion that the incoming 5G networks will not only create links between people and (many, many more) things but understand the connections they're making at a greater depth and resolution than has been possible before, leveraging the big data generated by many more connections to power automated decision-making in near real time, with low latency another touted 5G benefit (as well as many more connections per cell). Futuristic scenarios being floated include connected cars neatly pulling to the sides of the road ahead of an ambulance rushing a patient to hospital, or indeed medical operations being aided and even directed remotely in real time via 5G networks supporting high-resolution real-time video streaming. But for every touted benefit there are easy-to-envisage risks to network technology that's being designed to connect everything all of the time.


IT’s Vital Role in Multi-cloud, Hybrid IT Procurement

istock 478007698cr
Dillingham suggests that organizations consider all types of deployments in terms of costs. Large, existing investments in data center infrastructure will continue to serve a vital interest, yet many types of cloud deployments will also thrive. And all workloads will need cost optimization, security, compliance, auditability, and customization. He also recommends businesses seek out consultants to avoid traps and pitfalls, which will help better manage their expectations and goals. Outside expertise is extremely valuable not only with customers in the same industry, but also across industries. “The best insights will come from knowing what it looks like to triage application portfolios, what migrations you want across cloud infrastructures, and the proper set up of comprehensive governance, control processes, and education structures,” explains Dillingham. Gardner added that systems integrators, in addition to some vendors, are going to help organizations make the transition from traditional IT procurement to everything-as-a service.


Are Frameworks Good or Bad, or Both?

A library is defined by Van Buul as a body of code, consisting of classes and functions, which is used by an application, but without being part of that application. An application interacts with the library by doing function or method calls into the library. He defines a framework as a special kind of library where interaction is the other way around, an application now implements interfaces in the framework, or use annotations from it. During execution the framework invokes application code; using a library it’s the other way around. Creating an application without using a framework is for Van Buul somewhat of a mirage, claiming that even if you just use the Java platform, you are also using a framework. He points out that the Java platform is an abstraction over the operating system and the machine and that the platform is invoking the application code. He also notes that most business applications have a web-based interface and use an abstraction layer to create entry points into the application — meaning a framework is used.


The end of Blu-ray


Why? It's all because of streaming. The numbers speak for themselves. According to the Digital Entertainment Group 2018 home entertainment report, we spent more than ever on video last year, $23.3 billion, up 11.5 percent from 2017. Of that, subscription streaming -- led by Netflix, Amazon Prime Video, and Hulu -- took the lion's share, with a 30 percent year-over-year rise to $12.9 billion. We also bought and rented another $4.55 billion worth of online movies and TV shows. Blu-ray? Even with the growing popularity of 4K Blu-ray, movie and TV show sales only came to $4.03 billion. That's a 14.6-percent drop from 2017. At the same time, sales of far less expensive streaming devices from companies like Roku are growing. By NPD's count, streaming-player sales grew from 33.3 million in November 2015 to 67.8 million in November 2018. A recent Deloitte study shows that in the past 10 years the number of households subscribing to paid streaming-video services has grown by nearly 500 percent.


Shopping for AI talent? Beware of unicorns


When executives woke up to the potential of big data, it was also at the same time we were dealing with financial collapse. Balancing between the data economy and keeping regulators at bay created a new unicorn, the chief data officer. Digital has had its own unicorn story. Adopting technology to automate, augment, and scale businesses is hard. Usher in the chief digital officer. Today, AI talent challenges inside enterprises are causing firms to assess their approach to data science while also recognizing deep data deficiencies and the impact on DevOps. The new unicorn? The AI engineer. ... Too many skills and experiences are expected from a single person. In an emerging market, to expect savants are available or findable is asking a lot. Even the digital disruptors — Amazon, Facebook, Google, or Tesla — know these roles are mythical. An early MVP for each wasn’t a market viable product; it was a proof of concept (POC). There was enough there to tell a story of what could come and what could be achieved.


Dramatic changes ahead for workplace, workforce and technology by 2025

Dramatic changes ahead for workplace, workforce and technology by 2025 — global CIOs confirm image
The world of work is changing, rapidly. You don't have to cast your mind back that far to when the World Wide Web became publicly available on 6 August 1991 -- but who could have predicted the change and transformation it would herald? The internet's eruption has catalysed the rapid change of both work and society, the business and the consumer. In this constantly morphing world, the workforce, the workplace and the technologies that support them will be so different by 2025 "that enterprises need to provide global access and ensure continuous uptime now," according to research carried out by OneLogin. To remain agile and relevant, enterprises must start addressing global digital transformation strategies, including unified access management. Who says? Well, the majority of 100 CIOs from companies with at least 5,000 employees.


There's a disconnect between business and security teams

Cyber risk management: The disconnect between business, security teams
39% say they want security status reports related to major business and IT initiatives. In other words, they want to understand cyber risk as it relates to end-to-end business processes, not details about Windows PCs, DNS servers, or software vulnerabilities. Cybersecurity teams need to do a better job translating geeky data into business metrics. 36% say they want to know about the status and response associated with IT audits. This isn’t a new requirement, but business people want more than intermittent reviews; they want frequent updates that help guide timely risk mitigation decisions. To satisfy this need, CISOs must strive for continuous risk management analysis. 36% say they want reports related to vulnerabilities in their environment correlated with other data. Yes, business people care about vulnerable assets, but they don’t want to see reports detailing software vulnerabilities across thousands of systems. Rather, they want to understand if mission-critical assets are vulnerable to known exploits in the wild, so they can prioritize mitigation actions such as patching systems, segmenting traffic, and restricting access.


7 ways the new California privacy law will impact all organizations

Many, if not most, of us have received emails or letters from large companies saying that they had experienced an "unauthorized breach and your data may have been accessed and stolen." The company further says not to worry; it is providing you with one or two years' worth of free credit monitoring - and you're welcome! Now, CA residents can immediately bring an action against the company and be awarded damages without needing to prove actual damages. And let's not forget that this law will be a huge opportunity for attorneys filing class-action lawsuits. AB 375 raises the bar, demanding much higher security from companies collecting or in possession of California resident data. The law will also force companies to be more aware of the consumer data they are collecting and to manage that data more granularly. And preparing for the new California law (as well as the GDPR) will be more complicated as other states look at adopting their own privacy laws.



Quote for the day:


"Make your mistakes, take your chances, look silly, but keep on going. Don't freeze up." -- Thomas Wolfe