Daily Tech Digest - January 03, 2020

2020: The year the office finds its voice?

While conversational AI tools such as chatbots are now common, voice interfaces have been slower to arrive, according to Hayley Sutherland, a senior research analyst at IDC. But advances in the underlying natural language processing technology have made voice-based assistants accurate enough to support regular interactions. “We've seen huge leaps in natural language processing, even in the last year,” she said. That’s important because it means the assistants are less likely to misunderstand commands, which can quickly annoy users. “If I'm working with a voice assistant and it works 80% of the time, that remaining 20% is a lot in my day-to-day job; that can add up to a lot,” she said. Although advances in natural language processing (NLP) usually come from big tech companies like Microsoft, Amazon and Google, which have deep pockets for research and development, the availability of voice APIs gives more companies access to the technology. And those firms can create AI assistants better tailored to specific workplace scenarios.



Using the Visitor Pattern to Maintain MVVM Layering While Implementing Dialogs in WPF

The Visitor Pattern separates data from the operations to be performed on it. In this case, a dialog form needs to update or populate the fields of an object. The Visitor achieves this by having a class with overloaded methods, each accepting a specific object type (class) as its argument. Thus, when a "Visit" method is invoked with a specific argument type, the correct overload is automatically chosen. These Visit methods are responsible for creating the correct dialog window, assigning the argument object as the DataContext of the dialog, and showing the dialog (we're assuming a modal dialog here). The Visitor is injected into the ViewModel (VM) objects in the MainWindow-loaded event handler by property injection (the ViewModel objects have a public Visitor property to hold the reference to the Visitor). I believe this is simpler than using a mediator, since there is no need for events to pass between the layers. This example does not require a Dependency Injection container, although one could be applied with little difficulty. It does not reference Prism Behaviors or other external frameworks. It can be added to an existing code base with no disruption.
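The core dispatch mechanism can be sketched outside WPF. The sketch below is plain JavaScript for illustration only; the article's actual code is C#, where the per-type Visit methods are overloads resolved by argument type. All names here (DialogVisitor, CustomerVM, OrderVM, openDialog) are hypothetical, not taken from the article's code.

```javascript
// Stand-in for creating a dialog window, assigning its DataContext,
// and showing it modally.
function openDialog(windowName, dataContext) {
  return { window: windowName, dataContext };
}

// One "Visit" method per view-model type; each creates the matching
// dialog and assigns the argument as its DataContext.
class DialogVisitor {
  visitCustomer(vm) { return openDialog("CustomerDialog", vm); }
  visitOrder(vm)    { return openDialog("OrderDialog", vm); }
}

class CustomerVM {
  constructor(name) { this.name = name; this.visitor = null; } // property injection target
  edit() { return this.visitor.visitCustomer(this); }          // double dispatch
}

class OrderVM {
  constructor(id) { this.id = id; this.visitor = null; }
  edit() { return this.visitor.visitOrder(this); }
}

// Property injection, as done in the MainWindow-loaded handler.
const visitor = new DialogVisitor();
const customer = new CustomerVM("Ada");
const order = new OrderVM(7);
customer.visitor = visitor;
order.visitor = visitor;

const customerDialog = customer.edit(); // CustomerDialog, customer as DataContext
const orderDialog = order.edit();       // OrderDialog, order as DataContext
```

The view models never reference dialog classes directly, which is what preserves the MVVM layering: the Visitor alone knows which window type pairs with which view model.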


Ready, Set ... Stop: While Tech Speeds Audits, Regulators Slow Down Process

Unfortunately, the Financial Accounting Standards Board and other regulatory bodies have not yet addressed the implications of these technologies. Blockchain is widely associated with Bitcoin and cryptocurrencies. But blockchain will transform auditing, because it is by its nature an ecosystem of incredibly secure transactions. Even now, startups and large financial services companies are developing solutions to make over old-school industries, like gas and oil, changing when and how their accounting gets done. If we step back and see what is going on with blockchain, AI, and machine learning, it is quite probable that accounting will be dramatically altered in our lifetimes. As it stands, these innovations remain ahead of the standard-setting process. When and how these bodies can address these advancements remains to be seen. But we are seeing technology disrupt allied fields, like logistics and supply chain management. Our standards bodies have no choice but to get ahead of the story—before the story writes its own plot.


Cloudian CEO: AI, IoT drive demand for edge storage


One is that they continue to just need lower-cost, easier to manage and highly scalable solutions. That's why people are shifting to cloud and looking at either public or hybrid/private. Related to that point is I think we're seeing a Cloud 2.0, where a lot of companies now realize the public cloud is not the be-all, end-all and it's not going to solve all their problems. They look at a combination of cloud-native technologies and use the different tools available wisely. I think there's the broad brush of people needing scalable solutions and lower costs -- and that will probably always be there -- but the undertone is people getting smarter about private and hybrid. Point number two is around data protection. We're now seeing more and more customers worried about ransomware. They're keeping backups for longer and longer and there is a strong need for write-once compliant storage.


Do containers need backup?

In one sense, a typical container does not need to have its running state backed up; it is not unique enough to warrant such an operation. Furthermore, most containers are stateless – there is no data stored in the container. It’s just another running instance of a given container image that is already saved via some other operation. Many container advocates are quick to point out that high availability is built into every part of the container infrastructure. Kubernetes is always run in a cluster. Containers are always spawned and killed off as needed. Unfortunately, many confuse this high availability with the ability to recover from a disaster. To change the conversation, ask someone how they would replicate their entire Kubernetes and Docker environment should something take out their entire cluster, container nodes and associated persistent storage. Yes, there are reasons Kubernetes, Docker and associated applications need to be backed up. First, to recover from disasters. What do you do if the worst happens? Second, to replicate the environment, as when moving from a test/dev environment to production, or from production to staging before an upgrade.


Your excess server resources are wanted in the cloud

The concept is not new, but we are now looking at the opportunity for those who have private servers with excess capacity to rent that capacity to a cloud service provider that can dole out those compute and storage systems on demand to anyone who needs them. If you’re thinking ride-sharing for servers, you’re not far off. In this scenario the cloud service provider is really just a broker sitting between those needing cloud services and those who have servers that can be shared. You may be leveraging servers that have excess capacity in Las Vegas on Monday and perhaps servers in London on Tuesday. You don’t care, since you’re abstracted away from the physical servers, not even knowing their location and true ownership. Peer-to-peer networks are nothing new. Indeed, in this use case there is a clear benefit for both parties. Those with excess server capacity will make money by renting it, so there is a revenue stream for server capacity that would normally go unused. Those consuming this service would likely pay less than they would for most public cloud services, or so it would seem, while the service still lives up to SLAs preset by the consumers.



10 top distributed apps (dApps) for blockchain

"DApps will pool resources across numerous machines globally," said Juniper senior analyst Lauren Foye. "The results are applications which do not belong to a sole entity, [but] rather are community-driven." Bitcoin was arguably the first dApp, enabling anyone in the world to download a bit of open-source code to join a blockchain network and verify transactions using a “mining” algorithm, thereby generating digital currency (cryptocurrency) as a reward. Like a RAIDed storage array, if one of the computers (or nodes) running the dApp software goes down, another node instantaneously resumes the task. Because smart contracts, or self-executing business automation software, can interact with dApps, they're able to remove administrative overhead, making them one of the most attractive features associated with blockchain. While blockchain acts as an immutable electronic ledger, confirming that transactions have taken place, smart contracts execute predetermined conditions; think of a smart contract as a computer executing "if/then," or conditional, programming.
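The "if/then" framing can be made concrete with a toy escrow rule. This is a plain JavaScript sketch of the conditional logic only, not a real smart-contract platform (on Ethereum, for example, such a rule would be written in Solidity); the field names and values are illustrative.

```javascript
// Toy sketch of a smart contract's conditional rule: release payment
// only when the agreed condition has been confirmed on the ledger.
function settleEscrow(contract) {
  if (contract.deliveryConfirmed) {
    // Condition met: execute the predetermined action.
    return { ...contract, status: "paid", payee: contract.seller };
  }
  // Condition not met: funds stay locked, no administrator involved.
  return { ...contract, status: "locked", payee: null };
}

const pending = { seller: "supplier-42", amount: 100, deliveryConfirmed: false };
const confirmed = { ...pending, deliveryConfirmed: true };

const lockedResult = settleEscrow(pending);
const paidResult = settleEscrow(confirmed);
```

The point is that once the condition and the action are encoded, no human intermediary is needed to decide when the transfer happens, which is where the administrative savings come from.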


Father Prototype and Mother Constructor

Most people get acquainted with JavaScript as "well, it's an object-oriented, dynamically typed programming language"; at least that's how I started. Learning more and more of it, it became obvious that the strength of the language lies mostly in its powerful functions, its first-class functions. Prototypes are one of the most fundamental JavaScript features. If I start this article with objects, there will be trouble; if I start with functions, there will be double. Either way, I'll soon get into the "chicken-and-egg" situation. I'll write some examples and point to some facts. Let's discover the rules of the prototype and build upon the constructors. Something tells me it's best to start the way everybody starts with this language. You create an object... But what's an object? More on that in Appendix C. In JavaScript, "objects represent collections of named values. The named values, in JavaScript objects, are called properties. Object properties can be primitive values, other objects, or functions. An object method is an object property containing a function definition."
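That definition can be shown directly in code. A small sketch (all names illustrative) of properties holding primitives, objects, and functions, plus a constructor whose methods live on a shared prototype:

```javascript
// An object as "a collection of named values": properties can hold
// primitives, other objects, and functions (methods).
const person = {
  name: "Alice",                          // primitive-valued property
  address: { city: "Athens" },            // object-valued property
  greet() { return "Hi, " + this.name; }  // method: a property holding a function
};

// Constructor + prototype: methods placed on the prototype are shared
// by every instance created with `new`, instead of being copied per object.
function Animal(kind) { this.kind = kind; }
Animal.prototype.describe = function () { return "a " + this.kind; };

const cat = new Animal("cat");
const greeting = person.greet();
const description = cat.describe();
const sharesPrototype = Object.getPrototypeOf(cat) === Animal.prototype;
```

Note that `describe` is not an own property of `cat`; lookup walks the prototype chain, which is the "rule of the prototype" the article goes on to explore.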


Open source storage: driving intelligence in the small data sprawl era

Open source is increasingly influential in the analytics space. That space has evolved beyond things like Hadoop and MapReduce, which were very text-oriented and big-data-lake-centric, to an understanding that the world is shifting to what is termed small data sprawl. The proliferation of IoT, remote sites and offices means that organisations want to process or analyse data remotely, while enriching that data with information from the centre. With this change there have been many more vertical offerings that integrate the analytics with the storage itself. Manley explained: “Somebody doesn’t just want to store data for IoT. The point of IoT is that I’m processing and analysing, and we’re seeing a lot more integrated pipelines, of which storage becomes a component. And open source is by far the most popular way, whether you look at Spark or Elasticsearch, because they can evolve quickly and people can adjust them to meet the specific needs of their particular industry.”


CSS Architecture for Component-Based Applications

CSS architecture is a complex subject that is often overlooked by developers, as it's possible to encapsulate CSS per component and avoid many of the common pitfalls that relate to CSS. While this 'workaround' can make the lives of developers simpler, it does so at the cost of reusability and extensibility. When a developer defines a CSS class, it automatically affects the global scope, modifying all related elements (and their children). This works great for simple applications where developers can predict the results, but can quickly become a problem when the size of the application and the team grows, and unintended results start to happen. Initially, this problem was addressed by Block Element Modifier (BEM), a methodology and set of naming conventions that helped avoid clashes and gave developers strong indications as to what each class did, e.g. form__submit--disabled tells us we are within a form, handling a submit button, and applying the disabled state.



Quote for the day:


"No organization should be allowed near disaster unless they are willing to cooperate with some level of established leadership." -- Irwin Redlener


Daily Tech Digest - January 02, 2020

Seoul to install AI cameras for crime detection

The cameras will automatically measure whether somebody is walking normally or tailing someone. They will also detect what passersby are wearing -- such as hats, masks, or glasses -- and what they are carrying with them, such as bags or dangerous objects that have a strong possibility of being used to commit a crime. The cameras will also consider whether it is day or night. They will use this information to deduce the probability that a crime will take place, they claim. If the probability exceeds a certain threshold, the cameras will alert the district office and nearby police stations to send personnel to the location. Going forward, Seocho and ETRI plan to analyse 20,000 court sentencing documents and crime footage to deduce crime patterns for the AI software to memorise. The cameras will be able to compare whether what is being filmed at present matches past crime patterns. "It will work like deja vu," said an ETRI spokesperson. The AI software is still in development and the complete version will be finished by 2022, the institute said. Cameras with these capabilities will eventually be expanded to other districts in Seoul as well as other provinces, they added.



DevOps Ten Years Later: We Still Have Work to Do

The rapid pace of DevOps and agile release cycles often introduces more security bugs than the slower, siloed approaches they replace. Adding application security teams into the DevOps process may increase the learning curves and pains, as most developers have little-to-no experience in application security (vulnerabilities, remediations, etc.), but the end result will be fewer security issues. Any application that is "pre-DevOps" or is a third-party app gains zero benefits from DevOps. In most large enterprises, so-called "brown-field" apps comprise approximately 80% of all apps, which means there's a big burning issue of how to manage pre-DevOps/third-party apps. Runtime-based solutions, including RASP, bring rapid-update/remediation benefits to these classes of apps in a DevOps-like way. The compiler-based technology that Waratek has perfected allows patching, adding security rules, and even upgrading out-of-public-support Java platforms in minutes, not months (or years). This eliminates the need for source code changes, production downtime, profiling, tuning, and the use of heuristics, along with a lot of needless cost and performance issues.


A CISO's Security Predictions for 2020

The combination of AI and GAN technologies and the flaws inherent in all of the current technologies that leverage facial recognition to unlock smart phones, verify passport IDs and identify criminals on the street presents a rapidly growing threat, which cybercriminals will look to exploit. Extortionary deepfakes will be used to portray highly realistic videos of executives in compromising positions alongside ransomware demands tied to the threat of public domain release. Propagandized deepfakes will abound throughout the 2020 election cycle and be leveraged to discredit candidates and propel misrepresentations of truth (lies) to micro-targeted segments of voters via social media. Audio and video deepfakes will enhance the credibility of business email compromise attacks and lend an even more convincing air of authenticity to money transfer requests. And it won't take a hacking genius to pull these off. In fact, anyone can leverage AI to build convincing deepfakes without expertise in technology. Machine-learning websites available today can accept uploaded audio and videos and return deepfakes based on specific scripts.


How classroom technology is holding students back

Some studies have found positive effects, at least from moderate amounts of computer use, especially in math. But much of the data shows a negative impact at a range of grade levels. A study of millions of high school students in the 36 member countries of the Organisation for Economic Co-operation and Development (OECD) found that those who used computers heavily at school “do a lot worse in most learning outcomes, even after accounting for social background and student demographics.” According to other studies, college students in the US who used laptops or digital devices in their classes did worse on exams. Eighth graders who took Algebra I online did much worse than those who took the course in person. And fourth graders who used tablets in all or almost all their classes had, on average, reading scores 14 points lower than those who never used them—a differential equivalent to an entire grade level. In some states, the gap was significantly larger.


Bosch debuts long-range lidar sensor for autonomous vehicles

Like all lidar sensors, Bosch’s solution measures the distance to target objects by illuminating them with laser light and measuring the reflected pulses. It’s intended for close and medium ranges on highways and in cities, and the company claims it’ll be price-competitive with rivals, thanks to economies of scale. “By filling the sensor gap, Bosch is making automated driving a viable possibility in the first place,” said Bosch management board member Harald Kroeger in a statement. “We want to make automated driving safe, convenient, and fascinating. In this way, we will be making a decisive contribution to the mobility of the future.” The lidar sensor will slot alongside the six-antennae LRR4 radar in Bosch’s ever-expanding perception portfolio. The LRR4 features a detection range of 250 meters and can recognize up to 24 objects simultaneously. But, like all radar sensors, it’s less angularly accurate than lidar as it loses sight of objects on curves. And it becomes confused if multiple vehicles are placed close to each other. 


Big Data, Health Informatics, and the Future of Medicine


The ongoing convergence of emerging technologies in cloud computing, mobility, Internet of Things (IoT), machine learning, and big data analytics is currently revolutionizing the medical and healthcare industry in ways yet to be entirely understood by most practitioners, academicians, and researchers alike. The increasing availability of biomedical information, and its correspondingly high growth rate, is quickly driving us to a future where personalized medicine is not only possible but will significantly help to raise global life expectancy. Today, a complete big data analytics lifecycle is becoming easily accessible: it starts with data produced by machine learning and artificial intelligence-enabled genomic sequencing technologies, moves through intelligent data translation and correlation, and ends with the aggregation of a report for clinicians, pharmacologists, and other related researchers. The ability to collect, analyze, translate, correlate and compare large amounts of data using innovative algorithms, with the chance to merge all this information within a seamless cloud analytic environment for further studies, is one of the fundamental driving forces for this change.


Gartner: Top 10 strategic technology trends in 2020


The democratisation of technology means providing people with easy access to technical or business expertise without extensive or expensive training. Already referred to as “citizen access”, this trend will focus on four key areas: application development, data and analytics, design, and knowledge. Democratisation is expected to see the rise of citizen data scientists, programmers and other forms of DIY technology engagement. For example, it could enable more people to generate data models without having the skills of a data scientist. This would, in part, be made possible through AI-driven code generation. The controversial trend of human augmentation focuses on the use of technology to enhance an individual’s cognitive and physical experiences. It comes with a range of cultural and ethical implications.


Tech and Advocacy: How Today’s Youth Are Speaking Out


Technology is giving us back a little bit of that feeling of an “ongoing public forum.” Young people today have the benefit of exposure to cultural and political discussions and controversies early on. Exposure to a well-rounded collection of ideas and perspectives early in life is essential for personal development. Moreover, developing resilience requires community, relationships, and a sense of shared difficulties (or even trauma). Social media and technology can provide these things under the right circumstances. The accessibility of platforms has led to an abundance of perspectives. Young people are quick to offer their views, but they also look for organizations and brands that are equally authentic about their values. Some companies even make careers out of creating advocacy-centered content for traditional and social media channels. Public advocacy is “cool” now — and so is the sharing of opinions once thought off-limits.


Will the US Get a Federal Privacy Law?

In the latest attempt at building a consensus, the House Energy & Commerce Committee recently unveiled a preliminary draft of a bipartisan consumer privacy bill. The committee is now seeking comments from privacy experts, trade associations and companies. The draft side-steps several of the most divisive issues, including whether a federal law should override state privacy laws and whether individuals should be empowered to sue companies over privacy violations. These two issues have led to months of stalled negotiations. Democrats, including Sen. Maria Cantwell, D-Wash., have argued in favor of not having a federal law supersede stronger state laws. They also favor allowing individuals to sue companies for privacy violations. Meanwhile Republicans, including Sen. Roger Wicker, R-Miss., have said they will not support a federal privacy law unless it pre-empts state legislation, such as CCPA, to create uniform rules for all to follow. They also argue against giving consumers the power to file privacy lawsuits, fearing that a flood of frivolous litigation against companies could create an unnecessary burden.



Q&A on the Book EDGE: Value-Driven Digital Transformation

One of the most uncomfortable changes for senior leaders undergoing digital transformation is the change in performance measures. The most profound of these is the switch from internal ROI to external customer value. While this is a measurement change, it is more fundamentally a change in perspective, a change in your gut-level basis of decision making. It means the first and foremost question an executive leader asks is not "How will this impact our bottom line?" but "How will this impact the value we deliver to our customers?" ROI isn’t the objective; instead, it is a constraint. You need to make a profit to continue delivering customer value; however, ROI is a business benefit (internal), not a customer value (external). Complexity theory includes a concept called a fitness function. A fitness function summarizes a specific measure to evaluate how close a solution is to achieving a stated goal.
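As a concrete sketch of the idea, a fitness function for a hypothetical goal such as "keep page load under two seconds" could reduce the measurement to a single score between 0 and 1 (the goal, names, and scoring formula below are all illustrative, not from the book):

```javascript
// Sketch of a fitness function: one measure of how close a solution
// is to a stated goal. 1.0 means the goal is met; lower values mean
// the solution is further from it.
function loadTimeFitness(measuredMs, targetMs = 2000) {
  return Math.min(1, targetMs / measuredMs);
}

const atTarget = loadTimeFitness(2000); // exactly meets the goal -> 1
const slow = loadTimeFitness(4000);     // twice the target -> 0.5
```

Teams can then track this one number release over release, which is what makes a fitness function a useful steering measure rather than a lagging report.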



Quote for the day:


"Absolute identity with one's cause is the first and great condition of successful leadership." -- Woodrow Wilson


Daily Tech Digest - January 01, 2020


“As there have been more and more high-profile data breaches in recent years, this has translated to companies seeing the need for having people who can help them protect their data,” Stansell says. The field is projected to grow by 32% through 2028, according to the Bureau of Labor Statistics. (The average growth rate for all occupations is 5%.) However, the need for information security engineers has far eclipsed the number of people with the skills to do the job. The U.S. Department of Commerce recently estimated there are 350,000 unfilled cybersecurity jobs in the country. Cybersecurity Ventures, an analytics and research company, estimates 3.5 million jobs in cybersecurity around the world are likely to go unfilled by 2021. This talent shortage means workers in the field are paid handsomely. While Glassdoor data says information security engineers earn a median base salary of just over $100,000 per year, top-level jobs in the field, like chief information security officer, can yield pay above $300,000 in top metro areas, according to cybersecurity recruiting firm SilverBull.


Singapore tax on overseas digital services kicks in tomorrow

According to the Inland Revenue Authority of Singapore (IRAS), more than 100 providers of such services had enrolled under the city-state's Overseas Vendor Registration (OVR) regime, which meant they would begin charging GST on the sale of their digital services from tomorrow. The government agency defined digital services as services supplied over the internet or an electronic network that require minimal or no human intervention and are "impossible without the use of information technology". Under the new regime, overseas digital service providers with a yearly global turnover of more than S$1 million that sold more than S$100,000 worth of digital services to customers in Singapore in a 12-month period were required to register for GST and charge GST. The new tax would not apply to online purchases of goods, IRAS said, noting that GST already was payable on goods--valued above S$400--imported into Singapore, via air or post. The government, however, had said it would continue to review the tax regime on such e-commerce transactions before determining how it should proceed.
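The registration rule combines two thresholds, which can be sketched as a simple predicate (JavaScript, illustrative names; the S$1 million and S$100,000 figures are those reported in the article):

```javascript
// Sketch of the OVR registration test as described: both conditions
// must hold within the 12-month period.
function mustRegisterForGST(provider) {
  const GLOBAL_TURNOVER_SGD = 1000000; // S$1 million yearly global turnover
  const LOCAL_SALES_SGD = 100000;      // S$100,000 sold to Singapore customers

  return provider.globalTurnover > GLOBAL_TURNOVER_SGD &&
         provider.singaporeSales > LOCAL_SALES_SGD;
}

const bigVendor = { globalTurnover: 5000000, singaporeSales: 250000 };
const smallVendor = { globalTurnover: 5000000, singaporeSales: 50000 };

const bigMustRegister = mustRegisterForGST(bigVendor);     // both thresholds met
const smallMustRegister = mustRegisterForGST(smallVendor); // local sales too low
```

Note the conjunction: a large global vendor with little Singapore business falls outside the regime, which is why both thresholds matter.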


A few things. One, merely having that extensive a range of PII in one place about one individual is dangerous. If a breach against Lookout does somehow happen — no security is perfect — it would be a bonanza for the cyberthief. From a security perspective, the company's marketing of this service could itself attract identity and cyber thieves to the site. They might spend extra resources and effort to break in, which is truly not what a customer wants. Two, it sends the wrong message. Privacy advocates rightly argue to never give anyone or any site more information than they absolutely need ("need to know" is appropriate here). And when the company directly asks for such a gold mine of PII data (it was probably the passport data request that really sent me soaring), it makes people worried. What, people may wonder, if that page is a phishing page designed to merely look like a Lookout page? How is a user supposed to tell the difference? Three, in 2020 (OK, when we're this close to 2020), no company is an island. What if one of its employees turned to the dark side? What if the company you use for backup gets breached?



How a rules engine can drive -- or derail -- an IT automation strategy

In a perfect world, an IT team can set most types of rules quickly and, once in place, they mostly run on autopilot. But what happens with a distributed denial of service (DDoS) attack? Without limitations or additional conditions in place, a rules engine might spin up unlimited resources in the cloud to deal with a DDoS attack in the middle of the night when nobody is around to hit a kill switch -- which could lead to a massive bill from a cloud provider. Create additional conditions or rules to prevent an existing rule, or set of rules, from spinning out of control in an unplanned or unforeseen event. These additional rules can limit how many total resources are allocated in a given time frame, or where resources are available. Apply these additional rules to the original rules that are designed to create resources. This creates a collection of interacting rules that can limit or enable each other -- which, without careful management, can create a confusing spider web of rule sets that you can't untangle, much like a pile of cables hidden in a data center closet.
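The guard-rule idea above can be sketched in code. The example below (JavaScript; all names, limits, and the in-memory bookkeeping are illustrative, not any particular rules engine's API) wraps a scaling rule in a condition that caps how many instances may be launched per time window:

```javascript
// Sketch of a guard rule that caps how many instances a scaling rule
// may create within a sliding time window.
function makeScaleRule({ maxPerWindow, windowMs }) {
  const launches = []; // timestamps of recent launches

  return function requestInstance(now) {
    // Discard launches that have fallen out of the window.
    while (launches.length && now - launches[0] > windowMs) launches.shift();

    if (launches.length >= maxPerWindow) {
      // The guard rule limits the original rule instead of letting it
      // spin up unlimited resources during, say, a midnight DDoS attack.
      return { allowed: false, reason: "window limit reached" };
    }
    launches.push(now);
    return { allowed: true };
  };
}

// At most 3 new instances per 60-second window.
const requestInstance = makeScaleRule({ maxPerWindow: 3, windowMs: 60000 });

const first = requestInstance(0);
requestInstance(1000);
requestInstance(2000);
const fourth = requestInstance(3000); // blocked: three launches already in window
const later = requestInstance(61500); // earliest launches have expired
```

Even this tiny example hints at the article's warning: the guard rule and the scaling rule now interact, and a real deployment accumulates many such interacting pairs.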


The report isn't only filled with good news for wearable manufacturers and their customers: There are a couple of roadblocks that could prevent the industry's growth. Data security and privacy are both mentioned as potential obstacles. There's reason to believe these concerns are valid, too: Industrial IoT adoption, which arguably includes wearables, has made the manufacturing industry a ripe target for attack. As reported previously by TechRepublic, analysis of industrial networks found higher levels of malicious activity than was expected in 2018, indicating that attackers had already penetrated many networks and were conducting reconnaissance. The nature of IoT networks means that a lot of sensitive data is being passed between sensors and other connected devices, all of which could be harvested by an attacker. Adding wearables to the mix only gives attackers one more type of data to exploit. Along with data theft, privacy for those wearing the devices is at risk as well. If an attacker is able to harvest business data, there's nothing stopping them from potentially stealing personal data about employees wearing connected devices either.


Parse Anything with Parsley: A Different Approach

Originally, I wrote the Slang parser using a hand-built recursive descent parsing technique similar to that used by Microsoft's production C# compiler. However, maintenance is a bear, and even though the source is partitioned across several files it quickly becomes overwhelming. Unlike Microsoft, I don't have a team available to delegate building the different parts of the parser to. To that end, I needed a tool that would help me build a parser. I considered using ANTLR, but I don't like its API or grammar format, and the grammar I found for C# generated an 800K+ source file for parsing C#6 - which wouldn't parse! No thanks. Moving on, I found a grammar for Coco/R as well, but it uses so much embedded state to resolve its parse that it's hard to follow, leading me to the same maintenance and comprehension issues as my hand-built parser! For various reasons I'm sticking to LL parsing versus LR parsing, so tools like Gppg are off the table, even if they could parse C#. If you don't know the difference between LL and LR parsing it doesn't matter here, but LL parses top down, using the grammar to direct the parse, while LR parses bottom up, using the next input to direct the parse.
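The top-down style is easiest to see in a toy recursive descent (LL) parser, where each grammar rule becomes a function and the grammar directs what to read next. This sketch (JavaScript for illustration; the article's parser is C#) handles sums like "1+2+3":

```javascript
// Minimal LL (top-down) recursive descent parser for the grammar:
//   Expr -> Num ("+" Num)*
// Each rule is a function; the grammar, not the input, drives the parse.
function parseExpr(input) {
  let pos = 0;

  function parseNum() {
    const start = pos;
    while (pos < input.length && /[0-9]/.test(input[pos])) pos++;
    if (pos === start) throw new Error("expected number at position " + pos);
    return parseInt(input.slice(start, pos), 10);
  }

  let value = parseNum();
  while (pos < input.length && input[pos] === "+") {
    pos++;               // consume "+"
    value += parseNum(); // the rule says a number must follow
  }
  if (pos !== input.length) throw new Error("unexpected input at position " + pos);
  return value;
}

const sum = parseExpr("1+2+3"); // evaluates while parsing
```

An LR parser would instead shift tokens onto a stack and reduce bottom-up once a rule's right-hand side appears, letting the next input token direct the parse.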


From crypto currency to chocolate – where to spend your Bitcoin


Italia Click is an international food distributor that embraces technologies such as Bitcoin and uses them to accept payments. In November it added crypto to its payment options and will accept payment in several different cryptocurrencies such as Bitcoin SV, Ethereum, and Ripple. I can vouch for the deliciousness of the chocolate! Crete-based 35North sells extra virgin olive oil from where the 35th parallel north crosses the mountains of Crete. Although its Twitter account states that fiat and crypto payments are accepted, the online shop only offers the choice to pay by card or PayPal. You will need to make a special request for your crypto to be accepted via the online store. Hot Hogs BBQ is a food truck in Keene, NH, which gets enthusiastic reviews from customers and won WMUR Best Barbecue in NH 2019. It accepts Bitcoin, Dash, Bitcoin SV, and Bitcoin Cash for your BBQ, but hold fire before hot-footing it over to NH – Hot Hogs is now closed until the spring. The Lucky Hot Dog is a food truck in Chicago, IL, serving dogs, burgers, beef and chicken. Payment by Bitcoin, card or cash.


Book Review: Enterprise API Management

Any API initiative should start by identifying the business drivers for such an endeavor: what benefits are targeted and why, how those targets will be measured once delivered, and what return can be expected on the investment (ROI) (naturally leading to the question, what is the required investment in the first place!). At this stage, it is less about the technicalities of APIs and more about the business itself. Therefore, it is imperative to develop an understanding of the business domain, its language, its key stakeholders, and how it operates. Doing so can help in identifying how APIs can help, in other words, the business drivers. Once such business drivers are identified, it's a matter of presenting them in a comprehensible way to the right stakeholders. At this point, communication and presentation skills matter a lot, and so does the language used. A lot can go wrong at this point. If the presentation is too technical, business professionals will struggle to understand the value of what's being offered.



CGM technology could ultimately be rolled out to people with type 2 diabetes, and those in the pre-diabetic range – people whose blood sugar is higher than normal but not yet diabetic. The idea of getting people who aren't on insulin to use CGM is to show the impact of certain foods and behaviours on their glucose levels, and so help them to keep levels in the right range. "Until you really have direct feedback, it can be hard to really understand why that's important," Leach says. "I think there can be a whole lot more around that coaching or the advice or the analytics that you put around the data to help you get more people with either pre-diabetes, or even with just general health and wellness. I think there's a lot of opportunity and there's quite a few pilots we're entering into in many different areas to learn more about what works for those users." For the traditional CGM user base – people who control their diabetes with insulin injections – the next few years are likely to bring another sea-change in technology with the advent of closed-loop or 'artificial pancreas' systems, single units that both monitor glucose and deliver insulin accordingly.



5G will change the world - but who will keep it safe?

A newly installed 5G antenna system made by Ericsson for the AT&T's 5G wireless network is shown high atop a building in downtown San Diego, California, U.S., April 23, 2019
Robust security will need to be designed into both devices and network equipment from the outset, with a continuous product security lifecycle in place to manage it, as well as a secure software development lifecycle. The networks of the future will be largely virtualized, software-based networks. This means they will be difficult to test as verifiably secure at any point in time. Governments and others such as large businesses who interact with a range of other suppliers and networks will need to consider how appropriate incentives are put in place throughout the supply chain to encourage effective consideration of security in the development and operation of new networks. To ensure interoperability around the globe and to truly realize the benefits of the Fourth Industrial Revolution, governments will also need to consider how they can promote more international approaches to securing and building trust in next-generation networks.



Quote for the day:


"Coaching is a profession of love. You can't coach people unless you love them." -- Eddie Robinson


Daily Tech Digest - December 31, 2019

How to proceed with deep legacy hardware

"For folks that have applications that are linked to the hardware environment, it's very difficult for them to get off it. So we'll work with clients especially when they go beyond the end-of-service life," O'Grady said. "It's mainly government, discrete manufacturing, and banking." The equipment is stored at a former DEC manufacturing facility in Salem, NH. Three former DEC hands work there as technicians. When customers send in hardware for repair, re-homing, or recycling, "They have a little fun to see if their technician ID is on that machine," O'Grady joked. "Any equipment that's demand-constrained, or supply-constrained in the market, we'll keep it here... I don't think there's anything we haven't been able to find for clients," he said, adding that sometimes he works with museums and related organizations for assistance. "The VAX 6000, these are 30-year-old machines. We've got more than several clients that we're helping out long-term. Not everyone has enough budget dollars to go around to innovate in new technology," so they focus on stabilizing what already works, he explained. "As long as they can keep the hardware environment viable then it works for them."



Ramp up carefully during AIOps implementation


IT organizations can't simply inject an AIOps tool into their monitoring and management roster and expect positive results. Instead, they need to prep IT workflows and infrastructure for an AI-driven strategy. "The first place that IT leaders start their AI journey tends to be process automation," said Chirag Dekate, a Gartner analyst. IT automation itself doesn't equal AIOps, but it propels organizations in the right direction, as it eliminates menial and repetitive tasks for IT staff. First ensure existing IT automation scripts function as they should, Dekate said. Streamlined data management and collection is another prerequisite for AI in IT operations, according to Ari Silverman, director of platform automation and enterprise architecture at OCC, an equity derivatives clearing organization in Chicago. Silverman's team uses LogicMonitor as an AIOps monitoring tool, primarily for predictive analytics and automated capacity planning and management.


The Best Mesh Routers Of 2019

Netgear Orbi RBK13
Netgear has ditched the towers in its latest iteration of the standard Orbi mesh system. The updated system is rectangular, with waves on top that cleverly hide circulation vents to keep the devices cool. It's a solid system, able to cover up to 6,000 square feet with 1.2Gbps of wireless goodness (if you choose a 4-pack). The only bad thing is that, you guessed it, the app is a bit of a mess. Once you get it set up and running, it's a solid system, but getting there can be an exercise in patience. The base system won't see the satellites, and it takes forever for setup steps to complete. In a market with "instant" networks, it's a major gaffe. It's a good thing, then, that the Orbi system is less expensive than most mesh systems. If you have a large area of space to cover, this is the cheapest way to do it. Where the Orbi WiFi 6 system dominated via networking power, the RBK13 wins by being the cheapest way to get mesh networking into your home. You can get a router and two satellites for under $200.


What Digital Transformation Is (And Isn't)

Over the past few years, enterprise leaders have become captivated by the idea of digital transformation. Perhaps that shouldn't be surprising given all the hype from analysts and vendors. These days it’s tough to find an enterprise technology product that doesn't advertise itself as a key ingredient in digital transformation. And expert analysis is full of promises that sometimes seem too good to be true. ... Using hardware, software, algorithms, and the Internet, it's 10 times cheaper and faster to engage customers, create offerings, harness partners, and operate your business." That kind of promise is certainly enticing. But it's tough to find agreement on what exactly "digital transformation" means. For some organizations, it just means getting into ecommerce. For others, it involves doing away with paper-based processes and becoming more efficient. Still others are embracing cloud computing, DevOps, automation, the Internet of Things (IoT), and artificial intelligence (AI) to become more competitive. And many seem to be doing most of this and more.


The Top 5 Fintech Trends Everyone Should Be Watching In 2020

One of the latest "big things" in fintech is the growth of the mobile payments industry. Consumers want payments to be instant, invisible, and free (IIF). Mobile payment innovations might even do away with traditional wallets as global consumers become less reliant on cash. Google, Apple, Tencent, and Alibaba already have their own payment platforms and continue to roll out new features such as biometric access control, including fingerprint and face recognition. WeChat Pay, used by hundreds of millions of users every day, is one of the most popular payment methods in China, while Alibaba's Alipay, a third-party online and mobile payment platform, is now the world's largest mobile payment platform. Many mobile payment platforms are building programs and offers based on the user's purchase history. While many financial institutions are continuing to adopt new technology to enhance operations and improve customer service, these five trends will provide exciting avenues for innovation. Financial institutions realize they must learn how to use fintech to their competitive advantage.


What Is Tech Debt and How to Explain It to Non-Technical People?

We can distinguish at least three types of technical debt. Even if developers don't compromise on quality and try to build future-proof code, debt can arise involuntarily, provoked by constant changes in the requirements or the evolution of the system. Your design turned out to be flawed and you can't add new features quickly and easily, but it wasn't your fault or decision. In this case, we're talking about accidental or unavoidable tech debt. The second type is deliberate debt, which appears as a result of a well-considered decision. Even if the team understands that there is a right way to write the code and a fast way to write the code, it may go with the second one. Often, it makes sense, as with startups aiming to deliver their products to market quickly to outpace their competitors. Finally, the third type of tech debt refers to situations where developers didn't have enough skill or experience to follow specific best practices, which leads to really bad code. Bad code can also appear when developers didn't take enough time and effort to understand the system they are working with, missed things or, conversely, performed too many changes.


Experts' cloud predictions for 2020


Not many cloud predictions matter to the general populace, but this one about the power of AI affects everyone. In 2020, explainable AI will rise in prominence for cloud-based AI services -- particularly as enterprises face pushback around the ethical issues of AI. Explainable AI is a technology that provides justification for the decision that it reaches. Both Google and Microsoft have launched explainable AI initiatives, currently in early stages. Amazon is likely to introduce some explainable AI capabilities as part of its AI tools. Through the power of deep learning, data scientists can build models to predict things and make decisions. But this trend can result in black-box algorithms that are difficult for humans to make sense of. The biggest challenge enterprises face is the need to track bias in AI models and identify cases where models lose accuracy.


Wanted: More types of machine learning

The issue for me is that the ML groups I've mentioned are perhaps limiting. Consider dynamically combining all types, adjusting the approach, type, or algorithm during the processing of the training data, whether mass loads or transactions. At issue are use cases that don't really fit these three categories. For example, we have some labeled data and some unlabeled data, and we're looking for the ML engine to identify both the data itself and patterns in the data. Most of us don't have perfect training data, and it would be nice if the ML engine itself could sort things out for us. With a few exceptions, we have to pick supervised or unsupervised learning and solve only a portion of the problem, and we may not have the training data needed to make it useful. Moreover, we lack the ability to provide reinforcement learning as the data is used within transactional applications, such as identifying a fraudulent transaction as it happens. There are ways to create an "all of the above" approach, but it entails some pretty heavy-duty work on both the training data and the algorithms.
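The mixed labeled/unlabeled case described above does have a well-known starting point: "self-training" or label-propagation approaches, where the engine labels the unlabeled data using the few labeled examples it has. A minimal Python sketch, assuming hypothetical 1-D transaction values and labels (real semi-supervised methods iterate and use confidence thresholds, but the propagation idea is the same):

```python
def self_train(labeled, unlabeled):
    """Label unlabeled points by copying the label of the
    nearest labeled example; a toy semi-supervised step."""
    def nearest_label(x):
        # Find the labeled (value, label) pair closest to x.
        return min(labeled, key=lambda pair: abs(pair[0] - x))[1]
    return [(x, nearest_label(x)) for x in unlabeled]

# Two known examples, then propagate to unlabeled transactions.
labeled = [(1.0, "fraud"), (9.0, "legit")]
print(self_train(labeled, [1.5, 8.0]))  # [(1.5, 'fraud'), (8.0, 'legit')]
```

Each unlabeled point simply inherits the closest known label, which is why imperfect training data can still bootstrap a usable model.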


5 open source innovation predictions for the 2020s

Open Source
AI and machine learning have powered these innovations, and many of the AI advancements came about thanks to open source projects such as TensorFlow and PyTorch, which launched in 2015 and 2016, respectively. In the next decade, Ferris stressed the importance of not just making AI smarter and more accessible, but also more trustworthy. This will ensure that AI systems make decisions in a fair manner, aren't vulnerable to tampering, and can be explained, he said. Open source is the key to building this trust into AI. Projects like the Adversarial Robustness 360 Toolkit, AI Fairness 360 Open Source Toolkit, and AI Explainability 360 Open Source Toolkit were created to ensure that trust is built into these systems from the beginning, he said. Expect to see these projects and others from the Linux Foundation AI, such as the ONNX project, drive significant innovation related to trusted AI in the future. ONNX provides a vendor-neutral interchange format for deep learning and machine learning models.


What’s interesting is how the HIPAA Security Rule also governs the physical aspect of ePHI and healthcare information systems. Not many information security standards go as deep as HIPAA when it comes to maintaining the physical security of information. The physical facility used to store ePHI needs to have sufficient security measures. Only authorized personnel are allowed access to the hardware and terminals connected to the healthcare information systems. Unauthorized access is considered a serious violation of the HIPAA standard. Logging is also a part of the physical safeguard. Access to terminals and servers must be logged in detail to prevent unauthorized access and allow for an easy audit of the secure facility. Logging on a physical level helps the entire system remain safe. There is also the need for secure devices and terminals, including secure tablets that are now used by medical personnel. It is up to the healthcare service providers to maintain a secure network across their facilities. To complete the equation, policies for hardware disposal and the termination of a healthcare information system must also be put in place.



Quote for the day:


"Leaders think and talk about the solutions. Followers think and talk about the problems." -- Brian Tracy


Daily Tech Digest - December 30, 2019

Doing the right thing: The rise of ethics in tech

"Culture means a lot of things," Schlesinger continued. "Culture in the broadest terms—in terms of tools, processes, norms, narratives—is bringing all of the things that ladder up to creating the kind of organization that is needed to then build the kind of products, features, and tools that society can benefit from." Today, we're at the point with ethics in technology that we were with automobiles in 1966, he noted, after Ralph Nader's 1965 book, "Unsafe at Any Speed," exposed and heightened awareness around the dangerous engineering practices involved in building cars at the time, resulting in new safety initiatives. "We've all awakened, and it's kind of unique that we're even having this conversation at a mainstream tech conference," Schlesinger said. "We're at the beginning stages of this evolution toward that kind of informed, just, rewarding culture in tech that holds itself accountable for the kinds of things we want to build. And ultimately, that is about showing our moral math." However, Paula Goldman, chief ethical and humane use officer at Salesforce, argued that we're not in 1966 but the early 1900s, with its waves of innovation and new norms.



Financial Services Could Never Do This Before

The cloud offers a tremendous new opportunity to scale your infrastructure on demand and offload some of the expense of data management, especially as it relates to new workloads or testbed environments. Yet the reality is that, for most financial services institutions, much of the data resides on premises in data centers and will continue to for a long time – dictated by regional jurisdictions, data security concerns or just a historical preference to control the data. Financial services organizations need a new approach. Flexibility to manage data across environments is critical. Today, organizations need an enterprise data cloud that offers the ability to ingest, process, store, analyze, and model any type of data (structured, unstructured, or semi-structured), regardless of where it lands: at the edge, on premises, in the data center, or in any public, private, or hybrid cloud.


GDPR: Moving Beyond Compliance


While more organisations move to develop a senior leadership approach to data privacy, in the year and a half since GDPR, a growing number of businesses are trying to put data privacy on the radar of their entire employee base. In these organisations, it is becoming everyone’s mission to have an understanding of provenance and the use of information, with everyone taking accountability for how the organisation collects, uses, and shares personal information. The idea of accountability is that “we say what we do and we do what we say” and, importantly, “we stand by doing what we do.” This culture of accountability is something that is also being extended to how organisations talk to their customers about data privacy. Increasingly, businesses are being open and inclusive, telling customers about what they are doing with personal information and how they are protecting it. In doing so, they recognise the need to close the gap in terms of the expectations, responsibilities, and actions relevant to privacy protections and information ethics. With big data breaches, such as recent ones that exposed the data of almost 400 million people, it is no wonder that the general public is becoming wary about parting with their personal information.


Cisco 2020: Challenges, prospects shape the new year


Cisco is attacking the cloud provider market by addressing its hunger for higher bandwidth and lower latency. At the same time, the vendor will offer its new technology to communication service providers. Their desire for speed and higher performance will grow over the next couple of years as they rearchitect their data centers to deliver 5G wireless services to businesses. For the 5G market, Cisco could combine Silicon One with low-latency network interface cards from Exablaze, which Cisco plans to acquire by the end of April 2020. The combination could produce exceptionally fast switches and routers to compete with other telco suppliers, including Ericsson, Juniper Networks, Nokia and Huawei. Startups are also targeting the market with innovative routing architectures. "Such a move could give Cisco an edge," said Tom Nolle, president of networking consultancy CIMI Corp., in a recent blog.


Don’t Let Impostor Syndrome Derail Your Next Interview


Even when you’re well prepared for an interview and know that you’re perfectly qualified for the job, it can still be a nerve-racking experience to walk into a room full of strangers and prepare to be judged. To manage your jitters, start by controlling the controllable elements of your interview experience. If you’re worried about arriving punctually, for example, try taking multiple routes to your destination before the day of the interview to see which one gets you there fastest, with the least amount of traffic. Managing nervousness around the interview itself is another area where you can be proactive. In Cliff’s case, he decided to build in extra time before the interview for a 10-minute walk around the block. During this scheduled pre-meeting stroll, Cliff planned to focus on deep breathing to help ratchet down his stress response. I recommended that while walking, he take a minute or two to inhale for a count of four seconds, hold his breath for two seconds, and then exhale for a count of four seconds. He found this process deeply calming, and it allowed him to enter the interview setting feeling more confident and settled.


5 Lessons George Lucas Taught Us About Innovation

Image: Pixabay
Experimentation is an important part of any innovative team. In 1979, Lucas created The Graphics Group as part of Lucasfilm's computer division and hired Edwin Catmull to lead it. The goal of this group was to invent new digital production tools for use in live-action films. They were successful in this goal and even created software used in medical and satellite imagery. However, Catmull's team really longed to create full-length computer-generated imagery (CGI) animated films. As the group struggled to build a profitable business, neither side was achieving its goals. Lucas put it this way: "I didn't want to run a company that sold software, and John [Lasseter] and Ed wanted to make animated films." Eventually it became clear that… Someone had to be the first to push the boundaries of blending CGI and live action beyond short special-effects shots. No longer was storytelling constrained by the limitations of a human actor. This risk gave other filmmakers a platform to build from, slowly crafting new characters to the point where we are today, where CGI characters are nearly indistinguishable from human ones.


The Evolution Of Data Protection

DLP is only as good as the classification rigor enforced by the organization, and classification is always too rigid to keep up with fluid data movement. For DLP to prevent data egress, data must be classified correctly, but classification is complicated and fragile. What is sensitive today is not sensitive tomorrow, and vice versa. Classification turns into an endless battle of users trying to manage the classification of data; ultimately, classification and DLP deteriorate over time. DLP adds an extremely high operational overhead, as it requires users to be classification superstars, and even then, mistakes will happen. Desjardins Group, a Canadian bank, recently made news for a malicious insider who obtained information on 2.7 million customers and over 170,000 businesses. The exact details of the breach haven't been made public yet, but DLP solutions are standard in all financial institutions. PGP's encryption is a privacy tool: users can encrypt their data so others can't access it, but PGP fails once users try to share data with other users.


California’s privacy law means it’s time to add security to IoT

California law requires IoT to have security.
If you think about the evolution of the marketplace, we’re at a state now where the technology has gotten us to a certain point. We have connectivity. Wi-Fi has gotten to a point where it’s ubiquitous. Access to the internet is pretty pervasive around the world. That’s spun this billions-of-units vision, saying that everything is going to be connected. That’s interesting, and from a technology perspective, we’re seeing that in our houses. We see the ubiquity of these connected devices in our homes. But what quickly happens is you get what I call a normative period where societal issues come to the fore, the biggest one being privacy. You go into this normative period now where everyone says we need privacy, and then you have to have some sort of governance over the devices to create an environment where you can deliver that capability in a cost-effective way. What I’m saying specifically, as it relates to privacy today, is that there are no standards. There is no threshold. Therefore, these devices can be anywhere from having zero security capability to everything in between, across the spectrum.


IoT vendor Wyze confirms server leak

Wyze
Song confirmed that the leaky server exposed details such as the email addresses customers used to create Wyze accounts, nicknames users assigned to their Wyze security cameras, WiFi network SSID identifiers, and, for 24,000 users, Alexa tokens to connect Wyze devices to Alexa devices. First, the Wyze exec denied that Wyze API tokens were exposed via the server; in its blog post, Twelve Security claimed to have found API tokens that would have allowed hackers to access Wyze accounts from any iOS or Android device. Second, Song denied Twelve Security's claims that Wyze was sending user data back to an Alibaba Cloud server in China. Third, Song clarified Twelve Security's claims that Wyze was collecting health information: the Wyze exec said they only collected health data from 140 users who were beta-testing a new smart scale product. Song didn't deny Wyze collected height, weight, and gender information. He did, however, deny other claims. "We have never collected bone density and daily protein intake," the Wyze exec said. "We wish our scale was that cool."


How one bizarre attack laid the foundations for the malware taking over the world


The first instance of what we now know as ransomware was called the AIDS Trojan because of who it was targeting – delegates who'd attended the World Health Organization AIDS conference in Stockholm in 1989. Attendees were sent floppy discs containing malicious code that installed itself onto MS-DOS systems and counted the number of times the machine was booted. On the 90th boot, the trojan hid all the directories and encrypted the names of all the files on the drive, making it unusable. Victims instead saw a note claiming to be from 'PC Cyborg Corporation' which said their software lease had expired and that they needed to send $189 by post to an address in Panama in order to regain access to their system. It was a demand for payment in return for restoring the victim's access to their computer: that made this the first ransomware. Fortunately, the encryption used by the trojan was weak, so security researchers were able to release a free decryption tool – and so started a battle that continues to this day, with cyber criminals developing ransomware and researchers attempting to reverse engineer it.
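The weakness that saved victims can be shown with a toy cipher. This Python sketch stands in for the trojan's filename scrambling (the real PC Cyborg scheme was a simple symmetric substitution; the shift-by-3 key here is purely illustrative): because the key is fixed, anyone who recovers it can build the inverse table and decrypt everything, which is essentially what researchers did.

```python
import string

ALPHABET = string.ascii_uppercase
KEY = ALPHABET[3:] + ALPHABET[:3]   # fixed substitution key (shift by 3)

ENC = str.maketrans(ALPHABET, KEY)  # what the "trojan" applies to names
DEC = str.maketrans(KEY, ALPHABET)  # the inverse any analyst can derive

def encrypt_name(name: str) -> str:
    """Scramble an uppercase filename with the fixed key."""
    return name.translate(ENC)

scrambled = encrypt_name("AUTOEXEC")
print(scrambled)                 # DXWRHAHF
print(scrambled.translate(DEC))  # AUTOEXEC
```

With a fixed symmetric key, decryption is just the same table read backwards, which is why a free decryption tool was possible at all.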



Quote for the day:


"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer


Daily Tech Digest - December 29, 2019

Are we running out of time to fix aviation cybersecurity?

Flying remains one of the safest ways to travel, and that's due in large part to continuous efforts to improve air safety. Cultural norms in aviation have rewarded and incentivized a whistleblowing culture, where the lowliest mechanic can throw a red flag and stop a jet from taking off if he notices a potential safety issue. Contrast that with the often-fraught issue of reporting security vulnerabilities, where shame, finger-pointing and buck-passing are the norm. The report highlights the problem, writing, "Across much of the cybersecurity landscape, there arguably remains a stigma about discussing cybersecurity vulnerabilities and challenges that go beyond managing sensitive vulnerabilities." A wormable exploit or a backdoored software update – like the backdoored MeDoc software update that started the Petya worm – could cause safety issues at scale. It's unclear that the aviation industry's traditional safety thinking is sufficient to meet this challenge. For instance, the report calls out the need for greater information sharing on aviation cybersecurity threats, acknowledging the risk of a Maersk-like scenario and observing rather drily that "other sectors have seen the scale and costs from a single vulnerability and 'wormable' exploit."



AI vs. Machine Learning: Which is Better?


Artificial intelligence comes from the words "artificial" and "intelligence." Artificial means created by something non-natural, such as a human, and intelligence means the ability to think and understand things. Some people think artificial intelligence is a system, but in fact it operates within a system. AI follows stipulated rules pre-determined by an algorithm set by a person, and it appears most often on smartphones, desktop computers, and smartwatches. ... Machine learning is capable of learning by itself. It is a computer system that can acquire knowledge and solve a problem based on its experience. ML acts on data provided by humans and predicts accurate solutions based on the information gathered by the machine. Machine learning uses a different algorithm from artificial intelligence: a machine learning algorithm is capable of deciding on its own, whereas artificial intelligence answers a pre-determined question with a pre-determined solution.
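The distinction the excerpt draws, pre-determined rules versus rules derived from data, fits in a few lines. A hedged Python sketch (the temperature task, labels, and numbers are invented purely for illustration):

```python
def rule_based(temp_c: float) -> str:
    """'AI' in the excerpt's narrow sense: a rule fixed in advance by a person."""
    return "hot" if temp_c >= 25 else "cold"

def learn_threshold(examples):
    """'ML' in the excerpt's sense: derive the decision rule from data,
    here as the midpoint between the average 'hot' and 'cold' values."""
    hot = [t for t, label in examples if label == "hot"]
    cold = [t for t, label in examples if label == "cold"]
    return (sum(hot) / len(hot) + sum(cold) / len(cold)) / 2

training = [(30, "hot"), (35, "hot"), (10, "cold"), (15, "cold")]
threshold = learn_threshold(training)         # (32.5 + 12.5) / 2 = 22.5
print(rule_based(28))                         # hot
print("hot" if 28 >= threshold else "cold")   # hot
```

The two agree here, but only the learned threshold would shift if the training data changed; the fixed rule answers its pre-determined question the same way forever.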


How AI, Analytics & Blockchain Empower Efficient & Intelligent Supply Chain?


Making good on its promise to disrupt every industry for the better, AI is transforming supply chain management as well. The technology has a number of applications in the supply chain, including extraction of information, analysis of data, planning for supply and demand, and better management of autonomous vehicles and warehouses. AI-enabled NLP scans supply chain documents such as contracts, purchase orders, and chat logs with customers or suppliers to identify commonalities, which are used as feedback to optimize SCM as part of continual improvement. ML helps people manage the flow of goods throughout the supply chain while ensuring that raw materials and products are in the right place at the right time. The technology can also source and process data from different areas and forecast future demand based on external factors. And most importantly, AI helps analyze warehouse processes and optimize the sending, receiving, storing, picking and management of individual products.


Netgear Nighthawk M2 Mobile Router, hands on

Very much designed as a 'travel router', the square lozenge of the M2 measures 105mm by 105mm by 20.5mm and weighs 240g. It's easy to slip into a briefcase or bag when you're travelling, and won't weigh you down. Like its M1 predecessor, the M2 relies on 4GX LTE mobile broadband, as Netgear argues that 5G networks aren't sufficiently widespread to justify the extra cost of adding 5G support. However, Category 20 4GX LTE support means that the M2 doubles its maximum download speed from 1Gbps to 2Gbps, although the upload speed remains the same at 150Mbps. It then uses dual-band 802.11ac to create its own wi-fi network, which can support connections from up to 20 separate devices. The M2 also gains a larger 2.4-inch touch-sensitive display that allows you to quickly configure the router, and to monitor signal strength, data usage and other settings. The Netgear Mobile app provides similar controls for Android and iOS devices, and there's a browser interface available for computers as well.


What is Jenkins? The CI server explained

Today Jenkins is the leading open-source automation server with some 1,400 plugins to support the automation of all kinds of development tasks. The problem Kawaguchi was originally trying to solve, continuous integration and continuous delivery of Java code (i.e. building projects, running tests, doing static code analysis, and deploying) is only one of many processes that people automate with Jenkins. Those 1,400 plugins span five areas: platforms, UI, administration, source code management, and, most frequently, build management. Jenkins is available as a Java 8 WAR archive and installer packages for the major operating systems, as a Homebrew package, as a Docker image, and as source code. The source code is mostly Java, with a few Groovy, Ruby, and Antlr files. You can run the Jenkins WAR standalone or as a servlet in a Java application server such as Tomcat. In either case, it produces a web user interface and accepts calls to its REST API. When you run Jenkins for the first time, it creates an administrative user with a long random password, which you can paste into its initial webpage to unlock the installation.
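Because Jenkins accepts calls to its REST API, even small scripts can inspect a server. The sketch below parses the kind of JSON Jenkins serves at `/api/json`, where each job carries a `color` field encoding build health ("blue" passing, "red" failing); the payload here is a hand-written sample shaped like that response, not output from a live server:

```python
import json

def list_jobs(api_json: str):
    """Extract (name, color) pairs from a Jenkins /api/json payload."""
    data = json.loads(api_json)
    return [(job["name"], job["color"]) for job in data.get("jobs", [])]

# Sample payload shaped like GET http://<jenkins-host>/api/json
sample = ('{"jobs": [{"name": "build-app", "color": "blue"},'
          ' {"name": "nightly-tests", "color": "red"}]}')
print(list_jobs(sample))  # [('build-app', 'blue'), ('nightly-tests', 'red')]
```

Against a real installation you would fetch the URL with an authenticated HTTP request; the parsing stays the same.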


Process Mining vs. Business Process Discovery

Harvard Business Review – a publication that's unfortunately becoming increasingly political by the day – published an article about process mining earlier this year, written by two individuals who have been involved with the field for four decades. According to the experts, process mining solves a few fundamental challenges associated with business process management: companies tend to spend too little or too much time analyzing "as is" business processes, and there is a lack of connection between business processes and an organization's enterprise information systems. Starting with the first point, we'd argue that for most companies, if you can't figure out the optimal time to spend analyzing an existing business process, hire better BPAs. The second point describes the inability to capture the "interoperability" of processes and information systems. Fair enough. Organizations are incredibly complex entities, and one department might interact with hundreds of internal systems. Enter process mining and a German company called Celonis.


Azure Cosmos DB — A to Z


Behind the scenes, Cosmos DB uses a data-distribution algorithm to increase the RUs/performance of the database: every container is divided into logical partitions based on the partition key. A hash of the partition key is used to divide and distribute the data across partitions. Further, these logical partitions are mapped to multiple physical partitions (hosted on multiple servers). Placement of logical partitions onto physical partitions is handled by Cosmos DB to efficiently satisfy the scalability and performance needs of the container. As RU needs increase, Cosmos DB increases the number of physical partitions (more servers). As a best practice, you should choose a partition key that has a wide range of values and access patterns that are evenly spread across logical partitions. For example, if you are collecting data from multiple schools but 75% of your data comes from one school only, then it’s not a good idea to use the school as the partition key.
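The hash-based mapping from partition-key values to partitions described above can be illustrated with a toy sketch. This is a simplification for intuition only, not Cosmos DB's actual hashing scheme; the school names, item shape, and partition count are invented for the example:

```python
import hashlib

# Assumed for illustration; Cosmos DB manages physical partitions itself.
NUM_PHYSICAL_PARTITIONS = 4

def physical_partition(partition_key: str) -> int:
    """Map a partition-key value to a physical partition index via a hash."""
    digest = hashlib.md5(partition_key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_PHYSICAL_PARTITIONS

# Hypothetical items; "school" plays the role of the partition key.
items = [
    {"id": "1", "school": "Northside High", "score": 88},
    {"id": "2", "school": "Lakeview Elementary", "score": 73},
    {"id": "3", "school": "Northside High", "score": 91},
]

placement = {item["id"]: physical_partition(item["school"]) for item in items}

# Items sharing a partition-key value always land on the same partition,
# which is why a skewed key (75% of data in one school) creates a hot partition.
assert placement["1"] == placement["3"]
```

Because the hash is deterministic, all items with the same key value co-locate in one logical partition; even spread therefore depends entirely on choosing a key whose values are both numerous and evenly accessed.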


Digital process automation vs. robotic process automation

Digital process automation can also be easily confused with another similar term: robotic process automation. Robotic process automation (RPA) uses more intelligent automation technology -- such as artificial intelligence (AI) and machine learning (ML) -- to handle high-volume, repeatable tasks. RPA can be used to automate queries and calculations as well as maintain records and transactions. This is typically done using bots such as probots, knowbots or chatbots. What distinguishes RPA from other forms of IT automation is the ability of the RPA software to be aware of and adapt to changing circumstances, exceptions and new situations. Whereas DPA comes from BPM, and BPA comes from infrastructure management, RPA is not considered part of the infrastructure. Instead, RPA sits on top of an organization's infrastructure, which allows an organization to implement digital process technology quickly.


Data governance & retention in your Microsoft 365 tenant

Data governance has traditionally relied on transferring data to a third party that hosts an archive service. Emails, documents, chat logs, and third-party data (Bloomberg, Facebook, LinkedIn, etc.) must be saved in a way that ensures it can’t be changed and won’t be lost. Data governance is part of IT at the enterprise level. It serves regulatory compliance, can facilitate eDiscovery, and is part of a business strategy to protect the integrity of the data estate. However, there are downsides. In addition to acquisition costs, the archive is one more system that needs ongoing maintenance. When data is moved to another system, the risk footprint increases, and data can be compromised in transit. An at-rest archive can become another target of attack. And when the data sits in an archive, you miss the opportunity to reason over it with machine learning to extract additional business value and insights that could improve the governance program. The game changer is to have reliable, auditable retention inside the Microsoft 365 tenant.


Top 6 Software Testing Trends to Look Out in 2020

Despite the promising prospects of AI/ML application in software testing, experts still regard AI/ML in testing as being in its infancy. Numerous challenges therefore remain before the application of AI/ML in testing reaches maturity. The rising demand for AI in testing and QA teams signals that it’s time for Agile teams to acquire AI-related skill sets, including data science, statistics, and mathematics. These skill sets will complement the core domain skills of test automation and software development engineering in test (SDET). Additionally, successful testers need to adopt a combination of pure AI skills and non-traditional skills. Indeed, over the last year, a variety of new roles were introduced, such as AI QA analyst and test data scientist. As for automation tool developers, they should focus on building tools that are practical. Companies are running proofs of concept (PoCs) and reassessing options and budgets to make the best use of AI.



Quote for the day:


"Everyone wants to be appreciated. So if you appreciate someone, don't keep it a secret." -- Mary Kay Ash