Daily Tech Digest - February 19, 2019

The role of open source in networking

The biggest gap in open source is probably management and support. Vendors keep making additions to the code. For example, zero-touch provisioning is not part of the open source stack, but many SD-WAN vendors have added that capability to their products. Low-code/no-code development can also become a problem. Now that we have APIs, users are mixing and matching stacks together rather than doing raw coding. We have GUIs with various modules that communicate over a REST API. Essentially, you are taking open source modules and aggregating them together. The problem with pure network function virtualization (NFV) is that a bunch of different software stacks are running on a common virtual hardware platform. The configuration, support, and logging from each stack still require quite a bit of integration and support. Some SD-WAN vendors are taking a “single pane of glass” approach, where all the network and security functions are administered from a common management view.



The Internet Has A New Problem: Repeating Random Numbers!

To understand the problem, it’s useful to take a 30-second tutorial on digital certificates. Those of you who managed to stay awake during math class will remember that asymmetric cryptography uses two prime numbers to create the public and private key of a digital certificate. The public key maps an input (that you want to keep secret) to a large number field, while the private key reverses the transformation. The theory goes that since there is an infinite set of prime numbers, there is an infinite set of public/private key combinations. To make sure the prime numbers are different, a random number generator (RNG) is used. Sounds pretty secure. Infinite is a big number. What could go wrong? Well, the real world is a bit different from math class. It turns out the RNGs on computing devices don’t really draw from an infinite set of primes but from a bounded set, which in turn generates a bounded set of public/private key combinations.
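The danger of a bounded RNG is concrete: if two devices ever build their moduli from one shared prime, an attacker who merely collects the public keys can recover the private keys with Euclid's algorithm. A minimal Python sketch, using toy primes far too small for real use:

```python
import math

# Toy RSA moduli built from primes just above one million; real keys use
# primes hundreds of digits long. Suppose two devices' RNGs repeat p.
p, q1, q2 = 1000003, 1000033, 1000037

n1 = p * q1   # device A's public modulus
n2 = p * q2   # device B's public modulus

# An attacker holding only the two public moduli runs Euclid's algorithm:
shared = math.gcd(n1, n2)
print(shared == p)         # True: the repeated prime falls out
print(n1 // shared == q1)  # True: the cofactor, and thus the key, follows
```

With a healthy RNG the two moduli share no factors and the gcd is 1; researchers have run exactly this pairwise-gcd check across millions of harvested public keys to find vulnerable devices.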


Understanding the Darknet and Its Impact on Cybersecurity


Uses of the darknet are nearly as wide and as diverse as those of the internet: everything from email and social media to hosting and sharing files, news websites and e-commerce. Accessing it requires specific software, configurations or authorization, often using nonstandard communication protocols and ports. Currently, two of the most popular ways to access the darknet are via two overlay networks. The first is the aforementioned Tor; the second is called I2P. Tor, short for “The Onion Router,” is designed primarily to keep users anonymous. Just like the layers of an onion, data is wrapped within multiple layers of encryption. Each layer reveals the next relay, until the final layer sends the data to its destination. Information is sent bidirectionally, so data travels back and forth via the same tunnel. On any given day, over one million users are active on the Tor network. I2P, which stands for the Invisible Internet Project, is designed for user-to-user file sharing. It takes data and encapsulates it within multiple layers.
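The layering idea can be illustrated with a toy Python sketch. Here base64 stands in for real per-hop encryption and the relay names are invented; the point is only that each relay peels exactly one layer, and the payload appears only after the final peel.

```python
import base64

# Invented three-hop circuit; real Tor negotiates keys with each relay.
relays = ["guard", "middle", "exit"]

def wrap(message: bytes, layers: int) -> bytes:
    """Wrap the payload in one layer per relay on the circuit."""
    for _ in range(layers):
        message = base64.b64encode(message)
    return message

def peel(message: bytes) -> bytes:
    """Each relay removes exactly one layer, learning only the next hop."""
    return base64.b64decode(message)

onion = wrap(b"hello", len(relays))
for relay in relays:
    onion = peel(onion)
print(onion)  # the payload emerges only after the last relay's peel
```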


Tap compare is not a direct substitute for any other testing technique -- you will still need to write other kinds of tests, such as unit tests, component tests or contract tests. However, it can help you detect regressions so that you can feel more confident about the quality of the new version of the developed service. One important thing about tap compare is that it provides a new layer of quality around your service. With unit tests, integration tests, and contract tests, the tests verify functionality based on your understanding of the system, so the inputs and outputs are provided by you during test development. Tap compare is something totally different. Here, the validation of the service occurs with production requests, either by capturing a group of them from the production environment and replaying them against the new service, or by using the traffic mirroring technique, where you shift production traffic so it is sent to both the old version and the new version, and you compare the results.
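The replay-and-diff flow can be sketched in a few lines of Python. The two service functions and the request shape below are invented stand-ins; in a real setup each call would be an HTTP request to a deployed instance of the old and new versions.

```python
# Hypothetical stand-ins for the two deployed service versions.
def old_service(request):
    return {"user": request["id"], "discount": request["amount"] // 10}

def new_service(request):
    # A refactored version that should behave identically.
    return {"user": request["id"], "discount": request["amount"] // 10}

def tap_compare(captured_requests):
    """Replay captured production requests against both versions and
    report any divergence between the responses."""
    mismatches = []
    for req in captured_requests:
        old, new = old_service(req), new_service(req)
        if old != new:
            mismatches.append((req, old, new))
    return mismatches

captured = [{"id": 1, "amount": 120}, {"id": 2, "amount": 55}]
print(tap_compare(captured))  # an empty list means no regressions on this traffic
```

The value over hand-written tests is that the inputs are real production requests, not cases you thought of during development.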


Why are IoT platforms so darn confusing?

An Internet of Things (IoT) platform is the support software that connects edge hardware, access points, and data networks to other parts of the value chain (which are generally the end-user applications). IoT platforms typically handle ongoing management tasks and data visualization, which allow users to automate their environment. You can think of these platforms as the middleman between the data collected at the edge and the user-facing SaaS or mobile application. That last line is key because, to me, an IoT platform is little more than a fancy name for the middleware that connects everything together. i-Scoop focuses on that aspect: “An IoT platform is a form of middleware that sits between the layers of IoT devices and IoT gateways (and thus data) on one hand and applications, which it enables to build, on the other.” Perhaps, though, IoT platform vendor KAA offers the most honest description. While acknowledging the middleware aspect, the vendor also allows that “an IoT platform can be wearing different hats depending on how you look at it.”


Kaspersky Lab Launches New Threat Intelligence Service

Sergey Martsynkyan, Head of B2B Product Marketing at Kaspersky Lab, provided some insights into the new threat intelligence service. “Being aware of the most relevant zero-days, emerging threats and advanced attack vectors is key to an effective cybersecurity strategy. However, manually collecting, analyzing and sharing threat data doesn’t provide the level of responsiveness required by an enterprise. There’s a need for a centralized point for accessible data sources and task automation.” According to Kaspersky Lab, one-third of enterprise CISOs feel overwhelmed by threat intelligence sources. Moreover, they also tend to struggle to connect their threat intelligence with their SIEM solution. Kaspersky CyberTrace reflects the blurring lines between the different disciplines of cybersecurity. The new service also highlights the growing importance of threat detection and remediation in the modern cybersecurity paradigm: a prevention-only model often leaves cyber-attacks to dwell on enterprise networks and wreak havoc in the digital background.


RPA: the key players, and what’s unique about them

Of all the specialist players in RPA, Blue Prism is the only company listed on the stock market. We spoke to Pat Geary, the company’s Chief Evangelist. Pat has an interesting claim to fame in this space, for it was he who first came up with the phrase RPA. Blue Prism puts quite a different emphasis on RPA; indeed it goes further and argues that a lot of the ‘claimed’ players in the RPA space are not actually RPA companies at all — rather they sell what he calls RDA — robotic desktop automation. When we spoke to Mr Geary, he put emphasis on the word guardian: “Operational security guardians, resilience and backup guardians, audit guardians and governance guardians.” He says that what he calls RDA bypasses these guardians — “sneaking stuff in without passing the guardians.” He describes Blue Prism as providing a fortress RPA: “it’s absolutely bullet-proof,” he says. He likens the Blue Prism solution to a padded room — an area that is safe and allows for experimentation; the “business can do whatever they like in there, but they’re not going to break anything.”


Three Pillars with Zero Answers: Rethinking Observability with Ben Sigelman

The big challenge associated with metrics is dealing with high cardinality. Graphing metrics often provides visibility that allows humans to understand that something is going wrong, and the associated metric can then be explored further by diving deeper into the data via an associated metadata tag, e.g. user ID, transaction ID, geolocation, etc. However, many of these tags have high cardinality, which presents challenges for querying data efficiently. The primary challenge with logging is the volume of data collected. Within a system based on a microservices architecture, the amount of logging is typically the product of the number of services and the transaction rate. The total cost of maintaining the ability to query this data can be calculated by further multiplying the total number of transactions by the cost of networking and storage, and again by the number of weeks of retention required. For a large-scale system, this cost can be prohibitive.
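That cost multiplication is easy to make concrete. A back-of-the-envelope sketch in Python, with every figure invented purely for illustration:

```python
# Cost model from the text: volume ~ services x transaction rate;
# cost ~ volume x networking/storage unit cost x retention.
services = 50                        # microservices in the system
tx_per_week = 10_000_000             # transactions per week
log_lines = services * tx_per_week   # each transaction touches every service

cost_per_line = 0.000001             # networking + storage, $ per log line
retention_weeks = 4

total_cost = log_lines * cost_per_line * retention_weeks
print(f"${total_cost:,.0f} to retain {retention_weeks} weeks of logs")
```

Even at a fraction of a cent per line, the services-times-transactions multiplier pushes the bill into thousands of dollars per retention window, which is the prohibitive scaling the article describes.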


Digital transformation in healthcare remains complex and challenging

Whilst progress has been made in digital healthcare, it hasn’t necessarily been transformational and in many cases is a simple conversion of analogue to electronic. Certainly the areas of eReferral, ePrescribing and eHealth Records haven’t undergone revolutionary change; they’re simply the transference of what were analogue forms and processes into electronic versions of the same. In healthcare, many processes remain ripe for digital disruption. We’re heading into the post-digital era, where healthcare organisations will need to adopt new and emerging technology. These new technologies will drive change in an environment where the sector already has a multitude of existing digital tools. The new technology already appearing in healthcare includes artificial intelligence, distributed ledger technology, extended reality and quantum computing. Most industries that have undergone digital transformation have done so by adopting a data-driven approach. In healthcare, we’re entering an era where data will be generated at scale.


What You Need to Know about Modern Identity Security

At its core, a modern identity and access management (IAM) platform must handle provisioning, deprovisioning, and modifying user access from a central network location. Provisioning refers to giving initial permissions to employees when they first enter your workforce. Deprovisioning, in turn, refers to removing all of the permissions from an employee’s account when they leave your employ. An IAM solution should also help you evaluate and adjust, through role management, the permissions your employees have as they change roles and positions during their time with your enterprise. You should consider all three of these capabilities absolutely necessary for your enterprise. Limiting the permissions individual users possess often proves the best way to prevent a security threat from taking hold; it limits the damage a stolen password can do and reduces the likelihood of an insider threat. A modern identity security solution should also allow your IT security team to mandate a certain level of password complexity.
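Those three lifecycle operations can be sketched minimally in Python. The role names and permissions below are invented, and a real IAM platform would back the central store with a directory service rather than a dict:

```python
# Invented roles mapping to permission sets for illustration.
ROLE_PERMISSIONS = {
    "engineer": {"repo:read", "repo:write"},
    "manager":  {"repo:read", "reports:read"},
}

directory = {}  # central store: user -> set of permissions

def provision(user, role):
    """Grant initial permissions when an employee joins."""
    directory[user] = set(ROLE_PERMISSIONS[role])

def change_role(user, new_role):
    """Role management: replace permissions on a role change rather than
    accumulating grants from every past role."""
    directory[user] = set(ROLE_PERMISSIONS[new_role])

def deprovision(user):
    """Remove every permission when an employee leaves."""
    directory.pop(user, None)

provision("alice", "engineer")
change_role("alice", "manager")
print(directory["alice"])    # only the manager permissions remain
deprovision("alice")
print("alice" in directory)  # False: no lingering access
```

The design point the text makes is visible in `change_role`: replacing rather than appending permissions is what keeps least privilege intact as people move around.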



Quote for the day:


"Trust is the lubrication that makes it possible for organizations to work." -- Warren G. Bennis


Daily Tech Digest - February 18, 2019

When It’s Okay to Fail

To bring this concept to life, let’s use a real-world case. A good example is Netflix. In 2011, the company began experimenting with the Simian Army, a “safe to fail” suite of tools that runs within Amazon Web Services and has become a key tool in DevOps installations. Building the Simian Army was critical to Netflix’s ability to operate in a cloud-based environment fraught with potential interruptions while continuing to deliver reliable products and services. Some of the components of the Simian Army include Chaos Monkey, which tests for random failures; Latency Monkey, which introduces artificial delays and tests the system’s return to normal runtime; and Conformity Monkey, which finds non-conforming instances and shuts them down even if they appear to perform well initially. Finally, Chaos Gorilla, at the top of the Simian Army hierarchy, simulates an outage of an entire Amazon availability zone to see how the Netflix system will handle it.
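The Chaos Monkey idea -- randomly terminate an instance and check the service survives -- can be sketched in a few lines of Python. Everything here (the instance names, the health rule, the class shape) is invented for illustration and is not Netflix's actual implementation:

```python
import random

class Cluster:
    """Toy model of a pool of redundant service instances."""
    def __init__(self, instances):
        self.instances = set(instances)

    def terminate(self, instance):
        self.instances.discard(instance)

    def healthy(self, min_instances=2):
        # Invented health rule: enough instances remain to serve traffic.
        return len(self.instances) >= min_instances

def chaos_monkey(cluster, rng=random):
    """Kill one randomly chosen instance, as Chaos Monkey does in production."""
    victim = rng.choice(sorted(cluster.instances))
    cluster.terminate(victim)
    return victim

cluster = Cluster({"i-1", "i-2", "i-3"})
killed = chaos_monkey(cluster, random.Random(0))
print(killed, cluster.healthy())  # a resilient design survives the random kill
```

Running this continuously against real (non-toy) infrastructure is what forces teams to build the redundancy and automated recovery the article credits for Netflix's reliability.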


Is Blockchain Technology Overhyped?

What makes the design of blockchain so clever, De Filippi and Wright note, is its solution to an obvious concern with the description above: Why trust the network? After all, the network is open to anyone and can be joined pseudonymously. What is stopping someone from creating hundreds of Bitcoin accounts and voting to impose a bogus “consensus” on the state of the database? The technology solves this problem by making the task of adding a block to the chain into a competition: Any Bitcoin user who is interested can try to solve, by brute trial and error (and with the help of a high-powered computer), a mathematical puzzle generated by the Bitcoin software. If you find the solution, you broadcast it to the rest of the network; if a majority of the network agrees you’ve solved the puzzle (it’s simple to confirm), you receive a payment (in the form of newly minted or “mined” Bitcoin) and the block is added to the chain.
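The competition described above is proof of work: find a nonce whose hash meets a target that is expensive to hit by trial and error but trivial to verify. A toy Python sketch of the mechanism (real Bitcoin hashes a structured block header with double SHA-256 against a far harder target; the "leading hex zeros" rule here is a simplification):

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Brute-force a nonce until the hash starts with `difficulty`
    hex zeros -- the slow, trial-and-error side of the puzzle."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    """Anyone can confirm a claimed solution with a single hash --
    the 'simple to confirm' side of the puzzle."""
    digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = mine("block-42")                 # many hashes to find
print(verify("block-42", nonce))         # one hash to check
```

The asymmetry between `mine` and `verify` is exactly why a majority of the network can cheaply agree that a miner really did the work.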

When decision makers view the future as abstract, decisions involving saving or learning that could be beneficial in the longer term, for example, are put off. This short-term perspective becomes a real problem when people need to learn new skills and adapt their behaviors. HR leaders should take an elevated view of time and think in the longer term. This helps people consider both now and later, making the future less abstract and pulling potential opportunities into the present. Becoming a broker of time means viewing time as an asset and thinking about how HR can help its people reap the benefits of effective investment in it. This view will also help HR leaders support their CEO in building a narrative on the future of work. There is no doubt that corporate leaders are often under immense pressure, so the focus on the short term outweighs the long term. But employees want and need a long-term perspective about their work, and it is the role of leaders to build this narrative.



WCF for the Real World, Not Hello World

WCF provides inherent support for such separation. For some reason, the templates that come with VS hide and limit this power of WCF, though they are good enough to deliver your first WCF solution ASAP and get applause from your boss. However, if you want to deliver elegant solutions to complicated problems in an efficient way, you had better follow the SOLID OOD principles and separate concerns as much as possible. Otherwise, you will have to work harder, rather than smarter, when the project needs to evolve and solve more complicated problems. In a typical enterprise application, classes with different life cycles are best kept in different packages, in different Visual Studio solutions, or even in different revision control repositories. Planning for such an arrangement is essential for improving maintainability and flexibility in order to lower costs and improve productivity and quality. In addition, the build time could be reduced.


Everything you need to know about the Chief Information Officer explained

When something technical goes wrong at the weekend, it is likely to be the CIO who gets the call, even if someone else ultimately has the duty of fixing the problem. While the rest of the business talks about the importance of game-changing digital transformation, most CIOs recognise they will be judged first and foremost on their ability to do the basics right. A grand e-business strategy will fail if the CIO neglects to ensure that cybersecurity or business continuity is taken seriously, for example. As a C-level executive, the CIO is responsible for setting the IT strategy and ensuring that it works with the broader business strategy. In many digital businesses the IT strategy will be the main element driving the business strategy. This means the CIO needs to be able to understand the broader business requirements and decide which to prioritise through the use of technology. Another big role for the CIO is building and maintaining an effective and motivated team.


Overcoming RESTlessness

So how can REST's value evolve in this new paradigm? An increasing number of organizations are adopting an "API First" approach to software development; that is, emphasizing the importance of designing the machine interfaces in their applications and services to the same extent as UIs, and using those APIs to decouple the development efforts of teams responsible for different domains. OpenAPI often plays an important role in this methodology as the implementation-agnostic interface specification. In accordance with the post-Web paradigm, this benefits the various people involved in building or modifying the software system. There is already a project underway -- AsyncAPI from Fran Mendez -- that aims to bring this same value to event-based interactions. Along the same lines, Mike Amundsen and Leonard Richardson introduced the ALPS specification to capture the semantics of network-based application interactions. Efforts like these help to address the design-time challenges of building distributed systems.


CISOs under increasing pressure, study shows


“It’s no surprise that CISOs are facing burnout. Many lack support from within their organisations, and senior business leaders need to face the facts: the threats are real, and CISOs need to be given the resources and support to tackle them. If not, the board must face the consequences. “The risk is not only personal to a CISO, but to a business’s hard-won reputation. The growing economic cost is also a worrying trend. A recent report put the cost of global cyber crime at $600bn in 2017. With that cost likely to rise in the future, we must all work harder, and cooperatively, to mitigate potential losses by having the right strategy, tools and resource in place to prevent breaches in the first place.” Dimitrios Tsivrikos, a business psychologist and lecturer at University College London, said it is of “paramount importance” to address organisational stress. “Extra emphasis ought to be paid to CISOs,” he said.



Is Cybersecurity the Same as Data Privacy?

Cybersecurity is a set of strategies, techniques, and controls to reduce risk and ensure that your data assets are protected. If data privacy is about control, then cybersecurity has the means to add some, but not all, of the aspects of that control. Cybersecurity is at the heart of the discipline of data protection: protection of assets in all the forms they take. Like privacy, cybersecurity is a process. It requires an understanding of the threat landscape to create policies, processes and procedures, and then put the tenets of those efforts into practical application. Every aspect of our working and personal lives is touched by cybercrime, and the cybersecurity space is buoyant, reflecting this. The cybersecurity industry is expected to grow over 10% annually to be worth $248 billion by 2023. Cybersecurity covers a wide gamut of ways to protect our organizations and ourselves from cyber-attacks, whether they come from inside our company or from external threats. Cybersecurity is not just about the protection of data.


Continuous Integration Process: How to Improve Software Quality and Reduce Risk

In the Continuous Integration process, we talked about running a second build against the mainline code. This build happens on an integration machine. You might wonder why. As testers, we encounter situations where bugs are seen only in one particular environment and not in another. This is exactly why a mainline build is run on an integration machine. Sometimes the integration machine will spring surprises that didn’t exist on a developer’s local system. Human errors, such as not syncing your code with the mainline, will also show up here. Therefore, only once the build succeeds here can the commit be declared done. As testers, we’re very familiar with environment-related defects. Production systems have their own configurations in terms of database levels, operating system, OS patches, libraries, networking, storage, etc. It is good practice to have the test environment as close as possible to the production system, if not an exact replica. Any major discrepancies and risks can be easily identified this way before they actually hit production systems.


Why blockchain has caught the fancy of IIT-B

Indian Institute of Technology, Bombay (IIT-B), signed an agreement with US-based Ripple Labs Inc. to create a centre of excellence to support academic research, technical development and innovation in blockchain, cryptocurrency and digital payments. IIT-B thus became one of 17 universities across the world to benefit from the $50-million Ripple fund for its global University Blockchain Research Initiative (UBRI). “The idea is to create the next generation of students and entrepreneurs," says Navin Gupta, managing director, South Asia and MENA (Middle East and North Africa), Ripple. The partnership is expected to provide IIT-B’s faculty and students with opportunities for research and technology development in blockchain and cryptocurrency, which could add value to the global blockchain ecosystem, as well as industries such as fintech, professor Devang V. Khakhar, director, IIT-B, said after signing the agreement. While Ripple will provide a grant, strategic guidance and technical resources, IIT-B already has a centre of excellence on blockchain that aims to understand prevalent blockchain platforms



Quote for the day:


"Uncertainty is not an indication of poor leadership; it underscores the need for leadership." -- Andy Stanley


Daily Tech Digest - February 17, 2019

Inside the “hive-mind”: how AI-powered drone “swarms” can benefit society
The next step has primarily been enabled through the rise of fog computing. This model involves drones flown autonomously and transmitting only relevant data for analysis in real-time. Rather than waiting for the drone to land to download the data it captured, businesses are able to pre-program drones with specific flight plans, allowing them to fly autonomously. As they fly, their IoT-enabled sensors feed data to the fog node on board for processing and analysis, and only the exceptions or alerts are transmitted to the cloud. Thus, drones can now be used in applications where time is of the essence and real-time insights are critical. For example, in the event of a flood, first responders could send a drone into a flood zone to look for stranded survivors. Each drone would be pre-programmed to fly over its designated patch of the flooded area, reporting back sightings of stranded people or animals in real-time. The data can then be “stitched” together and analysed at a central location so emergency response teams can create the optimal evacuation plan.


How to Avoid Failing at Mobile Test Automation

Management and some developers (especially backend ones) think that by having E2E UI tests running against the real environment, all real-life situations can be covered. In addition, they think that those tests will make up for the absence of API tests, backend tests, and client integration tests, which is wrong. There are so many things that cannot be tested on mobile because of platform limitations. Simple examples would be deep linking from/to external apps and push notifications. People also tend to forget that there are too many layers between the backend and the application’s UI where it all can go wrong, and there are no frameworks I know of that can give detailed info on where exactly the problem was: third party, backend, network, the network implementation in the app, the UI, you name it. As a result, projects end up with unmaintainable tests and disappointment in test automation.


JP Morgan creates first US bank-backed crypto-currency

Not everyone is convinced that JP Morgan needed to create its own digital currency. A blockchain is designed to be decentralised, so no one party has control over transactions being sent over the network. This is the opposite of the JPM Coin concept. "It doesn't even need a blockchain at all because JP Morgan runs it. They could do it on a website and database they run," David Gerard, author of Attack of the 50 Foot Blockchain: Bitcoin, Blockchain, Ethereum & Smart Contracts, told the BBC. "It isn't like Bitcoin that aren't under anybody's control - it's a centrally controlled thing that sounds vaguely like crypto-currency." JP Morgan says that it is trialling crypto-currency and blockchain in order to speed up payment transfers, as well as reducing clients' counterparty and settlement risk, and decreasing capital requirements. However, Mr Gerard is sceptical and does not believe that the bank needs the technology to speed up transactions.


Multi-Team Backlog Refinement

Multi-team PBR is when all members of all teams refine PBIs together, without yet deciding which team will implement which item. Below are some benefits that multi-team refinement can give you: Adaptability at the product level. Why? Because all teams understand all PBIs on the Product Backlog, instead of each team understanding only their PBIs, a subset of the Product Backlog. If all teams understand all PBIs, then the PO can put whatever PBIs she deems most valuable at the top without being constrained to the PBIs a particular team understands; Improved self-coordination, because the teams maintain a broad understanding of the whole product and the upcoming PBIs and are therefore more likely to know of “dependencies” between PBIs; and A transparent measure of progress at the product level, because all teams participate in estimating all PBIs, so there is one common velocity at the product level, instead of a distinct velocity per team that needs to be combined into a total.


AI bias: It is the responsibility of humans to ensure fairness

Fortunately, many organisations already recognise that AI bias can occur, and are taking active remedial measures to avoid it. Microsoft’s FATE, for example, aims to address the need for transparency, accountability and fairness in AI and machine learning systems, while IBM’s AI Fairness 360 (AIF360) is an open source toolkit of metrics that can be used to check for unwanted bias in datasets and machine learning models, accompanied by algorithms to mitigate such bias. Elsewhere, Google has a bias-detecting ‘What If’ tool in the web dashboard for its TensorFlow machine learning framework. Both Microsoft and IBM mention fairness, and the distinction between fairness and discrimination is an important one. Discrimination is the action taken based on bias, whereas fairness is a lack of bias. Amazon’s AI-based recruitment system made biased predictions, which may have been unfair but, as per Amazon’s claim that the tool was not used to inform its decisions, there was no discrimination.


Can Artificial Intelligence take away your job? Probably not

When it comes to bias, an ML model will always operate the way you've trained it, said Olivier Klein, Head of Emerging Technologies, Asia-Pacific at Amazon Web Services (AWS), which is retail giant Amazon's Cloud arm. "If you train a model with a bias, you would end up with a biased model. You continuously need to train and re-train your ML model and the most important thing is that you need some form of feedback from the end-consumers," Klein told IANS. "ML is absolutely not about replacing humans but enhancing the experiences," he added. ... Klein said that humans are really good at learning quickly with very little information. "ML models are the opposite. They require a lot of data inputs to be able to be trained. "I would argue that you show someone a bicycle a few times and you show them how to ride a bicycle and the human being is able to ride that bicycle pretty easily. To just train a robot to ride a bicycle takes millions of hours of training," explained Klein.


The Wall Street Journal is helping SMBs shore up cybersecurity

The Wall Street Journal has created WSJ Pro Cybersecurity, a program specifically designed for small businesses. WSJ Pro Cybersecurity provides information about cybersecurity through a business lens. For $25 plus tax a month, membership includes a daily newsletter, panel discussions, interviews, webinars, and white papers. There is currently a two-week free trial for those who are interested in learning what kind of information is being disseminated. ... Besides the newsletter, WSJ will begin offering the WSJ Pro Cybersecurity Small Business Academy, a two-day conference on cybersecurity and how it relates to smaller companies. "For large companies, cyberattacks can be costly and can jeopardize customer relationships, but for smaller businesses cyberattacks can be a fatal blow," according to a WSJ press release. "Smaller businesses can also lack the resources of large companies and are often faced with a confusing array of vendors offering services."


Five emerging cybersecurity threats you should take very seriously in 2019

Firms must work to bridge the gap between communicating the technical aspects of cybersecurity and the business outcomes, such as customer satisfaction, financial health, and reputation, Olyaei said. Keeping track of new threats and not just established ones like ransomware is key for a strong security posture, said Josh Zelonis, senior analyst at Forrester.  "Whenever we develop our strategies for how we're going to protect our organizations, it's really easy to look at things that you're familiar with, or that you have a good understanding of," Zelonis said. "But if you're not looking ahead, you're building for the problems that already exist, and not setting yourself up for long-term success. And that is really the number one reason why you need to be looking ahead -- to understand how attack techniques are evolving." Here are five emerging cybersecurity threats that business, technology, and security leaders need to take seriously this year.


Why hybrid cloud architecture means totally rethinking security


But no matter why or how, from an enterprise or business perspective, it’s significantly detrimental to business, whether it’s harm to the brand from having lost customer data, or actual financial losses or downtime. “The net of it is, we need to think about security at an enterprise level,” says Mike Wronski, principal marketing manager at Nutanix. “So who owns security? Is it the cloud provider, the enterprise, or the security team?” “The generally accepted answer, or the politically correct answer, would be that it’s everyone’s responsibility,” Ashworth says. “I believe that’s true to an extent, but with a major caveat.” Since companies aren’t democratic, but totalitarian in nature, Ashworth believes a top-down approach to security has to be the ideal scenario. Security has to be recognized as intrinsic to the fabric of IT business continuity, rather than an impediment to IT goals. If a strong culture for security exists within a company, you can be assured that security is thought of at all levels, from the end user being able to recognize spam, to good sec ops within the QA process.


Centralizing Availability, Disaster Recovery and Backup for Efficient Business Continuity

From a disaster recovery perspective, the appliance offers up to 20 CPU cores and 768 GB of RAM. Unlike many competing offerings, the appliance can host applications during a server or storage system outage. The client doesn’t need to purchase additional standby hardware with the new Arcserve solution. According to Arcserve, dozens of copies of physical and virtual systems may be spun up directly on the appliance. RAM may be expanded in eight of the 11 new models to be able to run more hosts in the event of a failure. Meanwhile, cloud disaster recovery-as-a-service may be added on for integration with offsite copies, as may high availability in the form of CDP-based failover and failback. In terms of scalability, the system can support up to 504 TB of effective capacity (assuming a 20:1 deduplication ratio). It can manage up to 6 PB of backups.
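The "effective capacity" figure is worth unpacking: it is the usable physical capacity scaled by the assumed deduplication ratio, so the 504 TB claim rests entirely on the 20:1 assumption holding for your data. The arithmetic, following the article's numbers:

```python
# Effective capacity = usable physical capacity x assumed dedup ratio.
effective_tb = 504
dedup_ratio = 20              # the article's assumed 20:1 ratio
usable_tb = effective_tb / dedup_ratio
print(usable_tb)              # the physical capacity implied by the claim
```

Data that deduplicates poorly (already-compressed or encrypted backups, for instance) would shrink the effective figure accordingly.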



Quote for the day:


"I don't know what leadership is. You can't touch it. You can't feel it. It's not tangible. But I do know this: you recognize it when you see it." -- Bob Ehrlich


Daily Tech Digest - February 15, 2019

Bettering Threat Intelligence And Cyber Security A New Role For Blockchain?

Already there should be blockchain alarm bells ringing by looking at the above explanation. Specialists are correlating data on potential threats to protect a vast network against said cyber crimes before they happen. It sounds a lot like a distributed ledger of information, openly available. However, while Threat Intelligence is a noble pursuit, there are issues with it. Applying the data, determining the right data, the actual collection of the data, and the distribution all need to be addressed as the current model stands. But this is where blockchain could find a niche: it has the potential to help shore up the Threat Intelligence model, which is still growing in its own right. Blockchain security itself can also help beef up cybersecurity to a point where threats are identified and dealt with even before they hit. Threat Intelligence is an essential and applicable way to counter cyber threats, which are getting more devastating and sophisticated. The function of Threat Intelligence is to spread the word on new threats and make sure that an extensive network is prepared for the danger before infection takes place.


Skeptical of AI's future? You can blame the media


Researchers at the Reuters Institute for the Study of Journalism examined 760 articles representing eight months of reporting on AI by six mainstream U.K. news outlets, including the BBC, the Daily Mail, the Telegraph and the Guardian. They found that these discussions habitually presented AI technology in a positive light — as solutions to problems in providing health care, cheaper and more efficient transport, or better business management. They rarely discussed alternatives to the AI-based solutions or examined how effective AI approaches might be in comparison with others. This dominant framing isn’t surprising given that nearly 60 percent of the articles were pegged to industry events: a CEO’s speech, the launch of some new product or research initiative, or news about startups, buyouts or conferences. They were much less likely to quote academics and government sources, who might offer more independent points of view.



How the technology can work beyond crypto

While blockchain’s improved transparency poses a huge advantage to many businesses, it also highlights a major challenge, as many businesses lack sufficient infrastructure to fully adopt DLT technology. As a result, only through a major overhaul of its legacy systems can a company make full use of blockchain. Even beyond the technological challenges, there is also the added factor of personal emotions getting in the way of innovative technology. The volatile cryptocurrency market has led many to become sceptical of blockchain technology. Yet despite this mentality in some organisations, there are some prime examples of how DLT has been used effectively outside the cryptocurrency sector. While cryptocurrency may be struggling in a bear market, there is a growing list of real-world examples where the technology is being used in industries outside of digital assets. Kodak leapt into the blockchain world in January 2018, with a proposal to use the technology to manage image copyright.


Treading the path to a successful digital transformation

According to Pat Geary, chief evangelist at Blue Prism, a successful digital transformation is as much about deployment and support as it is about selecting the right solution. “The barriers standing in the way are often more cultural than technical,” he explained. “When rolling out new technology, organisations don’t just need to get the C-Suite on board but the entire workforce. Management must focus on communicating to employees the clear benefits of the solution and demonstrate how it will augment their capabilities, rather than adding a piece of technology or software for its own sake. “This is about using technology to unleash the creativity and innovation of an organization’s digitally savvy employees. This will be the real criteria driving success. Digital transformation can only succeed if human workers support the change. The value and support of employees should never be underestimated.”


Why Enterprise Data Requires a Hybrid Cloud Strategy

Unfortunately, many organizations don’t learn that their cloud providers aren’t responsible for their company’s data until it’s too late. Understanding the importance of data availability, however, today’s leading organizations are increasingly moving to hybrid and multi-cloud environments. This enables them to build redundancy into their infrastructure to protect their data—and really take full ownership of it. As a result, employees can access the data they need regardless of whether AWS or any other cloud provider is available at any particular moment. These organizations, then, are able to bring products to market faster and continue to deliver superior customer experiences, while competitors that operate in single-cloud environments need to wait for that environment to be restored once an outage occurs. Before the cloud, enterprises could get away with copying their data once a day—let’s say each day at 5 p.m. But in today’s age of big data—where 2.5 quintillion bytes of data are created every day—a much more frequent backup plan is needed.
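The arithmetic behind that last point is easy to sketch. The numbers below are illustrative values of my own, not figures from the article, but they show how quickly the recovery point gap grows when backups only run once a day:

```python
def worst_case_data_loss(change_rate_gb_per_hour: float, backup_interval_hours: float) -> float:
    """Upper bound on new data (in GB) at risk since the last backup."""
    return change_rate_gb_per_hour * backup_interval_hours

# A system writing 50 GB/hour risks losing ~1200 GB with a single daily
# backup, but only ~12.5 GB if snapshots run every 15 minutes.
daily_loss = worst_case_data_loss(50, 24)       # 1200.0 GB at risk
frequent_loss = worst_case_data_loss(50, 0.25)  # 12.5 GB at risk
```

The same logic is why backup vendors talk in terms of recovery point objectives (RPO): shrinking the interval between copies directly shrinks the worst-case loss window.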


Industrial control systems face uphill security battles in 2019

The largest actionable takeaway from this report is for ICS security staff to work closely with hardware and software vendors. Only 18% of vendor advisories contained errors in their risk scoring, and error rates were also lower when security researchers reported errors to vendors instead of going through an external CERT process. Security professionals who find a vulnerability in their system should report it to their vendor immediately—that gives it a much better chance of being properly addressed and patched. ... Zero-day threats, the report said, aren't a significant risk to ICSes, as there are plenty of ways for intruders to penetrate a network that rely on known risks and improper security of public-facing ICS networks. The report also noted an increase in commercial penetration-testing tools being turned to nefarious use by hackers. Defending against these attacks requires a "kill chain" approach that targets potential threats at each level of an attack. "Defenders can use a mix of modern threat detection strategies including indicator- or behavior-based methods, or approaches relying on modeling and configuration."
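As a rough illustration of that mixed strategy, an indicator-based check and a behavior-based check can be layered in a few lines. This sketch, including its function name and threshold values, is hypothetical and not taken from the report:

```python
# Indicator-based detection: match events against known-bad values.
# Behavior-based detection: flag anomalous activity patterns.
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}  # example indicator list
FAILED_LOGIN_THRESHOLD = 10                        # failures allowed per time window

def classify_event(event):
    """Return the detection methods (if any) that flag this event."""
    alerts = []
    if event.get("src_ip") in KNOWN_BAD_IPS:
        alerts.append("indicator:known-bad-ip")
    if event.get("failed_logins", 0) > FAILED_LOGIN_THRESHOLD:
        alerts.append("behavior:possible-brute-force")
    return alerts
```

A kill-chain defense would run checks like these at each stage of an attack (reconnaissance, delivery, lateral movement) rather than at a single perimeter.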


Uber Introduces AresDB: GPU-Powered, Open-Source, Real-Time Analytics Engine

AresDB's design has the following features: column-based storage, real-time upsert, and GPU-powered query processing. Column-based storage has been implemented to enable compression for storage and query efficiency. There are two categories of stores - a Live store for recently ingested data stored in an uncompressed, unsorted format and an Archive store for mature, sorted and compressed data. Real-time upsert with primary key deduplication has been implemented to increase data accuracy and provide "near real-time data freshness" within seconds. As part of real-time ingestion, AresDB classifies records as "late" or not. Records considered "late" are put into the archive store whereas fresh records go into the live store. A scheduled archiving process also periodically takes records from the live store (once they are considered mature) and merges them into the archive store. GPU-powered query processing uses highly parallelized data processing by GPUs to provide low query latencies.
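The ingestion flow described above can be sketched in miniature. This is a toy model of my own with an assumed lateness threshold, not AresDB's actual implementation; it only mirrors the live/archive split, the primary-key upsert, and the scheduled archiving step:

```python
from dataclasses import dataclass, field

LATE_THRESHOLD = 3600  # hypothetical cutoff (seconds) for classifying a record as "late"

@dataclass
class AresLikeStore:
    live: dict = field(default_factory=dict)     # pk -> (event_time, record); unsorted, uncompressed
    archive: list = field(default_factory=list)  # (event_time, pk, record); kept sorted by event time

    def ingest(self, pk, record, event_time, now):
        if now - event_time > LATE_THRESHOLD:
            # "Late" record: goes straight into the archive store
            self.archive.append((event_time, pk, record))
            self.archive.sort(key=lambda r: r[0])
        else:
            # Fresh record: real-time upsert; the primary key deduplicates repeats
            self.live[pk] = (event_time, record)

    def archive_mature(self, now):
        # Scheduled archiving: migrate mature live records into the archive store
        for pk, (ts, rec) in list(self.live.items()):
            if now - ts > LATE_THRESHOLD:
                self.archive.append((ts, pk, rec))
                del self.live[pk]
        self.archive.sort(key=lambda r: r[0])
```

In the real system the archive store is also sorted and compressed column-by-column, and queries fan out across both stores before results are merged.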


Data breaches exposed 5 billion records in 2018

The largest breach by far was one that involved India's national ID database, known as Aadhaar. That incident was reported in March 2018 and exposed the national ID numbers, addresses, phone numbers, email addresses, postal codes, and photographs of almost 1.2 billion Indian citizens. Other large breaches included hackers gaining access to 383 million loyalty program records stored in Marriott's Starwood guest reservation database and to 240 million guest records from Huazhu Hotel Group. Some breaches were not the result of hackers exploiting security vulnerabilities, but of security oversights that made data openly accessible on the web. This was the case with marketing firm Exactis, which exposed the personal details of 230 million adults and 110 million business contacts due to a misconfigured database. Another common cause of breaches is fraud or social engineering, where company insiders intentionally or accidentally share data with unauthorized third parties.


Container security tools pitch service mesh integrations


Service meshes such as Linkerd, Istio and others offer granular security management and monitoring features, but only for areas of the infrastructure where service mesh sidecar container proxies are deployed. Meanwhile, defense in depth for container security environments is a hot topic, as vulnerabilities such as this week's disclosure of a runC flaw illustrate. Enter container security tools, which offer a comprehensive view into container environments inside and outside of the purview of service mesh, as well as security management for service mesh deployments themselves. "Third-party container security tools can provide coverage for things the mesh doesn't do," said Fernando Montenegro, analyst at 451 Research. The Istio service mesh, for example, is focused primarily on application security monitoring at Layer 7, while container security tools can offer in-depth Layer 2 and 3 monitoring.


IoT 2020: Trends and Challenges

One of the chief complaints about the IoT is its confusing nature, and it’s becoming clear that the term “Internet of Things” isn’t descriptive enough for most uses. In 2019, companies and the tech press will increasingly use terminology that better explains how specific implementations are being used, and those who use IoT technology will need to learn new terminology when making decisions. Although this trend will likely lead to some confusion during much of 2019, the better precision provided will lead to a clearer landscape. ... Voice recognition has been an ascendant technology for years now, and improvements in accuracy have been astounding. Even more importantly, people are becoming more accustomed to interacting with digital assistants and other voice recognition technologies, and companies are feeling more comfortable asking their customers to speak with digital assistants. Because processing voice data only requires an internet connection and a microphone, more IoT devices are likely to offer voice assistance in 2019.



Quote for the day:


"After climbing a great hill, one only finds that there are many more hills to climb." -- Nelson Mandela


Daily Tech Digest - February 14, 2019

Stumbling with your public cloud deployments? An industry analyst offers advice.

Yuen went on to say lines of business (LOB) or other groups are not working with core IT as they deploy to the public cloud; therefore, they are not getting all of the advantages they can. “You want to maximize the capabilities and minimize the inconvenience and cost. Planning is absolutely critical for that -- and it involves core IT,” says Yuen. To ensure the best results possible, you should involve key players in the organization. For example, the organization’s procurement experts should be consulted to ensure you get the best deal for your money. Budgeting is also important. “Companies very quickly realize that they don’t have variable budgets,” continues Yuen. “They need to think about how they use cloud and the consumption cost for an entire year. You can’t just go about your work and then find that you are out of budget when you get to the second half of the fiscal year.” The beauty of an as-a-service model is you only pay for what you use. The risk is you have a virtually unlimited capacity to spend money.


Using the SharePoint Framework with Teams to build simple custom apps

SPFx goes further than integrating your Teams applications with SharePoint, building on the growing Microsoft Graph APIs to integrate with Office 365 and Microsoft 365. Like older SharePoint development technologies, you can use web parts from Microsoft, third parties, a growing set of open-source community components, or custom components developed by your own SharePoint development team. More complex code can be written in TypeScript, Microsoft's JavaScript-based language, which gives you strong typing and tools that make it easier to build and manage large-scale web applications. Using TypeScript will make it easier for server-side developers who've been using C# to build SharePoint applications to transition to client-side in-browser code, as TypeScript builds on many familiar C# development concepts. Using SPFx 1.7 you can now also use web parts to handle data connections, so you can extract information from a page and send it back to a server, and link different web parts in the same page.


Gartner: debunking five artificial intelligence misconceptions

What is artificial intelligence? Well, these days, the answer depends on who you ask. For some, it means computers have finally achieved, just like us, general intelligence; what Ray Kurzweil would call the “singularity”. For others, it’s merely a conglomeration of existing tools; it’s machine learning, natural language processing, deep learning and so on. But, with AI technology making its way into the real world of business, it is crucial that business and IT leaders fully understand how AI can create value for their business and where its limitations lie. As Alexander Linden, research vice president at Gartner, said: “AI technologies can only deliver value if they are part of the organisation’s strategy and used in the right way.” ... human intervention is always required to develop AI-based machines or systems. The involvement may come from experienced human data scientists who are executing tasks such as framing the problem, preparing the data, determining appropriate datasets


The Short Life of Enlightened Leadership (and How to Extend It)


The history of socially responsible companies shows that when virtuous programs and policies exist primarily because an individual leader cares about them, his or her successors have no problem removing them. These practices are far more likely to last when they are institutionalized in rules of governance. Thus, a few enlightened capitalists have attempted, in one form or another, to institutionalize their practices in an organizational structure. Sometimes this involves a family business structure; sometimes, as with England’s 174-year-old Economist magazine, it involves a board of independent trustees charged with safeguarding its corporate and editorial independence. It can also rely on an independent trust or foundation that owns most of the company stock. Among the companies that have tried this solution are the Encyclopedia Britannica; the tea company Camellia; and some of Continental Europe’s largest corporations, including IKEA, Heineken, and Bertelsmann.


Ransomware warning: That romantic message may hide a nasty surprise


Subject lines used in this GandCrab campaign all relate to romance. Examples include 'This is my love letter to you', 'Wrote my thoughts down about you', 'My letter just for you' and 'Felt in love with you'. The body of the email only contains a * symbol and comes with an attachment - a zip file containing a JavaScript file. The file name follows the same pattern in every malicious email - 'Love_You_2018_' followed by seven or eight random digits. If the user chooses to extract and execute the JavaScript file, it'll download and execute GandCrab ransomware from a malicious URL embedded in the script. Before the ransom note is presented to the victim, they're asked to select a language to see it in – English, Korean or Chinese, something which researchers suggest indicates the main targets of those behind GandCrab. After this, the user is directed to a ransom note explaining that their computer has been encrypted and that they need to pay a ransom in Bitcoin or DASH cryptocurrency in order to get their files back.
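Because the campaign's indicators are so uniform, they lend themselves to a simple mail-filter check. The sketch below is my own illustration built on the indicators described above; the function name, the decision to require two matching indicators, and the exact subject phrases treated as hints are all assumptions, not rules published by the researchers:

```python
import re

# Attachment name pattern from the campaign: "Love_You_2018_" plus
# seven or eight digits (checked here for either the zip or the JS name).
ATTACHMENT_RE = re.compile(r"^Love_You_2018_\d{7,8}\.(zip|js)$")
SUBJECT_HINTS = ("love letter", "thoughts down about you",
                 "letter just for you", "felt in love")

def looks_like_gandcrab_lure(subject, body, attachment_name):
    """Heuristic: filename pattern plus at least one other indicator."""
    subject_match = any(hint in subject.lower() for hint in SUBJECT_HINTS)
    body_match = body.strip() == "*"  # the campaign's emails contain only "*"
    attachment_match = bool(ATTACHMENT_RE.match(attachment_name))
    return attachment_match and (subject_match or body_match)
```

A real mail gateway would combine such indicator checks with attachment sandboxing, since filename patterns are trivial for attackers to rotate.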


Continuous Delivery - It’s Not All About Tech!

Changing habits is hard. A habit is something you do automatically, so you have to work at stopping the old habit and replacing it with a new one. Publishing the data that highlighted the problems in our release cycle helped create an acceptance of the problems, plus a will to fix them. We tried a few things to help us form good habits. For example, we had hours of delay because the people involved in the release process used email as their primary communication method. An email would get sent, a message communicated, job done. However, until the recipient has read and understood the email, you haven’t communicated anything. If they are in a meeting, or only check their emails once or twice a day (a good habit!), then it could be hours before they see it. To quote a friend, Rob Lambert (@Rob_Lambert), “communication is in the ear of the listener”.


That VPN may not be as secure as you think

That’s more serious than unintended leaks, the team explains — users trust providers not to snoop. The point of a VPN is to be private and not get monitored. VPN use ranges from companies protecting commercial secrets on public Wi-Fi to dissidents. Some botches are actually “defeating the purpose of using a VPN and leaving the user’s online activity exposed to outside spies and observers,” the researchers say. Other problems the team discovered include that some VPNs allegedly lie about the server locations. “We found some VPNs that claim to have large numbers of diverse Internet connections really only have a few servers clustered in a couple of countries,” the researchers wrote. They say they found at least six VPNs faking routings through certain countries when they were actually going through others. That possibly creates potential legal issues for the user, depending on local laws. Other trouble areas included privacy policies. Fifty of the 200 VPN providers that were tested had no privacy policies published on their websites at all, the group says.


Impending takeover of Ultimate Software leaves its RPA solution for HR undiminished

Before we go any further, we must deal with the elephant in the data centre: Ultimate Software is subject to an $11 billion bid from a group of investors led by private equity outfit Hellman & Friedman. In a statement, the company said: “Our customers will benefit from our ability to bring new features and services to market more quickly, while still enjoying the same high level of service they have with Ultimate today, or better, with new innovations to our offerings.” Or to put it another way, whatever happens next — and by the way, the agreement the company has with the prospective investors allows it a 50-day ‘go-shop’ period, to look for alternative deals — it’s in no would-be purchaser’s interest to limit the product offering. The bid represents a 19% premium on the share price before the offer. And the company says that post purchase, its existing management under the leadership of Chief Executive Scott Scherr, will still be at the helm.


A Look Back at 2018 and What’s to Come in 2019

On the topic of certification, one of the significant changes we made during 2018 was actually in our Professional Certification Programs for Open Certified Architects (Open CA) and Open Certified Technical Specialists (Open CTS). Using The Open Group Open Badges digital credentials program, individuals seeking professional certification as either a Certified Architect or Certified Technical Specialist can now achieve their certification in stages by working toward a requisite number of badges required for each program. We’re very proud of the work that has gone into creating milestone badges for these certifications—and we are looking forward to the launch of a new professional standard for Data Scientists early in 2019, as well. Open Badges for certification to many of our standards, such as TOGAF, ArchiMate, IT4IT, and Open FAIR are also now available.


OIG identifies risks related to NIH’s sharing of sensitive data

The agency also agreed with OIG’s recommendations to ensure security policies keep current with emerging threats and to make training and security plans a requirement. Nonetheless, NIH disagreed with the OIG’s call for additional controls to ensure training and security plan requirements have been fulfilled. In addition, the agency also informed auditors that it recently established a working group to address and mitigate risk to intellectual property, as well as to protect the integrity of the peer-review process. “We maintain that our findings and recommendations are valid,” concluded OIG. “We recognize that NIH reported that it is already taking certain actions, such as the working group that was recently established, that may address our recommendations. We also provided NIH with other potential actions to address our findings and recommendations. If NIH determines that it does not need to strengthen its controls, it should document that determination consistent with applicable federal regulations and guidance.”



Quote for the day:


"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks