Daily Tech Digest - December 03, 2017

IT jobs in 2020: Preparing for the next industrial revolution

As IT evolves in the direction of more cloud adoption, more automation, and more artificial intelligence (AI), machine learning (ML) and analytics, it's clear that the IT jobs landscape will change too. For example, tomorrow's CIO is likely to become more of a broker and orchestrator of cloud services, juggling the strategic concerns of the C-suite with more tactical demands from business units, and less of an overseer of enterprise applications in on-premises data centres. Meanwhile, IT staff are likely to spend more time in DevOps teams, integrating multiple cloud services and residual on-premises applications, and enforcing cyber-security, and less time tending racks of servers running siloed client-server apps, or deploying and supporting endpoint devices. Of course, some traditional IT jobs and tasks will remain, because revolutions don't happen overnight and there will be good reasons for keeping some workloads running in on-premises data centres. But there's no doubt which way the IT wind is blowing, across businesses of all sizes.


Why 2018 Will be The Year of AI


Everything is becoming connected across our devices: you can start a project on your desktop PC and then finish the work on a connected smartphone or tablet. Ray Kurzweil believes that humans will eventually be able to use sensors that connect our brains to the cloud. The internet originally connected computers and has since advanced to our mobile devices; the sensors that already enable buildings, homes and even our clothes to be linked to the internet could, in the near future, expand to connect our minds to the cloud. Another driver of more advanced AI is that computing keeps getting cheaper. Previously, new chips would arrive roughly every eighteen months at twice the speed for the same cost; now, Marc Andreessen claims, new chips arrive at the same speed but at half the cost.
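Andreessen's claim can be put into numbers: if a fixed amount of compute halves in cost every eighteen months rather than doubling in speed, the price curve is a simple exponential decay. The figures below (a $1,000 starting cost, an 18-month halving period) are illustrative assumptions, not from the article:

```python
def chip_cost(initial_cost, months, halving_period=18):
    """Cost of a fixed amount of compute after `months`, if the cost
    halves every `halving_period` months (the claim described above)."""
    return initial_cost * 0.5 ** (months / halving_period)

# A $1,000 chip's worth of compute, six years (72 months) later:
print(round(chip_cost(1000, 72)))  # four halvings: about $62
```

Four halvings in six years cut the cost sixteenfold, which is the sense in which computing becomes "freer to use."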


Artificial intelligence isn’t as clever as we think, but that doesn’t stop it being a threat


Even though our most advanced AI systems are dumber than a rat (so says Facebook’s head of AI, Yann LeCun), that won’t stop them from having a huge effect on our lives — especially in the world of work. Earlier this week, a study published by consultancy firm McKinsey suggested that as many as 800 million jobs around the world could be under threat from automation in the next 12 years. But the study’s authors clarify that only 6 percent of the most rote and repetitive jobs are in danger of being automated entirely. ... If a computer can do one-third of your job, what happens next? Do you get trained to take on new tasks, or does your boss fire you, or some of your colleagues? What if you just get a pay cut instead? Do you have the money to retrain, or will you be forced to take the hit in living standards?


Fintech is facing a short, game-changing window of opportunity

Regulation was a major competitive barrier that helped fintech startups. Now that those regulations are (nearly) out the door, big banks will enter lucrative markets more aggressively. Customer acquisition costs will rise because there will be more bidders for the same consumers, plus large financial institutions have deeper pockets and an ability to amortize costs across multiple business lines. This will make it much harder for startups, no matter how innovative, to compete head-on without the CFPB’s regulatory cover. Partnerships provide a great path to success given the scale, brand recognition and resources that financial institutions bring to the table. Of course, partnering with big banks can be risky. Sales cycles are long, bureaucracy is high, and the risk looms large that banks could ultimately take these innovations in-house. Even if their products are only 50 percent as good, the well-resourced, branded players will almost always win.


Here’s How Blockchain can Make Spam and DDoS Attacks a Thing of the Past

A startup called Cryptamail is launching an email network that has no centralized servers. By using a pure blockchain approach to sending and receiving emails, the system is not susceptible to attacks on any single node on the blockchain. This also means that messages cannot be read without the private key – which makes Cryptamail more secure than regular email, wherein there is a risk of compromise if an attacker is able to access the email servers somehow. One key benefit here is that through blockchains, nodes get to earn tokens for participating in the network – this is how Bitcoin miners have been getting their reward, after all. And with tokenization, there is some cost for sending messages across, which can discourage spammers. Meanwhile, users can get compensated fairly for receiving legitimate marketing material that they may like.


In the future of education, AI must go hand-in-hand with Emotional Intelligence

People often overlook the fact that technology is only as good as its creator. Unless there is a tutor to inspire, to help students through their struggles, and to adopt teaching methods suited to the student's needs and abilities, students won't be able to succeed in their career paths. Technology can lend a helping hand to learning, but it cannot replace the immense knowledge and experience that comes with a teacher. Therefore, to achieve good scores or results, a blend of traditional learning and digital tools is necessary to create an environment for learners that is informative, immersive, interesting and flexible. Hello, Emotional Intelligence! While Artificial Intelligence-powered chatbots can deliver the best of content to many students simultaneously by analyzing large amounts of structured and unstructured data, they completely lack the emotional involvement quotient.


Machine Learning with Optimus on Apache Spark

Machine Learning is one of the last steps, and the goal, of most Data Science workflows. Some years ago the Apache Spark team created a library called MLlib, where they coded great algorithms for Machine Learning. Now, with the ML library, we can take advantage of the DataFrame API and its optimizations to easily create Machine Learning pipelines. Even though this task is not extremely hard, it is not easy either. The way most Machine Learning models work on Spark is not straightforward: they need a lot of feature engineering to work. That's why we created the feature engineering section inside the Optimus Data Frame Transformer.
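The kind of feature engineering the passage refers to can be illustrated without Spark at all. Below is a minimal pure-Python sketch of two steps that Spark ML exposes as pipeline stages (string indexing and vector assembly) and that Optimus's feature-engineering section is described as wrapping; the sample data and the alphabetical tie-break for equally frequent labels are illustrative assumptions:

```python
def string_indexer(values):
    """Map each distinct string to a numeric index, most frequent label
    first (ties broken alphabetically here, as an assumption)."""
    freq = {}
    for v in values:
        freq[v] = freq.get(v, 0) + 1
    ordered = sorted(freq, key=lambda v: (-freq[v], v))
    index = {v: i for i, v in enumerate(ordered)}
    return [index[v] for v in values]

def vector_assembler(*columns):
    """Combine parallel columns into one feature list per row."""
    return [list(row) for row in zip(*columns)]

cities = ["NY", "SF", "NY", "LA"]
ages = [34, 28, 45, 52]
features = vector_assembler(string_indexer(cities), ages)
print(features)  # [[0, 34], [2, 28], [0, 45], [1, 52]]
```

In a real Spark pipeline these two steps would be `StringIndexer` and `VectorAssembler` stages chained before the model; the point of tools like Optimus is to bundle this boilerplate into one transformer.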


The Death Of Enterprise Architecture: Defeating The DevOps, Microservices ...

Current application theory says that all responsibility for software should be pushed down to the actual DevOps-style team writing, delivering, and running the software. This leaves the Enterprise Architect role in the dust, effectively killing it off. In addition to being disquieting to Enterprise Architects out there with steep mortgage payments and other expensive hobbies, it seems to drop the original benefits of enterprise architecture, namely oversight of all IT-related activities to make sure things both don't go wrong (e.g., with spending, poor tech choices, problematic integration, etc.) and that things, rather, go right. Michael has spoken with several Enterprise Architecture teams about the changing nature of how Enterprise Architecture can help in a DevOps- and cloud-native-driven culture. He will share their experiences, including what type of Enterprise Architecture is actually needed.


The Role of an Enterprise Architect in a Lean Enterprise


Being able to describe both the current state and future state architectures is essential to bringing projects in line. Start by assessing the current portfolio. Map out what systems exist and what they do. This does not need to be deeply detailed or call out individual servers. Instead, focus on applications and products and how they relate. Multiple layers may be required. If the enterprise is big enough, break the problem down into functional areas and map them out individually. If there is an underlying architectural pattern or strategy, identify it and where it has and has not been followed. For example, if the enterprise strategy is a Service Oriented Architecture, which applications work around it and access the master data directly? Where are applications communicating via a common database? Once you have the current state mapped out, think about what you would like the architecture to look like in the future.


Digital Trust: Enterprise Architecture and the Farm Analogy
In describing best practices for handling data, let’s imagine data as an asset on a farm. The typical farm’s wide span makes constant surveillance impossible, similar in principle to data security. With a farm, you can’t just put a fence around the perimeter and then leave it alone. The same is true of data because you need a security approach that makes dealing with volume and variety easier. On a farm, that means separating crops and different types of animals. For data, segregation serves to stop those without permissions from accessing sensitive information. And as with a farm and its seeds, livestock and other assets, data doesn’t just come in to the farm. You also must manage what goes out. A farm has several gates allowing people, animals and equipment to pass through, pending approval. With data, gates need to make sure only the intended information filters out and that it is secure when doing so.



Quote for the day:

"Not all readers are leaders, but all leaders are readers." -- Harry S. Truman

Daily Tech Digest - December 02, 2017

Data or Algorithms – Which is More Important?


Most of you will recognize that this was also the birth of the era of Big Data, because Hadoop for the first time gave us a reasonable way to store, retrieve, and analyze anything. The addition of unstructured and semi-structured data like text, speech, image, and video created the possibilities of AI that we have today. It also let us store volumes of ordinary data like web logs or big transactional files that were previously simply too messy to store. What you may not know, and what I heard Doug Cutting himself say at last spring’s Strata Conference in San Jose, is that the addition of unstructured and semi-structured data is not the most important feature of Hadoop. The most important feature is that it allowed many ordinary computers to function as a single computer. This was the birth of Massively Parallel Processing (MPP). If it hadn’t been for MPP, the hardware we have today would never have evolved, and today’s data science simply would not and could not exist.



12 Tips For Using Devops To Fuel Digital Transformation

DevOps automates software assembly, leveraging continuous integration, delivery and deployment, to improve customer experiences, respond faster to business needs, and ensure that innovation is balanced with security and operational needs. You can think of DevOps as agile on steroids. And it is catching on. Fifty percent of 237 organizations surveyed said they are implementing DevOps, according to Forrester Research. "The DevOps momentum is occurring within all industry sectors," analyst Robert Stroud wrote in the research report. "As we near the end of 2017, the number of inquiries are increasingly focused on how organizations will be successful given the pressure of accelerated delivery of applications and services — without additional headcount."


CTO Perspectives: The roots of OpenStack and the multi-cloud

Every technology shift that has happened, we’ve taken our customer intimacy, our customer focus, and applied it as the shifts happened. Even to this day, it has continued to serve us well. Cloud was the biggest shift. Rackspace jumped in pretty early and it was a big shift for us when we started to build our own cloud and software for the first time. We wrote some software for our ticketing system and our customer management systems. This software was part of the product that was going to deliver the cloud, and eventually became OpenStack. That legacy has served us well, because it’s now the foundation for our public cloud and also all the work that we are doing on OpenStack private cloud. Even during the cloud era, that emphasis and focus on great customer service and great customer outcomes was important – and valuable, because we created something that we call “managed cloud”.


Four ways state and local CIOs can boost cybersecurity


IT operations in state and city government are often run by the various agencies within the government, rather than being centralized under the state’s or city’s CIO. This leads to shadow IT, with a wide range of servers, software, and hardware spread across the state and city, and no standardized way to measure their risk level or even know when systems need to be updated. IT administrators cannot share best practices, causing further inefficiencies. What’s worse than shadow IT? Shadow security — rogue systems with no security features turned on. Fortunately, some states and cities have made significant efforts toward consolidating and federating their IT, and the broader trend is toward consolidation, as NASCIO reported in its survey of state CIOs.


Why cryptocurrencies are causing an international racket


Today one of the biggest challenges faced by cyber extortionists is how they obtain the cash. In the past, they would ask victims to deposit money into bank accounts or transfer funds via the likes of Western Union, all easily traceable. Fast forward to today and we have a myriad of decentralised cryptocurrencies like Monero, Ethereum and, most popular of all, Bitcoin, which offer users fast transactions with full anonymity. If you're a cyber extortionist, what's not to love? This major development can be linked to the escalation in ransomware attacks across the world, being easy to use without the need for any middleman to transact. According to a study earlier this year by Cambridge University, there are now over 6 million people transacting with Bitcoin, the majority of it legitimate business.


2017: A year of highs and lows for Linux and open source

Linux users had to suffer release after release where next to nothing improved on the Ubuntu desktop front. This was a mistake of grand proportions and sent a lot of users scurrying to the likes of Linux Mint or Elementary OS. ... Samsung has opted to resuscitate Linux and convergence with the help of its Galaxy line of smartphones and DeX. If you're unfamiliar with DeX, it's a dock that enables users to plug in a supported Galaxy device and enjoy a desktop experience powered by their smartphone. Midway through 2017, Samsung announced it was developing an app called "Linux on Galaxy," which would allow users to boot their favorite distribution (or multiple distributions) of Linux on their Galaxy S8+/Note8 devices and take advantage of DeX — a full-blown Linux desktop, powered by a smartphone.


Hacking the Autonomous Vehicle


"Today chip-tuning is already used to change the management of the engine and find additional horsepower. This is in most cases legal, but releases the car manufacturer from its guarantee. When self-driving cars are a relevant market, it is only a question of time before programmers offer software promising higher safety for the owners: a programmed preference for the passenger over pedestrians." In the same way that there are after-markets for computer chips that override the engine performance settings that come with the automobile out of the factory, will there evolve an after-market for technicians who can "hack" the life-and-death settings that are pre-programmed into an autonomous vehicle? We are already seeing situations where customers are resorting to "hacking" their vehicles. Farmers, for example, are hacking their John Deere tractors' firmware.


Is Your Technology Function Ready For A Digital World?

Most large companies are using digital technologies to do things such as launch apps, build e-commerce solutions and harness data to learn more about their customers. But these “quick hit” efforts inevitably fall short of their goals when they must contend with legacy IT systems and data that can’t interface with new digital apps and architectures. Data is scattered across the company, rendering it useless. Outmoded and sluggish IT operating models slow the company down. The companies that take digital to its full potential learn to integrate their front-end digital solutions with their aging legacy stack. They build technology functions that are flexible, fast, collaborative and creative–hardly the adjectives most companies use to describe their IT departments. But unlocking the true potential of technology can be transformative.


Lenses Are Being Reinvented, and Cameras Will Never Be the Same


In imaging lenses, chromatic aberration must be minimized—it otherwise produces the colored fringes around objects viewed through cheap toy telescopes. But in spectrographs, different colors must be brought to focus in different places. The new metalenses can do either. Nor do these lenses suffer from spherical aberration, a common problem with ordinary lenses caused by their three-dimensional spherical shape. Metalenses do not have this problem because they are flat. Indeed, they are similar to the theoretical “ideal lenses” that undergraduate physicists study in optics courses. Of course, physicists have been able to make flat lenses, such as Fresnel lenses, for decades. But they have always been hard to make. The key advance here is that metalenses, because they can be fabricated in the same way as microchips, can be mass-produced with subwavelength surface features.


Introducing Obevo: Get Your Database SDLC Under Control


While existing open source tools could do the job for simpler cases, they could not handle the scale and complexity of some of our existing systems. And we could not just leave these existing systems without a proper SDLC; they are critical systems with active development and releases. Thus, we developed Obevo (available under the Apache 2.0 License), a tool to handle all such use cases. Obevo’s key differentiator is the ability to maintain DB objects per file (similar to how class definitions are typically stored per file) while still handling incremental deployments. In this article, we will describe the DB Deployment problem space and then demonstrate how the object-based project structure helps us elegantly manage hundreds and thousands of schema objects for a variety of object and environment types.



Quote for the day:


“Be who you are and say what you feel, because those who mind don’t matter and those who matter don’t mind.” -- Dr. Seuss


Daily Tech Digest - December 01, 2017

6 quick ways to decide who should be invited to a project meeting

US businesses lose around $37 billion each year due to unnecessary meetings, and the time lost and opportunity cost due to unproductive meetings can be compounded if there are unnecessary attendees. Project meetings are no different. Not every project management meeting is the same, and the attendees required will vary depending on the specific meeting. ... If 12 people averaging an hourly rate of $50 attend a one-hour weekly meeting for 24 weeks, the cost to the project is $14,400. If only eight people needed to be in attendance, the cost would have been $9,600. The additional four attendees cost the project $4,800 over 24 weeks. This may not sound like a significant number on the bottom line of a project, but if this were to take place across 10 projects, the cost to a company would be almost $100,000. Importantly, this cost does not factor in the productivity lost due to the unnecessary attendance.
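The article's arithmetic is easy to check and reuse. A quick sketch, using the article's own figures (12 vs. 8 attendees, $50/hour, one-hour weekly meetings for 24 weeks); the reading that the "almost $100,000" figure refers to ten projects' worth of right-sized meetings is an assumption:

```python
def meeting_cost(attendees, hourly_rate, hours_per_meeting, meetings):
    """Total salary cost of a recurring meeting."""
    return attendees * hourly_rate * hours_per_meeting * meetings

full = meeting_cost(12, 50, 1, 24)  # all 12 invited
lean = meeting_cost(8, 50, 1, 24)   # only the 8 who are needed
print(full, lean, full - lean)      # 14400 9600 4800
# Across 10 similar projects, even the lean meetings cost
# 10 * 9600 = 96000 dollars -- roughly the "almost $100,000" cited.
```

Plugging in your own team's rates makes the invite-list decision concrete before the recurring invitation goes out.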



5 key concerns when testing cyber threat intelligence effectiveness

Cyber threat intelligence (CTI) plays an important role in an organization’s defense-in-depth strategy, often being leveraged by other cyber security functions such as security event monitoring, incident response and forensic investigations. To derive value from CTI, raw or processed data feeds must be analyzed and applied within the context of the organization to improve, among other capabilities, the ability to detect threats and respond to incidents. Visibility into the design and operating effectiveness of CTI processes can provide some assurance to management and potentially support funding requests for further investment in this area. With that premise in mind, below are five areas to consider when conducting a review of your organization’s CTI capabilities.


5 Steps to Create Winning Enterprise Integration Strategy in the Digital Era


In fast-paced, technology-driven businesses, quick-turnaround solutions like microservices often take precedence over “planned” integration. This strategy provides better business value in the short term but creates unmanageable technical debt in the long run. Just as uncontrolled business debt erodes a business’s ability to grow, uncontrolled technical debt erodes the IT department’s ability to fund future innovations. Although in many cases it is the easiest approach to take, piecemeal achievements are short-lived. Soon, IT teams find themselves lost in a sea of fragmented software gizmos. Nevertheless, proving the value of a horizontal function such as integration has become extremely challenging in the digital era.


Under proposed NATO policy, members would fight back against state-sponsored hacking

U.S. Navy Commander Michael Widmann from the NATO Cooperative Cyber Defence Centre of Excellence said that NATO members are increasingly investing in cyber warfare methods to fight off and respond to attacks from state-sponsored hackers. “There’s a change in the [NATO] mindset to accept that computers, just like aircraft and ships, have an offensive capability,” he noted. Discussing the geopolitical implications of NATO switching from a defensive to an aggressive stance on hacking and its broader implications for businesses, Adi Dar, chief executive officer of Cyberbit Ltd., told SiliconANGLE that the move indicated the age of cyberwarfare has begun. “The enemy is armed with new strategies, goals and capabilities, and we must rethink our approaches as we prepare our organizations and nations to meet these evolving challenges,” Dar said. A security vulnerability is likely to exist across multiple organizations of the same industry segment.


Businesses bracing for GDPR data deletion requests


“Organisations need to balance an understanding of the data landscape in the organisation with a wider knowledge of the day-to-day practices in the business, including the possible pitfalls. For example, if businesses do not have a record of data duplication or are unaware of staff copying data, data erasure requests won’t be conducted correctly.” According to Bunker, only by working with the various departments that hold and process critical data to map storage locations and data flows can organisations create the necessary understanding. “Even when the information goes outside the organisation, this data is still your responsibility, so you need to know who you’ve shared it with and through which communication channels so you can effectively execute a data erasure request. Deletion can then be carried out automatically leveraging technology, or manually,” he said.


Boards Should Take Responsibility for Cybersecurity. Here’s How to Do It

Ideally, security executives should attend board meetings in the same way that a chief financial officer would. Failing that, they should at least be briefed by the board on the organization’s projects and should have a chance to respond with functional plans to support the company’s top priorities. When meeting with security leaders, directors should ask how their cybersecurity plan will help the company meet one or some of these objectives: revenue, cost, margin, customer satisfaction, employee efficiency, or strategy. While these terms are familiar to board members and business executives, security leaders may need guidance on how to frame their department’s duties in the context of business operations. ... Incorporating security in the early stages of product development results in safer, more secure offerings and can spare companies the expense, hassle, and potential public embarrassment that accompanies retrofitting security.


Amazon brings its digital assistant to the office with Alexa for Business

One of the biggest value adds for Alexa for Business is its conference room support. According to Vogels, users leveraging Alexa for conferences will no longer need conference IDs and they'll simply be able to say: "Alexa, start the meeting" to get it going. Additionally, Vogels said, Alexa can be used in the conference room to dim the lights or lower the blinds, find an open conference room, or order supplies. So far, it will integrate with products from Cisco, Polycom, and a few others. Alexa for Business will help at your desk as well by making calls on your behalf or scheduling meetings. Vogels said that Alexa for Business will integrate with Office 365 and Google's G Suite, and it will also support on-premises Exchange for business users to handle calendar scheduling and other processes.


Bitcoin Is an Emerging Systemic Risk


Cryptocurrency is, admittedly, much smaller than the subprime bubble that popped a decade ago, which was roughly two orders of magnitude larger than bitcoin today. But bitcoin has shown, on several occasions, a persistent ability to defy detractors like me to grow an order of magnitude in less than 12 months; if it does so again, it will be three times larger than LTCM. LTCM on its own very nearly ruined the world in 1998. If we aren’t careful, this is the kind of market where a financial institution can get in serious trouble extremely quickly (imagine the damage a character like Nick Leeson or Kweku Adoboli could have done trading Bitcoin contracts – which are coming soon to both the CME and, reportedly, Nasdaq). We know that cryptocurrency marketing is writing checks the technology can’t cash; most of these systems are unusable as backbones for global finance. It is a matter of time before the punter on the street becomes as disillusioned as I, an irascible blockchain software entrepreneur, have become.


Wi-Fi Standards And Speeds Explained & Compared

In the world of wireless, the term Wi-Fi is synonymous with wireless access, even though Wi-Fi itself is a trademark of the Wi-Fi Alliance, a group dedicated to interoperability between different wireless LAN products and technologies. The standards themselves are part of the 802.11 family of standards, courtesy of the IEEE. With terms such as “802.11b” (pronounced “Eight-O-Two-Eleven-Bee”, ignore the “dot”) and “802.11ac”, the alphabet soup of standards that began in the late 1990s continues to see improvements in throughput and range as we race to the future to get faster network access. Along the way, improvements are being made by adopting new frequencies for wireless data delivery, as well as range improvements and reduced power consumption, to help support initiatives like the Internet of Things and virtual reality.


Enterprise phishing attacks surge but resiliency is on the rise

The report says: "While the capacity of each organization is different, it's important that anti-phishing programs stay as active as possible. This is particularly true when it comes to developing recognition and reporting of active threat models. As with susceptibility and reporting, resiliency is improving throughout major industries. Education is the exception. Possible reasons: tighter security budgets compared to other industries, lack of central control and typically open environments that encourage users to "bring your own device."" In the first eight months of 2017, over 216,000 emails were reported as sent through phishing campaigns, 15 percent of which were deemed malicious -- the rest being only spam or non-malicious messages. In total, business email compromise (BEC) accounted for five percent of reported attacks in the same time period.
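The report's percentages translate into absolute counts worth having in front of you. A quick calculation from the figures above; treating the 216,000 reported emails as the base for the BEC share is an assumption, since the report says "reported attacks":

```python
reported = 216_000                  # phishing-campaign emails reported, Jan-Aug 2017
malicious = round(reported * 0.15)  # 15 percent deemed malicious
bec = round(reported * 0.05)        # BEC share, assuming the same base
print(malicious, bec)  # 32400 10800
```

So even with 85 percent of reports turning out to be spam or benign, that still leaves tens of thousands of genuinely malicious messages in eight months.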



Quote for the day:


"When data disproves a theory, a good scientist discards the theory and a poor one discards the data." -- Will Spencer


Daily Tech Digest - November 30, 2017

5 Questions to Ask About DDoS Pricing


As DDoS attacks grow more frequent, more powerful, and more sophisticated, many organizations turn to DDoS mitigation services to protect themselves against attack. DDoS protection vendors come in all shapes and sizes, from dedicated DDoS mitigation providers, to CDN vendors who add website DDoS protection, to ISPs who resell DDoS protection as an add-on. As a result, the quality and cost of such services can vary wildly, and many customers end up purchasing protection packages that are either inadequate or too big for their needs, resulting in unnecessary costs. ... Paying for attack traffic is a particular concern if you rely on your CDN provider, your ISP, or your public cloud host for DDoS protection, because these providers charge customers according to the amount of traffic. In such cases you will essentially be paying your provider to be attacked, which can quickly escalate to tens of thousands of dollars (or more) per attack.
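The "paying to be attacked" point is easy to make concrete with a back-of-envelope calculation. The attack size, duration, and per-GB rate below are illustrative assumptions, not figures from the article:

```python
def attack_bill(gbps, hours, usd_per_gb):
    """Back-of-envelope egress bill for absorbing attack traffic
    when the provider charges per GB transferred."""
    gb = gbps / 8 * 3600 * hours  # Gbps sustained -> GB transferred
    return gb * usd_per_gb

# A 300 Gbps flood sustained for two hours, billed at an assumed
# $0.08/GB traffic rate:
print(round(attack_bill(300, 2, 0.08)))  # 21600
```

Two hours of a single large flood lands in the "tens of thousands of dollars" range the article warns about, which is why asking how a vendor bills attack traffic is one of the key pricing questions.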



Millennials could change ‘hero’ mentality in IT departments, says Gartner

“Maybe new recruits have some skills, they have some capabilities they can bring to your organisation in terms of new thinking, in terms of how they collaborate,” he said. Tiny Haynes, Gartner research director of infrastructure, said organisations can benefit from the “experience versus enthusiasm” of young recruits to create a more collaborative working environment. Entrepreneur Margaret Heffernan, also speaking at the Gartner event, said an individualistic environment in the workplace can have an adverse effect. “For decades, we used to believe the best way to manage people at work is you get the high performers and you put them all into fantastic departments and projects and you get them to compete furiously for recognition, for bonuses, for promotions,” she said. “We found that instead you create huge amounts of dysfunction, huge amounts of aggression and huge amounts of waste.”


How to ensure IT works on the right projects

The first step in improving IT’s impact on your business is to understand whether the business’s perception of IT is reality and then position your IT organization to leverage their knowledge to support the business. It’s all about developing a strong partnership with the business and empowering your IT team to do more. Once you have your team aligned, you need to paint the vision for all that IT can be. In the digital era, IT can do more than put in systems; we can develop new products and services. That’s a greater level of influence than IT has had before, and your team needs to know that. There is a journey of taking IT from a liability to an asset and then to an enabler of value capture, and finally to being a value creator itself. Your whole team needs to understand the journey and that it doesn’t happen overnight.


Why Your Board Is Critical To Digital Transformation Results

The survey, which looked across four pillars of digital transformation maturity (strategy and vision, people and culture, process and governance, and technology and capabilities), found the majority of companies were still in the early stages of their digital maturity based on a scale of early (1.0), developing (2.0) and maturing (3.0). ... One of the most surprising findings of the survey was the correlation between board-level priority for digital transformation and the achievement of transformational results. The results when digital was a top 3 board-level priority were markedly higher than when digital was simply a top 3 CEO-level priority. ... Board-level prioritization of digital transformation is needed to create the impetus for change and to fuel investments; perennial skills such as leadership, culture and change management are needed to promote understanding and buy-in.


The importance of data mining

With the help of data warehouses, information is extracted from a wide range of systems, converted into a common format and uploaded into the data warehouse. This process is known as extract, transform and load, or ETL. Once the data has been merged and converted, experts can work with it. In the past, the consolidation of information was conducted at fixed intervals: once a day, once a week, bi-weekly or monthly. One of the main reasons intervals were used was that databases had to be taken offline while the data was processed. As you are probably aware, a business that is open 24 hours a day, 7 days a week can’t be down simply because data must be updated. As a result, many businesses and organisations worked with old, obsolete and/or irrelevant data. Even with irregularly updated data, organisations in the 1990s operated fine, but today it’s impossible to run a business that way.
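The ETL pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not a real pipeline: the two source systems, their field names, and the sqlite in-memory "warehouse" are all hypothetical stand-ins.

```python
import sqlite3

# Hypothetical raw rows from two source systems with different formats.
crm_rows = [{"CustomerName": "Acme", "Revenue": "1,200.50"}]
erp_rows = [{"name": "Beta Ltd", "revenue": 980.0}]

def extract():
    """Pull raw rows from each source system, tagged with their origin."""
    return [("crm", r) for r in crm_rows] + [("erp", r) for r in erp_rows]

def transform(source, row):
    """Convert a raw row into the warehouse's common (name, revenue) format."""
    if source == "crm":
        return (row["CustomerName"], float(row["Revenue"].replace(",", "")))
    return (row["name"], float(row["revenue"]))

def load(rows, conn):
    """Write the unified rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS customers (name TEXT, revenue REAL)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")  # stand-in for the data warehouse
load([transform(s, r) for s, r in extract()], conn)
print(conn.execute("SELECT COUNT(*), SUM(revenue) FROM customers").fetchone())
```

The offline-window problem the article mentions shows up here too: a naive load locks the target table, which is why batch ETL historically ran during maintenance windows.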


What is two-factor authentication (2FA)? How to enable it and why you should

The notion of 2FA as a best security practice is no longer even remotely new. Google brought the advanced form of online security into the mainstream consciousness with the launch of multilayered protection for enterprise customers in 2010 and then for all Google users in 2011. Facebook followed soon after. Yet, according to a recent report by the Pew Research Center, only 10 percent of American adults can correctly identify a two-factor-enabled login screen from a set of four choices. Another report, from Duo Labs, estimates that a measly 28 percent of Americans actually use 2FA on a regular basis. More than half of those surveyed by the firm had never even heard of it. That, to put it mildly, is troubling. "People should all be looking at 2FA, even for minor things — if they're just buying toothpaste at a shopping site," says Patrick Wardrop, chief product architect of IBM's Identity and Access Management division.
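The rotating six-digit codes that authenticator apps generate for 2FA are typically TOTP (RFC 6238), which is just HOTP (RFC 4226) keyed to the current 30-second window. A minimal sketch, using only the standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    """HOTP (RFC 4226): HMAC-SHA1 over an 8-byte counter, dynamically truncated."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # low nibble picks the slice
    code = (int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def totp(key, at=None, step=30, digits=6):
    """TOTP (RFC 6238): HOTP where the counter is the current time window."""
    counter = int((time.time() if at is None else at) // step)
    return hotp(key, counter, digits)

# RFC 6238 test vector: with the ASCII key "12345678901234567890",
# at T=59 seconds the 8-digit SHA-1 code is 94287082.
print(totp(b"12345678901234567890", at=59, digits=8))  # → 94287082
```

Because the code depends on a shared secret plus the clock, an attacker who phishes only the password still fails the second factor, which is the property the surveys above suggest most users are missing out on.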


Machine Learning and Big Data Know It Wasn’t You Who Just Swiped Your Credit Card


The algorithm knows right away if your card is being used at the restaurant you go to every Saturday morning — or at a gas station two time zones away at an odd time such as 3:00 a.m. It also checks if your transaction sequence is out of the ordinary. If the card is suddenly used for cash-advance services twice on the same day when the historic data show no such use, this behavior is going to up the fraud probability score. If the transaction’s fraud score is above a certain threshold, often after a quick human review, the algorithm will communicate with the point-of-sale system and ask it to reject the transaction. Online purchases go through the same process. In this type of system, heavy human interventions are becoming a thing of the past. In fact, they could actually be in the way since the reaction time will be much longer if a human being is too heavily involved in the fraud-detection cycle.
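The scoring logic described above can be caricatured as a rule-based sketch. Real systems learn these weights from historic data rather than hard-coding them; every rule, weight, field name, and the 0.8 threshold below are hypothetical, chosen only to mirror the signals the article lists.

```python
REJECT_THRESHOLD = 0.8  # hypothetical cutoff for automatic rejection

def fraud_score(txn, history):
    """Accumulate a naive rule-based fraud probability score in [0, 1]."""
    score = 0.0
    if txn["merchant"] not in history["usual_merchants"]:
        score += 0.3                    # merchant never seen for this card
    if txn["tz_offset_hours"] >= 2:
        score += 0.3                    # two or more time zones from home
    if txn["hour"] < 5:
        score += 0.2                    # odd hour such as 3:00 a.m.
    if txn["is_cash_advance"] and history["cash_advances_today"] >= 1 \
            and history["historic_cash_advances"] == 0:
        score += 0.4                    # sudden repeat cash advances, no such history
    return min(score, 1.0)

def decide(txn, history):
    return "reject" if fraud_score(txn, history) >= REJECT_THRESHOLD else "approve"

history = {"usual_merchants": {"Saturday Diner"},
           "cash_advances_today": 1, "historic_cash_advances": 0}
odd_txn = {"merchant": "ATM 77", "tz_offset_hours": 2, "hour": 3,
           "is_cash_advance": True}
print(decide(odd_txn, history))  # the suspicious pattern crosses the threshold
```

The article's point about human reviewers maps to the threshold: scores near the cutoff can be routed to a quick human check, while clear-cut scores are handled automatically to keep reaction time short.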


Adversarial machine learning tops McAfee’s 2018 security forecast


Serverless apps enable greater granularity, such as faster billing for services, but they are vulnerable to attacks exploiting privilege escalation and application dependencies. They are also vulnerable to attacks on data in transit across a network, and potentially to brute-force denial of service attacks, in which the serverless architecture fails to scale and incurs expensive service disruptions. “Serverless apps that are quickly implemented or rapidly deployed can use an inappropriate privilege level, leaving the environment open to a privilege escalation attack,” said Samani. “Similarly, the speed of deployment can result in a function depending on packages pulled from external repositories that are not under the organisation’s control and have not been properly evaluated.” There are also new risks, according to Samani. “By looking at the URL, we can tell if the request is going to a serverless environment.


Wi-Fi in 2018: What will the future look like?

802.11ax's key technology is something called orthogonal frequency-division multiple access, or OFDMA. This is a transmission technique that, in essence, allows multiple devices to share not only the same access point, but the same Wi-Fi channel at the same time. Previous-generation Wi-Fi can’t do that, so OFDMA means that 802.11ax has a substantial leg up on current technology, particularly in terms of large numbers of devices sharing the same access point. But experts don’t think that 2018 will see widespread deployment of the new standard. For one thing, it has yet to be formally ratified by the IEEE, although that won’t stop large vendors from releasing 802.11ax gear before the standard is official. But beyond standardization issues, there’s also the simple fact that enterprise Wi-Fi is only just beginning to deploy 802.11ac wave 2.


Making an effective case for increasing the data security budget

A variety of factors affect the affordability issue, but the shortage of available cybersecurity professionals compounds the problem. One source, the Global Information Security Workforce Study released in June 2017, found that the cybersecurity workforce gap is on pace to hit 1.8 million by 2022, a 20 percent increase since 2015. The lack of cybersecurity funding, combined with too few trained cybersecurity analysts and engineers, adds up to an IT security disaster. This concern is shared by the 56 percent of those surveyed who said their companies are underprepared to identify and respond to a security incident, while 45 percent believed their organization suffered a breach in the past year. Educating stakeholders about the tremendous risks that exploits and breaches pose to the health and success of the business is one of the greatest challenges CIOs and CEOs face when allocating resources to meaningful and effective security initiatives.



Quote for the day:


"It is better to fail in originality than to succeed in imitation." -- Herman Melville


Daily Tech Digest - November 29, 2017

Forget robots, what's the reality behind the AI hype?

With all this innovation there is no denying that many of the roles we currently do will change. In fact, according to the World Economic Forum, 65% of today's school children will do jobs that haven't yet been invented. Employers need to prepare for these changes now. While we may not know the exact nature of what the new roles will look like, or how existing roles might evolve, we do know that almost all of them will involve digital skills. A great place to start, therefore, is in raising all your employees' digital skills quotient. UK businesses and tech leaders should also look to work with the government to ensure the digital skills they need in the future are part of today's school curriculum. We are just scratching the surface when it comes to assessing the impact of AI on businesses. 


Apple working on fix for MacOS High Sierra password flaw
The flaw in MacOS High Sierra - the most recent version - makes it possible to gain entry to the machine without a password, and also to gain powerful administrator rights. “We are working on a software update to address this issue,” Apple said in a statement. The bug was discovered by Turkish developer Lemi Ergin. He found that by entering the username "root", leaving the password field blank, and hitting "enter" a few times, he would be granted unrestricted access to the target machine. Mr Ergin faced criticism for apparently not following responsible disclosure guidelines typically observed by security professionals. Those guidelines instruct security experts to notify companies of flaws in their products, giving them a reasonable amount of time to fix the flaw before going public.


The Apache Software Foundation Announces Apache® Impala as a Top-Level Project

Apache Impala is deployed across a number of industries such as financial services, healthcare, and telecommunications, and is in use at companies that include Caterpillar, Cox Automotive, Jobrapido, Marketing Associates, the New York Stock Exchange, phData, and Quest Diagnostics. In addition, Impala is shipped by Cloudera, MapR, and Oracle. "Apache Impala is our interactive SQL tool of choice. Over 30 phData customers have it deployed to production," said Brock Noland, Chief Architect at phData. "Combined with Apache Kudu for real-time storage, Impala has made architecting IoT and Data Warehousing use-cases dead simple. We can deploy more production use-cases with fewer people, delivering increased value to our customers. We're excited to see Impala graduate to a top-level project and look forward to contributing to its success."


Associative memory AI aids in the battle against financial crime

According to Sheppard, associative memory AI technologies are best thought of as reasoning systems that combine the memory-based learning seen in humans (recognizing patterns, spotting anomalies, and detecting new features almost instantly) with data. Applications of associative memory AI in the enterprise are varied. “Our strategy is to build comprehensive decision systems for financial services, supply chain management, and manufacturing and defense. ... These systems combine what we think are the best of learning approaches, such as deep learning, traditional statistical machine learning, associative learning, and others. Our goal is to deliver a sum that is much greater than its individual parts.” Intel has developed a sharp focus on the financial services industry, with its October launch of the Intel Saffron Anti-Money Laundering (AML) Advisor.


Amazon adds security monitoring and threat defence with GuardDuty

Announcing the new offering on Tuesday night at AWS re:Invent, Stephen Schmidt, ... explained that Amazon GuardDuty can be enabled with a single click, and has removed the complexity of operation previously required for threat detection. "Continuous security monitoring is what we all strive for, but doing this at scale, without slowing down your business, is complex and expensive," Schmidt said. "Traditionally, threat detection requires you to deploy and maintain dedicated security infrastructure, which frankly is hard to automate, doesn't scale at all, and many existing solutions were designed for on-premises environments." GuardDuty consumes multiple data streams, including several threat intelligence feeds, staying aware of IP addresses and domains flagged as malicious, while also learning to identify malicious or unauthorised behaviour in a user's AWS account.


5G and the Need for Speed

5G is Plaid for cellular networking – a next-generation mobile network that promises not only ten times the available spectrum, for ten times the download speeds, but across ten times the devices and with a fraction of the latency. The move from 1Gbps to 10Gbps speeds will support bandwidth-intensive applications like high-definition video and virtual reality, and near real-time connections will enable ultra-low latency applications like autonomous cars, remote surgery and specialized applications within the Internet of Things (IoT). 5G is impressive, but – spoiler alert – it isn’t entirely new. The road to 5G runs through 4G wireless infrastructure, and improvements to 4G technologies like carrier aggregation, small cells, massive multiple-input and multiple-output (MIMO) and beamforming will satisfy our need for 5G speed.
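As a back-of-envelope check on what the 1Gbps-to-10Gbps jump means in practice, consider an idealized transfer time calculation. The 4 GB video size below is a hypothetical example, and the math deliberately ignores protocol overhead and real-world radio conditions:

```python
def download_seconds(size_gb, link_gbps):
    """Idealized transfer time for size_gb gigabytes over a link_gbps link."""
    return size_gb * 8 / link_gbps  # 8 bits per byte; overhead ignored

# A hypothetical 4 GB high-definition video download:
print(download_seconds(4, 1))   # 4G-class 1 Gbps link  → 32.0 seconds
print(download_seconds(4, 10))  # 5G-class 10 Gbps link → 3.2 seconds
```

The tenfold speed claim translates linearly into a tenfold cut in transfer time under these idealized assumptions; the latency improvements the article mentions are a separate property and do not follow from bandwidth alone.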


The FCC's Attack On Net Neutrality

That net neutrality didn't harm sector investment isn't really debatable. Just ask industry executives from Frontier, Comcast, Cablevision, Sprint, AT&T, Sonic and even net neutrality public enemy number one, Verizon, all of whom are on public record telling investors the "net neutrality killed sector investment" claim simply isn't true. That this concept is a canard is also supported by public SEC filings and earnings reports, as well as the billions being spent on spectrum as these companies rush toward the fifth generation (5G) wireless networks of tomorrow. Most of the sector's dollar-per-holler economists just cherry-picked specific windows of time to track CAPEX increases and declines, intentionally ignoring that many of these changes have nothing to do with net neutrality, as well as numerous large-scale fiber deployments.


Monetizing the Internet of Things (IoT)


While it’s true that there are lots of insights at the edge, especially in enabling a new family of operational use cases, most of the insights at the edge are very tactical. Just as there is a big difference between data and analytics, there is an even more significant difference between insights and action, which is the difference between having insights versus making insights actionable. Insights without action are…well, why bother. Insights are not valuable until they are delivered to a user that can apply those insights to make better decisions. And for the more operational and strategic use cases, those decisions happen at the core and cloud. ... Ultimately, the goal of any IoT initiative should be to couple these new sources of IoT data with advanced analytics to power the business. We can use the Big Data Business Model Maturity Index as a guide to help us to create an IoT Data Monetization Roadmap


Capital markets transformation: evolution not revolution

Fintech is no longer seen as the enemy. Capital markets organizations are now willing to collaborate more to deliver some much-needed innovation. They look at co-innovation to maximize budgets and decrease risk while leaving in-house resources working on key differentiating projects rather than custom development and on-premises capabilities. Gone is the rhetoric of "us vs. them" as investment banks and asset managers take their place in a new ecosystem and reinvent themselves as technology companies. Over the past 20 years, electronification has gained importance for trading and lifecycle activities, but capital markets firms must consider the changing habits and behaviours of their clients and own personnel. They also must contend with new players benefitting from lower technology costs to develop new solutions and change current practices.


The Incredible Convergence Of Deep Learning And Genomics


Out on the road, we were having a pretty good run. Convincing people was an uphill battle, but we leveraged the least intuitive aspect of Chromputer — the ATAC-seq 2D data — to demonstrate the utility of the model. We would do so with an “AI vs human” exercise. With about 100 pixels out of 600,000 pixels “on” in this sparse input data, it is virtually impossible to guess what this image-like signal means. But check out Chromputer transforming this data into a representation that maps to an enhancer — a DNA control element that can affect gene expression from as far as a million base pairs away. Pretty impressive. With the representation in the 4th layer, I could guess this is not a CTCF, a protein responsible for the genome’s 3D architecture, because it is typically surrounded by positioned nucleosomes indicated by circular clusters in the bottom half of the image.



Quote for the day:


"Know that the thing that is easiest to do is rarely the thing that is best to do." -- Robin Sharma