Daily Tech Digest - October 03, 2017

Compliance being ignored too often at critical early-stage fintech development

FinTech Compliance
Commenting on the issue, Phil Bindley, managing director at The Bunker, said: “Prioritising compliance gives early-stage Fintechs a significant head-start in getting to market faster. To comply with the financial services sector’s strict regulations, Fintechs must use data centres that not only guarantee UK data sovereignty, but conform to the most demanding industry standards. Navigating this landscape can be particularly challenging, as many Fintech businesses, while heavy in technology innovation, can benefit massively from service providers that are experienced in delivering technology and cyber security services in the financial services sector. That’s why it is crucial that they seek out partners with the relevant experience and expertise who can help them overcome these potential obstacles.”


Configuration management processes take down GRC challenges

The discovery information from configuration management tools can also uncover rogue equipment on the platform. Discoveries should show what assets appeared in the IT estate through shadow IT, so that operations admins can bring them under proper control. It can also flag things such as unauthorized Wi-Fi access points and other equipment that could grant malicious network access. Good configuration management processes also catalog user devices: tablets, smartphones, laptops and other computers on the network. Check the configuration of these devices as they touch the network, and grant access only if they meet a set of basic policies. For example, the device must have antivirus software installed or connect via a virtual private network.
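
That admission-control check can be sketched in a few lines of Python; the device fields and the policy set here are illustrative assumptions, not any particular NAC product's API:

```python
def grant_access(device):
    """Grant network access only if the device meets baseline policy:
    it must run antivirus software or connect via a VPN."""
    return device.get("antivirus", False) or device.get("vpn", False)

# Hypothetical devices touching the network
laptop = {"type": "laptop", "antivirus": True, "vpn": False}
tablet = {"type": "tablet", "antivirus": False, "vpn": False}
print(grant_access(laptop))  # True: meets the antivirus policy
print(grant_access(tablet))  # False: held back until compliant
```

In practice the policy list would be larger (OS patch level, disk encryption, and so on), but the shape stays the same: evaluate each device as it touches the network, and grant access only on a pass.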


What on Earth is ‘RegTech’? Why is it the Next Big Thing in Banking?

Like many a dysfunctional family, the key to greater harmony is communication. The UK’s Financial Conduct Authority (FCA) has launched an industry sandbox for exactly that purpose, offering a forum for continuous feedback between fintechs, incumbents, regulators – and RegTech. RegTech, or regulation technology, translates complex regulation into API code. It streamlines burdensome compliance processes to keep both risk and human resources low. And there’s an urgent need for it: startup fintech providers simply don’t have the means to hire an army of compliance officers. With new regulatory technology, they don’t have to. Innovations including machine learning, biometrics and distributed ledgers help ensure compliance with fewer resources, and the benefits are significant.


How Serverless Changes Cloud Computing

Truth be told, many enterprise IT shops were so happy to get out of managing physical servers within a data center that many limitations of the existing public IaaS clouds were forgiven. However, now that we’ve lived a few years with public IaaS clouds, developers and CloudOps pros are giving a huge thumbs down to the constant monitoring of servers, provisioned or not, that’s required to support the workloads. Two things happening with traditional IaaS contribute to the problem. First, teams over-provision the servers needed, going for a “You can’t have too many resources” model. Or, second, they do not provision enough resources, and instead go for a “Make them ask for more” model. Both are the wrong approaches. While estimates vary, the provisioning of public IaaS cloud resources over what’s actually needed runs at almost 40 percent.


What is a chief digital officer? A digital strategist and evangelist in chief

While other tech-related chief titles have a clearer path to the role, chief digital officers can come from many different backgrounds, he says. They may have technology backgrounds, data science backgrounds, marketing backgrounds, or they may come from consulting or research firms. “Sometimes it’s a good strategy person,” he says. “It depends what the organization needs.” “Often, it has to do with someone’s ability to influence others,” adds Mike Doonan, partner at executive search firm SPMB. “They’re usually coming into an old-line company that’s used to doing things one way. This is the one intangible I advise my clients to look for — you want someone who’s a visionary but also someone who understands people can’t absorb that vision all at once.”


Comparison API for Apache Kafka

With the demand for processing large amounts of data, Apache Kafka has become a standard message queue in the big data world. Apache Kafka is publish-subscribe messaging rethought as a distributed, partitioned, replicated commit log service, and it has a lot of convenient APIs for many languages. ... Integrating Spark Streaming and Kafka is incredibly easy. Your middleware, backend (proxy-like) services, or IoT devices can send millions of records per second to Kafka while it handles them efficiently. Spark Streaming provides simple parallelism, a 1:1 correspondence between Kafka partitions and Spark partitions, and access to offsets and metadata. To begin, we need to pass Kafka’s parameters to Spark — host, port, offset-committing strategy, and so on.
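
As a sketch of that setup step, here is what the Kafka parameter map might look like on the Spark side. The broker hosts, group id, and topic name are placeholder assumptions, and the commented lines show roughly where the map would be handed to pyspark's (now legacy) DStream Kafka API:

```python
# Kafka connection parameters for a Spark Streaming consumer.
# Hosts, ports, and topic names below are placeholders.
kafka_params = {
    "bootstrap.servers": "kafka1.example.com:9092,kafka2.example.com:9092",
    "group.id": "spark-consumer-group",
    "auto.offset.reset": "earliest",  # offset strategy: start at the oldest record
    "enable.auto.commit": "false",    # commit offsets manually after processing
}
topics = ["sensor-events"]

# With pyspark available, the direct stream would be created roughly like:
# from pyspark.streaming.kafka import KafkaUtils
# stream = KafkaUtils.createDirectStream(ssc, topics, kafka_params)

print(sorted(kafka_params))
```

The direct-stream approach is what gives the 1:1 partition correspondence mentioned above: each Kafka partition maps to one Spark partition, and offsets are available to the application for manual commit.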


Equifax breach bigger than first reported

The estimated impact of the breach increased following investigations by cyber security firm Mandiant, but Equifax said forensic investigators have not found any evidence of new or additional hacking activity or unauthorised access to new databases or tables. Equifax previously disclosed that about 400,000 consumers in the UK and 100,000 in Canada may also have been affected by the breach, but now it says it believes only 8,000 Canadians were affected. The company said the forensic investigation related to UK consumers has been completed and the resulting information is now being analysed in the UK. “Equifax is continuing discussions with regulators in the UK regarding the scope of the company’s consumer notifications as the analysis of the completed forensic investigation is completed,” it said.


Nationwide CIO readies IT workforce for 'inevitable future'

We're thinking and driving a level of automation of the work we do beyond anything we've done before. So, for infrastructure professionals, I'm asking them to drive what we do to the cloud and toward automation. I'm asking them to dramatically change how we work. It's a structure where professionals need to have skills that look more like application development professionals have -- they have to write code and treat code like an asset and watch it evolve over time. That's a different skill that we asked infrastructure people to have than in the past. It changes how people do the work and the work we ask them to do. It really requires a nimbleness and constant curiosity and willingness to continue to evolve skills. It's a different mindset than what IT demanded previously, when the skills you nurtured lasted for a long period of time.


The Value of Fog & Edge Computing

Fog colocates computing with the data, pushing intelligence and processing capabilities closer to where the data originates. Fog differs from Edge Computing in that fog has an association with cloud services. Data is processed and stored at a fog node, and pertinent data is transmitted back. There can be multiple fog nodes between the actual sensor device and the cloud data center itself. Fog devices perform all the actions of an Edge Computing device, but are flexible in partitioning workloads between the fog nodes and cloud data centers. Fog Computing also offers the benefits of well-defined software frameworks, making the fog and cloud transparent to the user and developer.
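
That workload-partitioning decision can be sketched as a simple placement rule. The task fields and thresholds below are illustrative assumptions, not any real fog framework's API:

```python
def place_workload(task):
    """Decide where a fog node should run a task: latency-critical work
    stays on the fog node; heavy analytics are forwarded to the cloud."""
    if task["max_latency_ms"] < 100:
        return "fog"    # too latency-sensitive for a cloud round trip
    if task["cpu_demand"] > 0.8:
        return "cloud"  # heavy processing belongs in the data center
    return "fog"        # default: keep work close to the data

print(place_workload({"max_latency_ms": 20, "cpu_demand": 0.9}))   # fog
print(place_workload({"max_latency_ms": 500, "cpu_demand": 0.9}))  # cloud
```

A real framework would weigh more factors (bandwidth cost, node load, data gravity), but the flexibility described above is exactly this kind of per-task placement choice between fog nodes and the cloud.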


Office 2019 is coming, and here's what you need to know

The feature set may not be revealed until mid-2018, when Microsoft releases a preview of the suite. For his part, Spataro hinted at some of what will make it into Office 2019, calling out such features as Ink replay in Word and Morph in PowerPoint, which have been available to Office 365 subscribers for one and two years, respectively. ... There's little to no chance that Office 2019 will include any groundbreaking new features. Why? Because the perpetually-licensed version of the suite is built by taking the accumulated changes since the predecessor appeared — the changes issued to Office 365 subscribers over the past several years. Microsoft will take the version of Office 2016 that Office 365 ProPlus users have in, say, the spring of 2018 — and that version of Office 2016 is different than the 2015 version of Office 2016 sold as a one-time purchase — freeze the code, and call it Office 2019.



Quote for the day:


"Always do right. This will gratify some people and astonish the rest." -- Mark Twain


Daily Tech Digest - October 02, 2017

5 IT Practices That Put Enterprises at Risk

The average enterprise generates nearly 2.7 billion actions from its security tools per month, according to a recent study from the Cloud Security Alliance (CSA). A tiny fraction of these are actual threats — fewer than 1 in 100. What’s more, over 31% of respondents to the CSA study admitted they ignore alerts altogether because they think so many of the alerts are false positives. Too many incoming alerts create a general sense of overload for anyone in IT. Cybersecurity practitioners must implement a better means of filtering, prioritizing, and correlating incidents. Executives should have a single platform for collecting data, identifying cyber attacks and tracking resolution. This is the concept of active response — not only identifying threats, but being able to respond to them immediately as well.
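
A minimal sketch of that filtering, deduplication and prioritization step, with the alert field names assumed purely for illustration:

```python
def triage(alerts, min_severity=7):
    """Filter low-severity alerts, drop duplicates, and return the rest
    sorted so analysts see the highest-severity incidents first."""
    seen = set()
    kept = []
    for a in alerts:
        key = (a["source"], a["signature"])
        if a["severity"] >= min_severity and key not in seen:
            seen.add(key)
            kept.append(a)
    return sorted(kept, key=lambda a: a["severity"], reverse=True)

alerts = [
    {"source": "ids", "signature": "port-scan", "severity": 3},
    {"source": "waf", "signature": "sqli", "severity": 9},
    {"source": "waf", "signature": "sqli", "severity": 9},   # duplicate
    {"source": "edr", "signature": "ransomware", "severity": 10},
]
print(triage(alerts))  # two alerts survive, ransomware first
```

Real platforms also correlate related alerts into incidents; the point of the sketch is that even crude filtering collapses billions of raw actions into a reviewable queue.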


4 Lesser-Known Ways Artificial Intelligence Is Changing Business Today

As the field of AI continues to innovate, and machines and systems become more capable, technological solutions that used to be considered futuristic AI, like optical character recognition, have become routine -- effectively losing their "AI" status. Other technologies yet to be conquered -- like driverless cars and the artificial re-creation of human speech -- are still being developed as AI. Many futurists have talked about the dangerous possibility that AI machines will eventually take control of humanity and destroy the world. Even though most of these prognostications mix speculation and superstition, this school of thought has persevered -- consider the news of some successful Turing Test exercises. However, many AI researchers and scientists have refuted this stance, saying that, ultimately, AI is simply a very effective tool for processing, analyzing and comprehending massive amounts of actual human data.


It’s Time to Digitize Small Business Banking

It’s not just the US that lags behind in servicing SMB customers. Avoka’s report found that the lack of attention to the small business banking opportunity is consistent worldwide. This seems like madness when SMBs account for nearly half of US and UK revenue (48%). Not to mention that American SMBs account for about one-third of total US goods trade, and expect to continue expanding to new markets in the coming years. The time is ripe for banks to make their processes and applications more digitally accessible for the small business banking audience. The guiding principle for banks must be to make their services easier to access, easier to understand and easier to navigate. Compare this with the current situation: cumbersome form completion and waiting in lines at physical branch locations with limited business hours.


Digital transformation: Your career at a crossroads

A central issue for many IT leaders is the simple fact that IT work is significantly different from what it was when they were moving up the ranks. So too are the expectations and work methodologies of those who make use of information technology in today’s workplace. "I think the most challenging part for CIOs that did not grow up in the dotcom world is to understand the behavior patterns around information creation, consumption and distribution, as well as engagement for next-generation consumers and employees,’’ says Ari Lightman, a digital media and marketing professor at Carnegie Mellon University. IT leaders need to rethink legacy models around command and control, IT service levels, access and permissions, and application vetting and testing, Lightman says.


Could Microsoft Teams replace Outlook and Yammer?

"As companies adopt Teams, they would see a significant decline in the amount of internal email," he said. "They'd probably still use a lot of external email. Messages to people outside your organization are probably best-suited for email." UC industry analyst Dave Michels agreed that Teams could replace Outlook: "Teams could easily replace Yammer, as well as Outlook, and I would not eliminate that as a future possibility," he wrote in an email. "Many of the competitive messaging apps play up the end of email. While that's an admirable goal, it's not realistic because of limited interoperability and federation. Microsoft isn't preaching the end of email, so it's surprising it's not offering a single client approach to communications."


Artificial intelligence is about the people, not the machines

“If a machine comes up with an algorithm and you don’t have a deep understanding of the appropriate cause and effect relationship, then things get very dangerous,” Dalio explained. “If the future is different from the past, you’ll probably crash.” Most data scientists today agree that it’s important to have some domain experience with the problem you’re trying to solve before you throw machine learning at it. This is important so that, say, weed-plucking robots don’t get distracted by morning dew they never accounted for. Or, in the case of Bridgewater, understanding is important to ensure that decisions aren’t made without an anchor to reality. It’s for this reason that Dalio believes that the future of artificial intelligence will rely on humans. In his book, he notes that the day when a computer will be able to generally outperform a human without a human’s help is far away.


Setting Digital Credos to Guide Through Digital Transformation

The digital credos are the top principles to guide changes and digital transformation in the organization. They are not static rules or rigid processes that stifle innovation. Instead, they are the philosophy behind the methodologies, and they are the mindsets behind behaviors and actions. First, it is important to gain the knowledge necessary to understand and manage complex systems. Second, and most challenging, is to understand how the people factor affects the system, and then to manage the complex system and its people. Digital transformation flattens the organizational hierarchy and blurs geographical, functional, organizational, and even industrial borders; it could mean less restrictive rules and fewer silos.


Artificial intelligence is changing the rules of account identification


As companies start to show higher levels of intent, marketers can immediately prioritize and align sales and marketing resources to engage and convert them. On the flip side, if a target account’s intent level decreases, they can easily move the account into a nurture stream and advise sales to follow up at a later date. With an evergreen, dynamic list, marketers no longer have to worry about missing out on accounts showing interest in their company and solutions. Instead, they can be proactive and reach buyers early on in the buying cycle with relevant, engaging messages. But the key to really incorporating this type of dynamic list into your ABM strategy is automation. With AI technology, marketers can incorporate audiences showing initial signs of intent and automatically trigger advertising campaigns or deliver personalized messaging to start those relevant conversations earlier in the buying cycle.
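
The routing logic behind such a dynamic list can be sketched very simply. The intent threshold of 0.6 and the field names below are assumptions for illustration, not any ABM platform's actual schema:

```python
def route_account(account, threshold=0.6):
    """Move an account between streams based on its current intent score:
    high intent goes straight to sales, low intent back into nurture."""
    if account["intent_score"] >= threshold:
        return "sales-priority"  # engage and convert now
    return "nurture"             # advise sales to follow up later

accounts = [
    {"name": "Acme Corp", "intent_score": 0.82},
    {"name": "Globex", "intent_score": 0.35},
]
for a in accounts:
    print(a["name"], "->", route_account(a))
```

The "evergreen" part of the strategy is simply re-running this routing whenever intent scores update, so accounts flow between streams automatically rather than sitting on a static list.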


Is CI Part Of A Basic Developer Setup?

Does the basic developer setup change when we are working on a solution with a different architecture using different development methods? Let’s consider a Ruby-based microservices solution supported by a team that uses contemporary development methods. Here, the solution is divided into multiple small code bases. Even if all of them are contained within the same project for version control purposes, there are no source-level dependencies among different components of the solution. Common code is factored out into reusable libraries. By “contemporary methods” I mean developers work in a collaborative fashion most of the time, and individual work is the exception. It also suggests a rigorous test-first approach to modifying code and a strong emphasis on test automation at all levels of abstraction.


IoT security suffers from a lack of awareness

The problems will mount as new devices emerge and they, along with the sensors and software used in conjunction with them, get cheaper and last longer. “You don’t have the same ecosystem of upgrade in terms of patching, devices and operating system -- none of these things that in a computer world makes them better,” Schneier says. “When your furnace becomes part of the IoT and they say you have to replace the hardware on your furnace every two years... people are not going to do it.” Assigning fault also plays a big hand in the complex market dynamics. When a perpetrator infiltrates a network through a software vulnerability, we point to the flawed software. But with connected devices forming what is essentially a digital daisy chain, it is difficult to attribute fault.



Quote for the day:


"To be upset over what you don't have is to waste what you do have." -- Ken S. Keyes, Jr.


Daily Tech Digest - October 01, 2017

Fintech players will get a big opportunity


“In fact, it can give [fintech players] good business,” said Arun Jain, chairman and managing director, Intellect Design Arena Ltd. “Fintech companies should get better opportunities, not immediately, but in the next two-three years. The positives will be much better compared with where we are today,” he said. Bank mergers, for one, could mean fewer large clients to sell to, according to him. “Mergers would reduce the number of banks in operation which, in turn, would de-congest the market, with a handful of fintech players vying to sell the same solutions to a limited number of banks.” As per indications emerging from the Centre, three or four banks could be merged with a stronger bank in a bid to address the issue of burgeoning non-performing assets that has been plaguing the sector for a while now.



What Are the Real-World Business Needs That AI Can Help Solve?

The greatest nightmare for financial services enterprises is any breach of policy, regulation, or security. These companies have massive investments in these areas so that breaches don’t happen. AI-enabled applications can help to keep strict regulatory oversight, ensuring that all policies, regulations, and security measures are sincerely followed while designing and delivering any financial service. AI tools can also learn and monitor users’ behavioral patterns to identify anomalies and warning signs of fraud attempts and occurrences, along with the evidence necessary for convictions in a court of law. ... Fraudulent claims are widespread for insurance firms. Around one out of every ten insurance claims is found to be fraudulent. Insurance organizations spend millions to identify and detect these frauds.
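
As a toy stand-in for the learned behavioral models described here, a simple z-score check against a user's past transaction amounts captures the idea of flagging behavior that deviates from an established pattern:

```python
from statistics import mean, stdev

def is_anomalous(history, amount, z_threshold=3.0):
    """Flag a transaction whose amount deviates more than z_threshold
    standard deviations from the user's past spending pattern."""
    mu, sigma = mean(history), stdev(history)
    return abs(amount - mu) > z_threshold * sigma

# Hypothetical past transaction amounts for one user
history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0]
print(is_anomalous(history, 50.0))    # typical spend: not flagged
print(is_anomalous(history, 5000.0))  # extreme outlier: flagged for review
```

Production fraud systems learn far richer features (merchant, location, timing, device), but the core mechanic is the same: model normal behavior, then flag deviations for investigation.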


Where human intelligence outperforms AI


A search engine query can quickly tell you a lot about VLC, its history, a few of the major players, and some published research in the field. But to make a business decision about whether to invest tens of millions of dollars in developing and marketing VLC products, Philips needed the experience, insight, and business judgment of human experts who could assess the size and scope of the market opportunity as well as the best “white space” innovation areas for the firm. Bet-the-company decisions like that should not be left to an algorithm, said Philips’s Hinman. “AOP produced actionable intelligence that enabled us to make informed decisions regarding innovation focus, invention generation, and potential acquisitions.” To be sure, the robust AI systems now being designed and implemented do more than simply answer search queries.



How psychology is shaping better machine learning

A simple way to make bots work in your favour is to turn the FAQ section of your website into an interactive question-and-answer bot conversation your customers can engage with to quickly find a solution, Millward suggested. “You need to think about whether leveraging a bot actually adds value – it might not necessarily work for customers with complex complaints,” she said. “But if you can translate your FAQs into an interactive chat and the bot answers the questions your customers ask – then it could work, as it gets the answer quickly to your customer.” AI is also currently working well in a customer service ‘triage’ environment, Millward said. While it might not offer all the answers to customer queries, it can direct the customer down the right channel, whether it is to a bot or a human.
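
Turning an FAQ into a minimal question-and-answer bot with a human-handoff fallback might look like this sketch; the FAQ entries and the 0.6 match cutoff are assumptions for illustration:

```python
import difflib

# A hypothetical FAQ, reduced to question -> answer pairs
faq = {
    "how do i reset my password": "Use the 'Forgot password' link on the sign-in page.",
    "what are your opening hours": "Support is available 9am-5pm, Monday to Friday.",
    "how do i cancel my order": "Orders can be cancelled from your account page within 24 hours.",
}

def answer(question):
    """Match the customer's question against known FAQ questions;
    hand off to a human when no close match is found (the 'triage' idea)."""
    match = difflib.get_close_matches(question.lower(), faq.keys(), n=1, cutoff=0.6)
    return faq[match[0]] if match else "Let me connect you to a human agent."

print(answer("How do I reset my password?"))
```

Commercial bots use intent classifiers rather than string similarity, but the triage shape is identical: answer what you can match confidently, and route everything else to a person.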


British workers would be happy to have a robot boss

With tech boffins like Bill Gates saying they want the robot bosses of the future to be taxed, it would appear most Brits also agree. The poll found that the majority of working Brits (57 per cent) believe that robot bosses should pay tax, agreeing with the statement ‘if they’re replacing the role of a person, the company owning the robot should be taxed the same.’ However, on the contrary, 43 per cent feel that robot bosses shouldn’t pay tax as ‘it would set a precedent, as other technology doesn’t get taxed like a person.’ Ed Molyneux, CEO and co-founder of FreeAgent – who provide award-winning cloud accounting software for freelancers, micro-businesses and accountants – says, ‘Although it might be many years before we see physical robots taking over the workforce, many workers are already anticipating the changes that automation will bring in the years ahead.’


Is Blockchain Technology Really the Answer to Decentralized Storage?

Storing data on a Blockchain like Bitcoin would be doable, in theory. However, Bitcoin’s current block size limit only allows for 1MB of data to be stored every 10 minutes. Even if you remove that limit, nodes will eventually stop being able to maintain a copy of the Blockchain due to its size, resulting in a centralized and easily disrupted network. Of course, the scalability problem hasn’t deterred developers from trying to use the Blockchain as a storage solution, and a project called Archain may just have found one. Archain is a cryptocurrency project that wants to address online censorship by creating a decentralized archive for the internet. To do so, Archain will leverage a new Blockchain-derivative data structure, the "blockweave", which according to the whitepaper allows the network to scale to an “arbitrary size.”
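
A quick back-of-envelope shows why even the capped rate grows without bound: at 1MB per 10-minute block, every full node must absorb roughly 52GB of new chain data per year, forever:

```python
# Growth of a chain capped at 1 MB per 10-minute block
blocks_per_day = 24 * 60 // 10      # 144 blocks per day
mb_per_year = blocks_per_day * 365  # 52,560 MB per year
print(f"~{mb_per_year / 1000:.1f} GB of new chain data per year")  # ~52.6 GB
```

Lift the cap to make the chain useful as general storage and that figure scales linearly with block size, which is exactly the point about nodes eventually being unable to keep a full copy.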


3 Ways Blockchain Will Transform the Internet of Things

Companies can improve the security of transactions among IoT devices by establishing online reputation systems. For example, the automotive industry faces this issue with the proliferation of security vulnerabilities in connected devices, as IoT devices from different manufacturers used with cars do not necessarily have the same security measures. Businesses and clients can verify the validity of the person or system trying to access the connected car via blockchain technology that establishes a reputation system online. A reputation system for connected devices helps to establish trust based on past transactional history, reducing risk and thereby improving security. This becomes even more important as autonomous and connected car adoption increases to improve productivity and the on-time delivery of online orders.
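
A toy version of such a reputation score, derived purely from past transactional history; the transaction fields and the 0.9 trust threshold are illustrative assumptions, not an on-chain protocol:

```python
def reputation(history):
    """Score a device as the share of its past transactions that were
    verified successfully; unknown devices start untrusted."""
    if not history:
        return 0.0
    return sum(1 for tx in history if tx["verified"]) / len(history)

def trust(device_history, threshold=0.9):
    """Admit a device to a transaction only above the trust threshold."""
    return reputation(device_history) >= threshold

# Hypothetical device with 9 verified transactions out of 10
history = [{"verified": True}] * 9 + [{"verified": False}]
print(reputation(history))  # 0.9
print(trust(history))       # True
```

The blockchain's role in the real system is to make that history tamper-evident, so no single manufacturer or attacker can rewrite a device's track record.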


9 Ways to Lead as a QA Manager

Getting the latest and greatest certifications, staying current with IoT testing and TDD/BDD, or learning a new programming language: all of these are likely to come to mind as ways to develop new skills, stand out in your department, and have a bigger impact. But what if you felt like you already had all the skills you needed? What if your whole team felt this way? When QA managers empower their teams to come to the realization of “I have all the skills I need and now it’s time to apply them,” it’s not for the purpose of egomaniacal overconfidence, but for choosing to up-level the softer skills. In addition to technical professional development, QA managers and test leads must work on their ability to create a valuable, creative team. That happens through a mix of understanding one’s own role, and knowing when to step back and when to step in.


IoT Security: The EdgeX Advantage

Created to build an open framework for IoT edge computing, EdgeX Foundry addresses the risks created by IoT’s complex interplay between multiple devices, connectivity protocols, applications and tools. This complexity is already fragmenting the nascent IoT market into competing segments, each promoting a different set of standards and frameworks. The resulting lack of a common framework makes it increasingly difficult and costly to develop pluggable services for capabilities such as security and management in a consistent and interoperable way. EdgeX Foundry couldn’t come at a better time, as architectural models such as fog computing emerge to bring data collection, storage and compute closer to data in devices and sensors.


Behind the glare of recent hacks, some companies actually paying homage to data protection

While ex-Equifax CEO Richard Smith recently said the thought of a hack kept him up at night, it seems his words were more a revelation that he was sleeping during the work day. Meanwhile, Jim Routh has been wide-awake during his day job as Aetna's chief information security officer. He is overseeing a new authentication system to replace passwords and providing a bright spot for a health-care industry often criticized for its inadequate security. ... And Google increased the lumens shining on its security game, according to news reports, with a forthcoming hardware-backed authentication system using cryptography to protect at-risk users such as corporate executives, politicians and others with heightened security profiles.



Quote for the day:


"Tact is the ability to make a person see lightning without letting him feel the bolt." -- Orlando A. Battista


Daily Tech Digest - September 30, 2017

Securing Applications: Why ECC and PFS Matter

Many of us are familiar with Hypertext Transfer Protocol Secure (HTTPS), which uses a cryptographic protocol commonly referred to as Transport Layer Security (TLS) to secure our communication on the Internet. In simple terms, there are two keys: one available to everyone via a certificate, called a public key, and the other available to the recipient of the communication, called a private key. When you want to send encrypted communication to someone, you use the receiver’s public key to secure that communication channel. ... The benefit of securing our communication to prevent snooping of sensitive data is obvious; however, encrypting the communication has its downside – it’s computationally expensive and requires a lot of CPU processing to enable, plus encrypted communication may be used in malicious ways to send proprietary information.
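
The forward-secrecy (PFS) piece in the title rests on ephemeral key exchange rather than the long-lived certificate key. Here is a toy finite-field Diffie-Hellman round trip; the prime is far too small for real use and real TLS uses large named groups or elliptic curves, but it shows the mechanic: both sides derive the same secret, and discarding the ephemeral values afterwards means past sessions cannot be decrypted even if a long-term key later leaks:

```python
import secrets

# Toy Diffie-Hellman parameters -- a published prime and generator.
# This 32-bit prime is for illustration only, never for real traffic.
p = 0xFFFFFFFB
g = 5

a = secrets.randbelow(p - 2) + 1  # client's ephemeral secret, discarded after use
b = secrets.randbelow(p - 2) + 1  # server's ephemeral secret, discarded after use

A = pow(g, a, p)  # client sends A to server
B = pow(g, b, p)  # server sends B to client

shared_client = pow(B, a, p)  # client computes g^(ab) mod p
shared_server = pow(A, b, p)  # server computes the same value
print(shared_client == shared_server)  # True: both sides share a secret
```

ECC delivers the same property with much smaller keys and less of the CPU cost the excerpt complains about, which is why ECDHE cipher suites dominate modern TLS deployments.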


DNSSEC key signing key rollover: Are you ready?

“There may be multiple reasons why operators do not have the new key installed in their systems: some may not have their resolver software properly configured and a recently discovered issue in one widely used resolver program appears to not be automatically updating the key as it should, for reasons that are still being explored,” ICANN says. It could also be an awareness issue—that enough operators were not aware of the deployment process. “ICANN is on schedule to begin using the private portion [for signing domains] shortly,” Vixie says. The most challenging part of this multistep, multi-year process was overseeing the plan’s development, seeking broad review and approval, and obtaining approvals from multiple internet governance organizations to execute the plan, Vixie says.


Finally, a Driverless Car with Some Common Sense

A lack of commonsense knowledge has certainly caused some problems for autonomous driving systems. An accident involving a Tesla driving in semi-autonomous mode in Florida last year, for instance, occurred when the car’s sensors were temporarily confused as a truck crossed the highway. A human driver would have likely quickly and safely figured out what was going on. Zhao and Debbie Yu, one of his cofounders, show a clip of an accident involving a Tesla in China, in which the car drove straight into a street-cleaning truck. “The system is trained on Israel or Europe, and they don’t have this kind of truck,” Zhao says. “It’s only based on detection; it doesn’t really understand what’s going on,” he says. iSee is built on efforts to understand how humans make sense of the world, and to design machines that mimic this.


Banking on machine learning

Machine learning refers to the use of mathematical and statistical models to teach machines about new phenomena. It involves ingesting raw information in large datasets, understanding patterns and correlations and drawing inferences. While this may seem similar to how humans learn, machine learning algorithms ‘learn’ at much faster speeds with the ability to adapt from mistakes and course-correct. Needless to say, there are numerous applications of ML in any banking field that requires repetitive work, high-accuracy tasks or even informed decision-making. Take data security, which is a key concern for banks. Deep Instinct, a cyber security company that leverages deep learning for enterprise security, states that new malware often contains code that is similar to previous versions.
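
Deep Instinct's observation that new malware often reuses code from earlier variants can be illustrated with a crude similarity check; the two "samples" below are made-up strings standing in for code, not real malware signatures:

```python
import difflib

# Made-up stand-ins for a known sample and a newly observed variant
known_malware = "connect(c2);download(payload);exec(payload);"
new_sample = "connect(c2);download(payload2);exec(payload2);sleep(60);"

# Ratio of matching content between the two samples (0.0 to 1.0)
ratio = difflib.SequenceMatcher(None, known_malware, new_sample).ratio()
flagged = ratio > 0.7  # crude threshold: likely a variant of known malware
print(f"similarity: {ratio:.2f}, flagged: {flagged}")
```

Real deep-learning detectors learn these similarities from raw bytes and features rather than string matching, but the underlying signal is the same: new samples rarely start from scratch.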


The business case for digital supply networks in life sciences


Unlike traditional supply chains, which are linear and siloed, digital supply networks are dynamic, interconnected systems that can more readily incorporate ecosystem partners and evolve over time. This shift from linear, sequential supply chain operations to an interconnected, open system of supply operations could lay the foundation for how life sciences companies compete in the future. Digital supply networks in life sciences can address challenges with optimal management of inventories, reliability, and visibility of products moving across the supply chain, or operations efficiencies and product yields. In view of the forces affecting life sciences—pricing pressures, the emergence of value-based and personalized medicine, and the expectations of customers and regulators—creating a life sciences digital supply network can be a logical new opportunity to deliver value.


6 ways to make sure AI creates jobs for all and not the few

Whenever I talk to people about the potential impact of artificial intelligence (AI) and robotics, it’s clear there is a lot of anxiety surrounding these developments. And no wonder: these technologies already have a huge impact on the world of work, from AI-powered algorithms that recommend optimal routes to maximize Lyft and Uber drivers’ earnings; to machine learning systems that help optimize lists of customer leads so salespeople can be more effective. We’re on the verge of tremendous transformations to work. Millions of jobs will be affected and the nature of work itself may change profoundly. We have an obligation to shape this future — the good news is that we can. It’s easier to see the jobs that will disappear than to imagine the jobs that will be created in the future but are as yet unknown.


Free ebook: Data Science with Microsoft SQL Server 2016


SQL Server 2016 was built for this new world, and to help businesses get ahead of today’s disruptions. It supports hybrid transactional/analytical processing, advanced analytics and machine learning, mobile BI, data integration, always encrypted query processing capabilities and in-memory transactions with persistence. It integrates advanced analytics into the database, providing revolutionary capabilities to build intelligent, high performance transactional applications. Imagine a core enterprise application built with a database such as SQL Server. What if you could embed intelligence, i.e. advanced analytics algorithms plus data transformations, within the database itself, to make every transaction intelligent in real time? That’s now possible for the first time with R and machine learning built into SQL Server 2016.


Cloud Computing Security: Provider & Consumer Responsibilities

The first step cloud service providers take is to secure the data center where they host the IT hardware for the cloud, protecting it against unauthorized access, interference, theft, fires, floods and so on. The data center is also secured to ensure redundancy in essential supplies (for example, power backup and air conditioning) to minimize the possibility of service disruption. In most cases, providers offer cloud applications from ‘world-class’ data centers. The cloud provider ensures that their infrastructure and services comply with critical protection laws such as data protection laws, the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), Criminal Justice Information Services (CJIS), the Sarbanes-Oxley Act, the Federal Information Security Management Act of 2002 (FISMA) and so on.


Want to be a better security leader? Embrace your red team

Successful business leaders understand the power of disruption as a pathway to anticipating unstated future customer needs. The concept of disruption as a force for innovation is powerful in the field of cybersecurity and often pushes business leaders to problem solve in new or unexpected ways. Proactively simulating attacks on your own organization is an excellent example.  With now-broad acceptance that attackers will get in and that compromise is expected, there are distinct advantages to being “productively paranoid.” Security leaders who are productively paranoid fully embrace the idea that the best way to play defense is to start playing offense. This doesn’t mean companies should “attack back,” but they need to understand the mindset and pathways attackers take to infiltrate organizations.


The digital workplace: 8 steps to greater agility, productivity

What is the digital workplace? It is a business strategy aimed at boosting employee engagement and agility through consumerization of the work environment, Rozwell says. Think of your one-size-fits-all-users ERP or expense management applications and imagine the opposite user experience. Your digital workplace should help individuals and teams work more productively without compromising operations. It should include computers, mobile devices and productivity and collaboration applications that are web-based and synch in real time. Such tools should, for example, mimic the ease of use of Uber and Airbnb and the social aspects of Facebook and Instagram. IBM, for one, has undertaken a massive transformation of its workplace to lure new tech talent.



Quote for the day:


"The most effective debugging tool is still careful thought, coupled with judiciously placed print statements." -- Brian Kernighan


Daily Tech Digest - September 29, 2017

10 Critical Security Skills Every IT Team Needs

As hackers become more sophisticated, and attacks more frequent, it’s no longer a matter of if your organization becomes a target, but when. That reality has forced many organizations to reassess how they address security efforts, and how best to allocate scarce resources toward mitigating the damage as quickly as possible. Here, having the right mix of security skills on board is key. “For a lot of our clients, they’re starting to realize that while they certainly want to hope for the best, they absolutely have to prepare for the worst,” says Stephen Zafarino, senior director of recruiting for IT recruiting and staffing firm Mondo. “Earlier this year, with the Chase and Home Depot breach, with the ransomware attacks on Britain’s NHS top-of-mind, everyone’s trying to figure out how to fortify defenses,” Zafarino says.


Why Data Governance Is Foundational for Data-Driven Success

Analytics governance ensures that all digital assets and activities that generate insights and information using analytics methods actually enable smarter business activities. Policies related to information relevance, security, visualization, data literacy, analytics model calibration and lifecycle management are key areas of focus. Data governance is focused on the data building blocks. Effective data governance brings together diverse groups and departments to enable the data-driven capabilities needed to achieve success. Data governance defines accountabilities, policies and responsibilities needed to ensure that data sets are managed as true corporate assets. This implies that governed data sets are identified, described, cataloged, secured and provisioned to support all appropriate analytics and information use cases required to enable the analytics methods.


It’s hangover time for enterprise cloud computing

We’re in the hangover stage of cloud computing, with IT pros comparing their giddy expectations with the reality on the ground. What I find most interesting about the 451 Research study is that enterprises see the value of the cloud, and are willing pay more for services that meet their expectations. But the cloud technology providers aren’t meeting those expectations, particularly around customer service.  This expectation gap has a historical cause: Enterprises are accustomed to large enterprise vendors with account executives who provide a “single throat to choke.” But cloud technology providers just began to answer their phones a few years ago, so this customer service stuff is still new to them. I’m also not surprised by the frustrations around cloud migration.


Perspective on Architectural Fitness of Microservices

Domain-Driven Design (DDD) is the latest methodology available to software professionals for designing a piece of software that matches the mental model of a problem domain. In other words, Domain Driven Design advocates modeling based on the practical use cases of the actual business. In its simplest form, DDD consists of decomposing a business domain into smaller functional chunks, possibly at either the business function or business process level, so that the complexity of both a business and problem domain can be better apprehended and resolved through technology. To this effect, figure 2 illustrates how the elements of the earlier business architecture meta-model collaborate to form two business domains. Because of the many documented implementation failures of Service Oriented architecture (SOA).
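The decomposition DDD advocates can be sketched in a few lines: each bounded context keeps its own model of a shared concept, and an explicit translator mediates between them instead of forcing one global model. All names here are invented for illustration.

```python
# Hypothetical DDD-style sketch: two bounded contexts model "customer"
# differently, and an anti-corruption layer translates between them.
from dataclasses import dataclass

# Billing context: cares about payment details only.
@dataclass
class BillingAccount:
    account_id: str
    outstanding_balance: float

# Support context: cares about contact history only.
@dataclass
class SupportContact:
    contact_id: str
    open_tickets: int

def billing_to_support(account: BillingAccount) -> SupportContact:
    """Anti-corruption layer: translate between context models explicitly."""
    return SupportContact(contact_id=account.account_id, open_tickets=0)

acct = BillingAccount("C-42", 99.50)
contact = billing_to_support(acct)
print(contact.contact_id)  # → C-42
```

Keeping the translation explicit is what lets each functional chunk evolve independently, which is the complexity-management benefit the excerpt describes.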


Why E-waste Should be at the Forefront of a Company’s Cybersecurity Plan


Some electronic devices, such as mobile devices, computers, and other items with storage ability can store valuable information that may be accessed by unauthorized individuals during the end of life process. That may pose a real cyber-security threat if such confidential information is stumbled upon by a cybercriminal. ... The fear of having their security breached via e-waste that is not properly handled has led to the increasing concern about potential exposure to cyber-security among electronics users. Of course, that makes everybody a victim. We all use one electronic product or another, whether at home or in the office. Therefore, we are always apprehensive of losing vital information such as credit card details, social security numbers, or other confidential and sensitive information to cyber-attacks.


Google Cloud IoT Core hits public beta, offers management for millions of devices

One of the biggest new features is the ability to bring your own certificate. Users can now bring their own device key Certificate Authority (CA), and Google Cloud IoT Core will verify the key in the authentication process. According to the release, this "enables device manufacturers to provision their devices offline in bulk with their CA-issued certificate, and then register the CA certificates and the device public keys with Cloud IoT Core." While the service will continue to support the MQTT protocol, it will also now support HTTP connections as well. By doing so, the release said, it will make it easier to inject data into GCP at scale. Additionally, the release noted, the service will now feature logical device representation for use cases where a business might need to retrieve the last state of a particular IoT device.
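For context, devices talking to Cloud IoT Core over MQTT identified themselves with resource-path client IDs and published telemetry to per-device topics. The string formats below follow the patterns Google documented at the time, but treat them as assumptions to verify against the current documentation rather than an authoritative reference.

```python
# Hedged sketch of the MQTT identifiers a Cloud IoT Core device used,
# built from registry metadata (formats are assumptions; check the docs).
def iot_core_client_id(project, region, registry, device):
    return (f"projects/{project}/locations/{region}/"
            f"registries/{registry}/devices/{device}")

def telemetry_topic(device):
    return f"/devices/{device}/events"

cid = iot_core_client_id("my-project", "us-central1", "my-registry", "dev-001")
print(cid)
print(telemetry_topic("dev-001"))
```

Whether a device connects over MQTT or the newer HTTP bridge, the same registry, device ID, and key material drive authentication, which is why bring-your-own-CA provisioning matters at manufacturing scale.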


How Your Company Can Close The Cybersecurity Skills Gap

"Looking at the other areas within your organization, you probably can... leverage some of that talent and create a rotation program, into a cyber team for three to six months," Worley said. “[Put] them with the right talent to help them, just like you would with an intern.” She said creating your own talent pools isn’t just useful for closing the skills gap; it can be extremely useful when a crisis happens. While no one wants to hear that a crisis is a good thing, Worley said the Equifax and SEC breaches do "raise the awareness of employees, because they've not been touched by this thing. It's another thing when ... your identity may be at risk. It become very personal at that point. Maybe we now have an opportunity to have that dialogue.” Another area where Worley said companies can close their cybersecurity skills gap seems like a simple one: make sure all employees know the best security practices.



Most companies operate within the descriptive and diagnostic stages, using basic data warehousing and BI approaches to get quick views on what HAS happened. Predictive analytics is when organizations project what WILL happen … graduating from rearview mirror to human intervention combined with the automation of repetitive patterns through the application of predictive machine learning (ML) models. So why are most companies not further along the analytics progression? Frankly, most enterprises are drowning in an abundance of data types and sources - many of which contradict each other as data size and ingestion rates are also on different levels. Moreover, many organizations are not taking advantage of new technologies that can unlock and manipulate data.
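The progression from descriptive to predictive can be made concrete with toy numbers: a rearview-mirror summary answers what HAS happened, while even a simple fitted trend projects what WILL. The figures below are invented purely for illustration.

```python
# Illustration of the descriptive -> predictive step using toy monthly sales.
sales = [100, 110, 120, 130, 140, 150]

# Descriptive: rearview-mirror summary of what has happened.
average = sum(sales) / len(sales)

# Predictive: fit y = a*x + b by least squares, forecast the next month.
n = len(sales)
xs = range(n)
x_mean = sum(xs) / n
y_mean = average
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean
forecast = slope * n + intercept

print(average)   # → 125.0
print(forecast)  # → 160.0
```

Production predictive ML models are far richer than a one-variable trend line, but the shift is the same: from summarizing history to projecting forward and automating the repetitive pattern.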


Cyber Attacks Demand a New Approach to Education

First and foremost is the need for a better educated cyber workforce. More needs to be done to lay a foundation of technical literacy through STEM (science, technology, engineering and math) education. Strengthening the quality of STEM education is vital, and the effort must go beyond simply meeting benchmarks such as proficiency on standardized tests. A more holistic approach to STEM should explore the practical relationships between these disciplines and daily life, thus nourishing in the next generation a technical curiosity that begins in early childhood and spans long careers. Such an approach will ensure that innovation and adaptability become second nature in our approach to cyber technology.


When disasters strike, edge computing must kick in

We've seen how mobile network operators (MNOs) are taking advantage of edge computing themselves. It’s used to reduce latency. Those phone companies are increasingly using local computing boxes (often inside their many buildings, left over from the days of copper-requiring phone switches, and on their towers) to store and process data rather than centralizing it. “This ability will give a huge advantage to first responders,” Georgia Tech says of its idea. The team of researchers published a paper (pdf) where they describe their “fog-enabled social sensing services” API. In the paper, the researchers describe how Docker-friendly fog nodes connect or relay the distributed social sensors — the smartphone-carrying civilians, in other words — to hardened routers that can perform edge data processing and be pinged locally.



Quote for the day:


"When we have belief the hard work follows naturally." -- Gordon Tredgold


Daily Tech Digest - September 28, 2017

Professor Harish Bhaskaran of Oxford, who led the team, said “The development of computers that work more like the human brain has been a holy grail of scientists for decades. Via a network of neurons and synapses the brain can process and store vast amounts of information simultaneously, using only a few tens of watts of power. Conventional computers can’t come close to this sort of performance.” Daniel C. Wright, a co-author from the Exeter team, added that “Electronic computers are relatively slow, and the faster we make them the more they consume. Conventional computers are also pretty ‘dumb,’ with none of the in-built learning and parallel processing capabilities of the human brain. We tackle both of these issues here — not only by developing new brain-like computer architectures, but also by working in the optical domain to leverage the huge speed and power advantages of the upcoming silicon photonics revolution.”


Before you deploy OpenStack, address cost, hybrid cloud issues

Training can become an indirect OpenStack cost. IT and developer staff may not have the requisite skill sets needed to tackle an OpenStack deployment. You may need to find more OpenStack-savvy staff to handle the job, spend the money to train up existing staff as Certified OpenStack Administrators, hire consultants to jump-start the work or some combination of these tactics. Consider the implications of OpenStack support. Organizations can certainly adopt a canned OpenStack distribution and associated support from vendors like Red Hat or Rackspace. As open source software acquired directly, however, there is no official support. If you choose to deploy OpenStack, assemble a suite of support resources to address inevitable questions or to resolve problems. Some resources are free, while other resources will incur added costs.


To combat phishing, you must change your approach

The threat surface is growing, and cybercriminals are becoming more sophisticated. They’re utilizing threat tactics that have made it increasingly difficult for organizations to protect themselves at scale. Cyber criminals are putting pressure on businesses by increasing the volume of these kinds of targeted attacks, dramatically outpacing even the world’s largest security teams’ ability to keep up. Visibility is sadly lacking within most of today’s organizations, and it’s unrealistic for security teams to secure something they can’t see. There’s no tool or widget that can totally fix this and make everything safe. But we can get to a point where we have the ability to construct a security program that reduces risk in a demonstrable way. We can establish metrics for where your risk profile is today.


Fintech’s future is in the back end

Fear that their money would ultimately be spent on on-premise, and therefore nonscalable, technology has been another reason investors have shied away from the opportunity. This fear arises from the tendency of institutions to want to keep a new technology “in the institution” because of security concerns. However, technology has matured enough to meet the reasonably strict security requirements banks impose on partners and vendors. Just six years ago, only 64% of global financial firms had adopted a cloud application, according to research from Temenos. But now, security has dramatically improved in cloud applications and banks are willing to adopt the technology at scale. This is evidenced in both cloud solution adoption and also the industry’s growing willingness to embrace an open banking framework.


WannaCry an example of pseudo-ransomware, says McAfee

WannaCry may have been a proof of concept, but the true purpose, he said, was to cause disruption, which is consistent with what researchers are learning when going undercover as ransomware victims on ransomware support forums. “When one of our researchers asked why a particular ransom was so low, the ransomware support representative told her that those operating the ransomware had already been paid by someone to create and run the ransomware campaign to disrupt a competitor’s business,” said Samani. “The game has changed. The reality is that any organisation can hire someone to disrupt a competitor’s business operations for less than the price of a cup of coffee.” In the face of this reality, Samani said the security industry and society as a whole has to “draw a line in the sand”.


The Digital Intelligence Of The World's Leading Asset Managers 2017

Where once the asset management sector was a digital desert, websites and social media channels abound. Whilst this represents genuine progress, the content and functionality within them leaves a lot to be desired in most cases. Quality search functionality is hard to find, websites resemble glorified CVs and blogs read like technical manuals. As for thought leadership, well there’s little thought and no leadership. Social media, especially Twitter and Linkedin, are swamped with relentless HR tweets and duplicate updates. It’s clear that asset managers are missing an opportunity to create content that resonates with FAIs and can build lasting two-way relationships. Over the following pages we present our findings in detail and take a closer look at the digital successes and failures within the world’s leading asset managers.


Heads in the cloud: banks inch closer to cloud take-up

On the one hand, cloud providers – such as the leader of the pack, Amazon Web Services – are likely to have security processes and technology that are at least as advanced as those of their banking clients, thanks to their technical expertise and economies of scale. On the other hand, providers can pass on a bank’s data or system management to yet another contractor, increasing security risks present in traditional outsourcing. The EU’s General Data Protection Regulation, coming into force next year, will up the ante on data security. The new rules require, among other things, that bank customers are able to request that their personal data held is deleted. One practical outcome, say lawyers, is that banks will have to clarify to cloud providers exactly how they should handle


Inside the fight for the soul of Infosys


Murthy criticized Sikka's pay and his use of private jets, and claimed that corporate governance standards had eroded during his tenure. Saying he could no longer run the company amid such criticism from a company founder, Sikka resigned as chief executive on Aug. 18 and left the board six days later. Three other directors followed him out the door, including the former chairman, R. Seshasayee. Murthy's criticisms haven't let up since Sikka's resignation. Speaking to shareholders on Aug. 29, he detailed his "concerns as a shareholder" over how the company's board members approved a severance package worth roughly 170 million rupees ($2.65 million) for former Chief Financial Officer Rajiv Bansal, who left the company in October 2015.


Should CISOs join CEOs in the C-suite?

A working partnership between the CIO and the CISO is clearly a successful formula, regardless of who reports to whom. “CISOs should report to the CEO with further exposure and responsibility to the board of directors,” says Alp Hug, founder and COO at Zenedge, a DDoS and malware protection vendor. “The time has come for boardrooms to consider cybersecurity a key requirement of every organization's core infrastructure along with a financial system, HRMS, CRM, etc., necessary to ensure the livelihood and continuity of the business.” If a board of directors says defending their organization against cyber crime and cyber warfare is a top priority, then they’ll demonstrate it by inviting their CISO into the boardroom. “Of course CISOs and equivalents will say they should report to the CEO,” says John Daniels


The ins and outs of NoSQL data modelling

Data modelling is critical to understanding data, its interrelationships, and its rules. A data model is not just documentation, because it can be forward-engineered into a physical database. In short, data modelling solves one of the biggest challenges when adopting NoSQL technology: harnessing the power and flexibility of dynamic schemas without falling in the traps that a lack of design structure can create for teams. It eases the on-boarding of NoSQL databases and legitimises the adoption in the enterprise roadmap, corporate IT architecture, and organisational data governance requirements. More specifically, it allows us to define and marry all the various contexts, ontologies, taxonomies, relationships, graphs, and models into one overarching data model.
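One concrete modelling decision the excerpt alludes to is embedding related data in a single document rather than joining normalized tables, and then imposing explicit rules on that dynamic schema. The field names below are illustrative, not taken from any specific NoSQL product.

```python
# Sketch of a denormalized document model: the customer and order lines are
# embedded in one document instead of living in separate joined tables.
order_document = {
    "order_id": "ORD-1001",
    "customer": {"name": "Acme Ltd", "tier": "gold"},  # embedded, not joined
    "lines": [
        {"sku": "A-1", "qty": 2, "unit_price": 9.99},
        {"sku": "B-7", "qty": 1, "unit_price": 24.50},
    ],
}

# A dynamic schema still benefits from explicit rules, e.g. derived totals.
total = sum(l["qty"] * l["unit_price"] for l in order_document["lines"])
print(round(total, 2))  # → 44.48
```

Capturing decisions like "customer is embedded in order" in a data model, rather than leaving them implicit in application code, is exactly the design structure the paragraph argues teams need.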



Quote for the day:


"If you realize you aren't so wise today as you thought you were yesterday, you're wiser today." -- Olin Miller


Daily Tech Digest - September 27, 2017

Google's Pixel phone has basically been a critical success from the start. Even a year after its debut, the phone has remained a standard of comparison to which all other Android devices are held – and generally not to their benefit. That's mostly due to the Pixel's prowess in areas where other Android device-makers can't (or won't) compete. Significant as that is, though, ask an average non-tech-obsessed smartphone user what they think about the Pixel – and all you're likely to get in response is a glossy-eyed stare. Google may be positioning the Pixel as a mainstream device and even marketing it as such, to a degree, but it hasn't yet managed to break through that Samsung-scented wall and make its phone impactful in any broad and practical sense.
The scan would be a type of authentication known as biometric security. Smartphone fingerprint scanners have long led the way in this market as one of the most popular methods. However, the inclusion of facial scanning in the iPhone X and some Android phones has furthered the conversation on what physical characteristics can be used to secure a computing device. In addition to being used for smartphones, the heart scan technology could be used in airport security screenings, the release said. And, for those who may be worried about potential health effects of the scans, Xu mentioned in the release that the strength of the signal "is much less than Wi-Fi," and doesn't pose a health concern. "We are living in a Wi-Fi surrounding environment every day, and the new system is as safe as those Wi-Fi devices," Xu said in the release.


Rethinking security when everyone's Social Security number is public

"The assumption that we previously held, which was that Social Security numbers and driver's license numbers are relatively private ... that's now gone," he said. "Beyond how Equifax changes credit scoring, there's a big question about how Equifax changes identity validation." This is a distinctly separate issue from fraud detection, Perret said. Bank accounts and card numbers can be shut down and reissued, but banks can't do the same for Social Security numbers and other identity factors. "On the fraud side, there's a ton of work we can do, including multifactor authentication," he said, but "the KYC requirements are pretty explicit ... so that needs to be updated." Indeed, a lot of the security practices being used today are done more out of tradition than out of effectiveness.


Another experiment with currency? RBI is looking at its own Bitcoin

The success of Bitcoin, a popular cryptocurrency, may have encouraged the central bank to consider its own cryptocurrency since it is not comfortable with this non-fiat cryptocurrency, as stated by RBI executive director Sudarshan Sen a few days ago. Despite RBI's call for caution to people against the use of virtual currencies, a domestic Bitcoin exchange said it was adding over 2,500 users a day and had reached five lakh downloads. The company, launched in 2015, said the increasing downloads highlighted the "growing acceptance of Bitcoins as one of the most popular emerging asset class."  A group of experts at RBI is examining the possibility of a fiat cryptocurrency which would become an alternative to the Indian rupee for digital transactions.  According to a media report, RBI's own Bitcoin can be named Lakshmi, after the Hindu goddess of wealth.


IT and Future Employment - Data Scientist

A tongue-in-cheek definition is, a computer scientist who knows more statistics than his or her colleagues, or a statistician who knows more computer science than his or her colleagues. Time will tell if data science will become a new discipline, or if it will remain a cross-disciplinary field between these two (and perhaps other) fields. The statistician David Donoho published a paper in 2015 with the provocative title “Fifty Years of Data Science”. He was referencing the statistician John Tukey’s call, more than 50 years ago, for statistics to expand into what we now call data science. Donoho’s paper is well worth reading. I’ll list the six subfields of data science that he identifies and make several comments about each one ... A growing demand for people trained in data science has caused the shortage of these people to balloon. Moreover, only limited opportunities to obtain such training exist today.


AI is changing the skills needed of tomorrow's data scientists

AI is changing the nuts and bolts of data management, alleviating data teams from a lot of tedious, manual dirty work so that they can focus their time on creating business outcomes and allowing data scientist to work at a speed and scale that is impossible today. The data scientist of tomorrow must be prepared to work with the AI revolution, optimizing processes without losing the human ability to think creatively and apply data-driven insights to real-world problems. The next generation of data scientists will be even more necessary for helping to apply models and algorithms to problems and processes across the enterprise. For data science students, it’s not only crucial to understand the data and the technology but it’s equally as valuable to learn how to function in teams, collaborate and teach.


4 Job Skills That Can Boost Networking Salaries

The report dissects the salaries of more than 75 tech positions, including eight networking and telecommunications roles and two network-specific security roles. Among the 10 network and telecommunications roles, network architects will be paid the most in the coming year, Robert Half Technology says. ... By comparison, network architects in the 75th percentile can expect to see starting salaries of $160,750, the 50th percentile can expect $134,000, and the 25th percentile will earn $112,750, according to the guide. This is the first year that Robert Half Technology is breaking down compensation ranges by percentile in its annual salary guide. The categories are designed to help hiring managers weigh a candidate's skills, experience level, and the complexity of the role when making an offer.


Critical Network of Things: Why you must rethink your IoT security strategy

Having made, lost and then re-made my fortune in and around the industry over the past 20-plus years, I cannot help but smile over the level of hype — and, as a certain US President would call it, “fake news” — surrounding the current world of IoT. In spite of what the media and investors would like to think, IoT is not new. I can recall building all sorts of systems including AMR (Automatic Meter Reading/ Smart Metering) networks covering whole cities, pharmaceutical storage monitoring, on-line pest/rodent trap systems, trucks and trailer tracking, foodstuff refrigeration monitoring and land subsidence monitoring, just to name a few examples. They all followed the same basic architecture as we see with today’s IoT offerings, but under the label M2M (Machine to Machine).


5 fundamental differences between Windows 10 and Linux

Although in the early years, hardware support was a serious issue and the command line was a requirement, the last five years have seen very rare occasions that I've come into a problem that couldn't be overcome. I cannot say the same thing about Windows. No matter the iteration, I've always managed to find troubling issues with the Microsoft platform. Generally speaking, those issues can be managed. The latest iteration of Windows is no exception. Coming from Windows 7, I skipped 8 and headed directly to 10. I've found going from Windows 7 to 10 akin to making the leap from GNOME 2.x to 3.x. The metaphor was quite different and took a bit of acclimation. Even though they go about it very differently, in the end, both platforms have the same goal—helping users get their work done.


The Five Steps to Building a Successful Private Cloud

Like every engineering project, setting wrong expectations and unrealistic goals will lead to a poor outcome. It doesn’t have to be that way. Once you have a clear understanding of what problems you need to solve for stakeholders, you must define clear goals and requirements. For example, look at the existing pain points for your developers and how your private cloud solution will solve or mitigate those problems. Improving the developer experience ensures faster adoption and long-term success. Making the move to a private cloud requires focus, perseverance, motivation, accountability, and strong communication. You must have a good understanding of your existing service costs by doing a thorough Total Cost of Ownership analysis. What do day-to-day operations look like to support private infrastructure?
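A Total Cost of Ownership analysis ultimately reduces to simple arithmetic: capital outlay plus recurring costs over the analysis window. The sketch below uses invented figures purely to show the shape of the comparison, not real pricing.

```python
# Hypothetical TCO sketch: compare the yearly cost of existing services
# against a private cloud before committing. All figures are placeholders.
def total_cost_of_ownership(hardware, licenses, staff, facilities, years=3):
    """Capital outlay plus recurring costs over the analysis window."""
    return hardware + (licenses + staff + facilities) * years

current = total_cost_of_ownership(hardware=0, licenses=120_000,
                                  staff=300_000, facilities=50_000)
private_cloud = total_cost_of_ownership(hardware=400_000, licenses=60_000,
                                        staff=250_000, facilities=40_000)
print(current, private_cloud)  # → 1410000 1450000
```

Even a rough model like this forces the conversation about which costs are one-time versus recurring, which is where private-cloud business cases usually succeed or fail.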



Quote for the day:


"If you only read the books that everyone else is reading, you can only think what everyone else is thinking." - Haruki Murakami


Daily Tech Digest - September 26, 2017


Time to embrace a security management plane in the cloud


There’s an old saying: Change is the enemy of security. To avoid disruptive changes, many cybersecurity professionals strive for tight control of their environment, and this control extends to the management of security technologies. Experienced cybersecurity professionals often opt to install management servers and software on their networks so that management and staff “own” their technologies and retain as much control as possible. This type of control has long been thought of as a security best practice, so many CISOs continue to eschew an alternative model: a cloud-based security management control plane.  Given the history of cybersecurity, this behavior is certainly understandable — I control what happens on my own network, but I have almost no oversight on what takes place on Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform.


Canada's Tough New Breach Reporting Regulations

Previously in Canada, entities experiencing a breach were required to identify what kind of breach occurred and to notify regulators. "Contacting affected individuals [about the breach] would be something you would delegate to the regulators to get advice and guidance on," he says. But that all changes under the Digital Privacy Act of 2015, which amended certain Canadian privacy regulations in three key ways and will likely go into effect by the end of 2017, Ahmad says. Those changes include mandatory breach notification to affected individuals; keeping a record log for two years of any types of data breaches that occur; and imposing sanctions of up to $100,000 for each violation of the new law, he says. Those amendments provide "a bit more teeth" to Canadian data breach legal requirements, he notes.




From buzzword to boardroom – what’s next for machine learning?

You only need to think of the allocation of payments to invoices, the selection of applicants in the HR area, the evaluation of marketing ROI, or forecasts of customer behaviour in e-commerce transactions. Machine learning offers great potential for companies from the big-data environment, provided they have the developer capacity to integrate machine learning into their applications. As AI moves from the future into the present, organisations not only want to gain insight into their own processes via classical process mining, they are also looking for practical support for the decision-making process, such as guidance on how to further optimise individual process steps or efficiently eliminate any hurdles that still exist. By doing so, they can understand which influencing factors would be worthwhile tackling first.


Firms look to security analytics to keep pace with cyber threats

Implementing security analytics can take time and money, especially if a business is using outdated hardware and software. Gene Stevens, co-founder and CTO of enterprise security supplier ProtectWise, says many CISOs are finding it difficult to retain forensic data for an extended period in a way that is cost-effective and easy to manage. However, his company has come up with an intelligent, analytics-oriented platform to tackle this problem. “With a memory of network activity, security teams can go back and identify whether they were compromised by an attack once it is discovered – and assess the extent of its impact,” says Stevens. Traditional approaches, by contrast, are costly to scale and laborious to deploy.
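The idea of a “memory of network activity” can be sketched as retrospective detection: retained connection records are replayed against indicators of compromise (IOCs) that were only published after the traffic occurred. This is a minimal illustration, not ProtectWise’s implementation; the record fields and addresses are hypothetical examples.

```python
# Sketch of retrospective IOC hunting over retained network metadata.
# Records and IOC addresses below are illustrative, not real telemetry.

connection_log = [
    {"ts": "2017-09-01T10:02:11", "src": "10.0.0.5", "dst": "198.51.100.23"},
    {"ts": "2017-09-14T22:47:03", "src": "10.0.0.9", "dst": "203.0.113.77"},
    {"ts": "2017-09-20T03:15:42", "src": "10.0.0.5", "dst": "192.0.2.10"},
]

def retro_hunt(log, ioc_addresses):
    """Return past connections whose destination matches a newly published IOC."""
    return [rec for rec in log if rec["dst"] in ioc_addresses]

# An IOC feed published *after* the traffic was recorded:
new_iocs = {"203.0.113.77"}
print(retro_hunt(connection_log, new_iocs))
# -> [{'ts': '2017-09-14T22:47:03', 'src': '10.0.0.9', 'dst': '203.0.113.77'}]
```

The retained log is what makes this possible: without it, an organisation can only detect an IOC in traffic that occurs after the indicator is known.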


New managed private cloud services hoist VMware workloads

CenturyLink's new VMware managed private cloud, CenturyLink Dedicated Cloud Compute Foundation, rearchitects its flagship private cloud onto Hewlett Packard Enterprise (HPE) hardware. It is cheaper and 50% faster to provision than its predecessor, which required multiple integration points across network, compute and virtualization from five vendors. That's typical with many private clouds that require users to coordinate technologies, either within OpenStack or earlier versions of VMware, said David Shacochis, vice president of product management at CenturyLink. VMware Cloud Foundation serves up an integrated stack with vSphere, NSX and vSAN, which means fewer moving pieces, improved self-service features and security control, Shacochis said.


Data storage in Azure: Everything you need to know

On Azure, things are different. Instead of having to manage data at an operating-system level, Azure’s object file system leaves everything up to your code. After all, you’re storing and managing the data that’s needed by only one app, so the management task is much simpler. That’s where Azure’s blob storage comes in. Blobs are binary large objects, any unstructured data you want to store. With a RESTful interface, Azure’s blob store hides much of the underlying complexity of handling files, and the Azure platform ensures that the same object is available across multiple storage replicas, using strong consistency to ensure that all versions of a write are correct before objects can be read. 
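To make the RESTful interface concrete, here is a sketch of the request shape behind uploading a block blob. The account, container and blob names are hypothetical, and a real request would also carry an Authorization header (shared key or SAS token), which is omitted here.

```python
# Illustrative shape of an Azure blob "Put Blob" REST call.
# Names are hypothetical; real requests also need authentication.

def blob_url(account: str, container: str, blob: str) -> str:
    """Canonical blob endpoint: https://<account>.blob.core.windows.net/<container>/<blob>."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob}"

def put_block_blob_request(account: str, container: str, blob: str, data: bytes) -> dict:
    """Describe the PUT request that creates or overwrites an unstructured object."""
    return {
        "method": "PUT",
        "url": blob_url(account, container, blob),
        "headers": {
            "x-ms-blob-type": "BlockBlob",   # store as a single unstructured object
            "Content-Length": str(len(data)),
        },
        "body": data,
    }

req = put_block_blob_request("myaccount", "app-data", "logs/day1.json", b'{"ok": true}')
print(req["method"], req["url"])
# -> PUT https://myaccount.blob.core.windows.net/app-data/logs/day1.json
```

In practice you would use an Azure SDK rather than hand-built requests, but the URL-per-object addressing is what lets the platform replicate and serve each blob with strong consistency.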




What’s your problem? Survey uncovers top sources of IT pain

Cost-cutting has always been a focus for enterprise IT, but as the diversity and volume of data increase, complexity and storage sprawl are straining budgets. Indeed, budget constraints were a top challenge for 35% of IT pros surveyed at VMworld 2017. A metadata engine can cut costs in several ways. First, it can automatically and transparently move warm and cold data off high-performance resources to optimize the use of an organization’s most expensive infrastructure. Second, it enables organizations to dramatically increase the utilization of their existing resources. With global visibility and control, organizations can view data location and alignment against SLAs, as well as available performance and capacity resources. This allows IT to instantly observe resources that might be nearing thresholds, and to subscribe to alerts and notifications for those thresholds.
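The threshold-alerting idea above can be sketched in a few lines. The tier names, capacities and the 80% alert threshold are illustrative assumptions, not details from any particular product.

```python
# Minimal sketch of capacity-threshold alerting across storage tiers.
# Tier names, sizes and the 80% threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class StorageTier:
    name: str
    used_gb: float
    capacity_gb: float

    @property
    def utilization(self) -> float:
        return self.used_gb / self.capacity_gb

def tiers_near_threshold(tiers, threshold=0.80):
    """Return the names of tiers whose utilization meets or exceeds the threshold."""
    return [t.name for t in tiers if t.utilization >= threshold]

tiers = [
    StorageTier("nvme-hot", 850, 1000),       # 85% -> should alert
    StorageTier("ssd-warm", 400, 1000),       # 40%
    StorageTier("object-cold", 7900, 10000),  # 79%
]
print(tiers_near_threshold(tiers))  # -> ['nvme-hot']
```

A real metadata engine would feed such a check from live telemetry and trigger data movement automatically; the point here is only that the alerting logic itself is a simple comparison against policy.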


FBI's Freese Shares Risk Management Tips

Confusion over the definitions of "threat" and "risk" exists when IT security teams talk to members of the executive suite. One strategy security professionals may consider is approaching the discussion from a business perspective, instead of leading with fear, says Don Freese, deputy assistant director with the FBI's information technology branch. Freese, who served as a keynote speaker Monday at the (ISC)² Security Congress convention in Austin, Texas, noted that risks are measurable, provided that companies practice good security hygiene, such as logging network activity and taking inventory of the data that the enterprise possesses. In addition to those best practices, Freese advises IT security leaders to consider the industry they operate in and the type of data that would be desired by cybercriminals or nation-states.


8 Features of Fintech Apps that Appeal to Millennials

A new wave of fintech apps is set to hit smart devices over the coming years. A number of new startups, such as the digital-only bank Atom and the international money transfer service TransferWise, are going to revolutionize the financial industry. Fintech investments have grown exponentially over the last three years, and there are many opportunities for developers, investors and executives of “legacy companies” to ride on this wave. At the same time, of all the consumer demographics, Millennials are expected to be pivotal in driving changes. As a consequence, fintech developers are aggressively targeting them, tailoring apps to solve key pain points. Let’s look at some of the common features that software developers are currently including in their apps, along with those that are expected to become widespread, in order to appeal to Millennials.




What’s Different about Google’s New Cloud Connectivity Service

Dedicated Interconnect is only one of a set of cloud connectivity options from Google; it’s designed for handling workloads at scale, with high-bandwidth traffic of more than 2 Gbps. You can also use it to link your corporate network directly to GCP’s Virtual Private Cloud private IP addresses. Taking your cloud traffic off the public internet and onto your own network range gives you more options for taking advantage of cloud services. “Networking technologies are enabling applications and data to be located in their best execution venue for that workload,” Traver noted. Like its competitors, Google Cloud Platform requires you to connect at one of several global peering locations, so in addition to Google’s charges, you will also need to pay your network service provider to reach Google’s peering points.



Quote for the day:

"The two most powerful warriors are patience and time." -- Tolstoy