May 04, 2016

An Introduction to Bitcoin and Blockchain Technology

Unlike traditional computer networks and payment systems, Bitcoin is not administered by any centralized authority or controlled by any rights holder. Instead, it was introduced to the world as an open source project. It may be utilized by any person, without fee, by downloading Bitcoin software and accessing the peer-to-peer network. These users collectively provide the infrastructure and computing power that processes and verifies transactions and information posted through that network and recorded on its decentralized ledger. A group of computer scientists and programmers volunteer their time toward upgrading and improving the Bitcoin software code, primarily through an open repository on the GitHub website.


Proactive DDoS Mitigation in Today's Threat Landscape

Cybersecurity threats have become more persistent and damaging than ever. Despite increased awareness of the network threat landscape, many organizations struggle to protect themselves against DDoS attacks. Download the eBook to learn about the latest cybersecurity threats and DDoS mitigation best practices: the current threat environment and what it means for your organization; how moving toward a more holistic, proactive approach can help ensure the availability and security of your business; and why over-provisioning bandwidth to absorb large attacks is an ineffective strategy.


U.K. Considering Government Applications of Blockchain Technology

“Blockchains ‒ distributed ledgers, shared ledgers ‒ are digital tools for building trust in data,” explained Hancock. “Rather than a single central authority demanding trust and declaring: ‘I say this data is correct,’ you have the distributed consensus of everyone in the chain, saying in unison: ‘we agree that this data is correct.’” Hancock added that data held in the blockchain comes with its own history, and that history is a fundamental part of proving its integrity. “This fact is enormously powerful,” he said. While cautioning against considering the emerging blockchain technology as a generally applicable solution for all problems, Hancock emphasized that the technology offers efficient solutions for important use cases.


Why Fast Flux Networks

The basic idea behind a Fast Flux Network is to associate multiple IP addresses with a malicious domain name. These IP addresses are swapped in and out with extremely high frequency, sometimes as often as every three minutes, by changing DNS records. As a result, a browser connecting to the same malicious website every three minutes will see a different IP address each time and connect to the actual malicious website through different infected computers each time. ... Fast Flux motherships are the main controlling elements behind the front-end servers. They are similar to Command & Control (C&C) servers, though they have many more features than C&C servers. This mothership node is hidden by the front-end servers, which makes it extremely difficult to track down.
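The rotation pattern described above can be spotted from the resolver side: repeated lookups of one domain that keep returning fresh addresses. A minimal sketch of such a detector (the observation format and thresholds are illustrative assumptions, not taken from the article):

```python
def flux_score(observations):
    """Given (timestamp, ip) DNS answers observed for one domain,
    count how many distinct IPs appeared and how often consecutive
    lookups returned a different address."""
    ips = [ip for _, ip in sorted(observations)]
    distinct = len(set(ips))
    changes = sum(1 for a, b in zip(ips, ips[1:]) if a != b)
    return distinct, changes

def looks_like_fast_flux(observations, min_distinct=5, min_change_ratio=0.5):
    """Flag a domain whose answers rotate through many IPs at a high rate.
    The thresholds are illustrative, not operational guidance."""
    distinct, changes = flux_score(observations)
    lookups = len(observations)
    if lookups < 2:
        return False
    return distinct >= min_distinct and changes / (lookups - 1) >= min_change_ratio
```

A domain with a small, stable answer set scores low; one cycling through an infected-host pool trips both thresholds quickly.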


Educating Regulators a Priority, Say Blockchain Policy Experts

To help them better understand blockchain, Edge said, governmental organizations and companies should create sandbox environments in which they can safely experiment, then invite younger members to take leadership positions exploring the technology. "You’ll end up with an army of young people," he said, adding: "Before we can do those conversations, we have to do what Jamie’s talking about." Jonathan Levin, founder and CEO of Chainalysis, had his own strategy for doing exactly that – teaching policymakers at any organization, including large corporations, about blockchain. Instead of pursuing an “army of young people,” Levin advocated that the organization identify a single person with at least some bitcoin or blockchain knowledge, and offer them a position as a specialist.


How secure are cloud storage services?

Truth be told, cloud services are not as insecure as the occasionally screaming headlines make out. In fact, there's much to be said for the argument that Dropbox, iCloud, Google Drive and OneDrive have both the money and the motivation to make their data stores far more secure than you could hope to achieve yourself on a meagre budget. So let's take a look at these four services, but first we need to get a few things out of the way. This isn't a review of these services; if you want to know the ins and outs of the cloud services, take a look at our article ... Because Google Drive uses the same Google account for login as Gmail, the danger was that everything was compromised as a result. It turned out, however, that the dump was of old phished passwords and at most 2% may have worked - but all were reset by Google anyway.


Businesses must address digital transformation security risks, says analyst

“When organisations look at digital transformation, they need to restructure so that information security is responsible for the security of everything, including the internet of things (IoT), the organisation’s operational technology (OT) and the business,” said Kuppinger. “The execution of information security needs to move to where organisations use IT, meaning IT departments will have to become more decentralised and services-based, while information security is independently responsible for security governance across everything,” he said. ... “Identity and access management is increasingly about … managing identities of everyone and everything in a connected world, and supporting organisations in their governance, risk and compliance [GRC] initiatives,” he said.


5 secure habits of the paranoid PC user

It makes much more sense to access the Internet through a virtual private network in which all outgoing and incoming traffic is funneled through an encrypted channel to a trusted Internet gateway. Another advantage of this strategy is how it masks your current IP address, which should further reduce opportunities for phishing. Fortunately, commercial VPN offerings such as VyprVPN and PureVPN abound for individuals and small businesses, and are typically priced at between $5 and $10 per month. Almost all of these services provide their own VPN client to log in to the correct servers with minimum configuration required. Affordability aside, some considerations when choosing a suitable VPN service include its performance in the region where you live or travel to, the number of simultaneous client devices it supports, the platforms it supports and its reliability.


Analytics-driven SDN and NFVi, CORD and M-CORD

The advent of new virtualized frameworks (cloud, NFV, and SDN) has changed telco hardware strategy to embrace open control interfaces and programmable data path elements. The cloud has given rise to commoditization of infrastructure using merchant silicon and commercial off-the-shelf hardware that are managed and orchestrated with software. NFV and SDN push the scope of software even further toward the abstraction, programming, manipulation and disaggregation of network functions. “Whitebox switching” in networking has emerged as a bare-metal switch that runs a hardware-agnostic network operating system and is managed by a decoupled centralized controller. The cloud, NFV and SDN have shifted the industry’s attention from high-performance “closed” purpose-built systems to “open” programmable hardware and flexible software architecture as the differentiator.


Unfreezing an Organization

Scaling a piece of software without forethought or using unsuitable architecture resulted in many of the enterprise systems that most of our companies run on: expensive to maintain, fragile, hard to replace and loaded with decades of technical debt. Scaling an organization without forethought or using an unsuitable assembly-line process resulted in the organizations that run our companies: expensive, fragile, slow, hard to replace, and loaded with decades of organizational debt. Rewriting a 20-year-old antiquated payments system that processes a bank's payments and transforming an organization share the same characteristics: it takes much longer than originally promised, it’s incredibly difficult, and most senior leaders will shy away from it. At the same time, in a world of Uber, Wealthfront, and Fintech incubators, most organizations have no choice. The brave few are doing something about it.



Quote for the day:


"Designing your product for monetization first & people second will probably leave you with neither." -- Tara Hunt


May 03, 2016

Now or Never: The Ultimate Strategy for Handling Defects

Let’s stop using the backlog as a trash can. Having a longer queue of issues will increase the average lead time of our system. We could say that any backlog isn’t just a “first in/first out” queue and manage it that way, but managing our bug log demands time and energy. In my experience, the benefits of these activities with long bug logs are overrated. Just stop doing it. If a bug is critical enough but we haven’t fixed it, it will remind us about itself — don’t worry about that. Just recently, one of my teams had such a case. They knew about the problem, which appeared rarely in unpredictable situations. After a quick analysis, the team decided it was not important enough (below the line) because of its infrequency and closed the issue. However, the bug reappeared within several weeks under different conditions.
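The point about queue length and lead time is just Little's Law: average lead time equals the number of items in the system divided by throughput. A one-line illustration (the numbers are invented):

```python
def average_lead_time(open_items, fixes_per_week):
    """Little's Law: average time an item spends in the system equals
    items in the system divided by the rate at which they complete."""
    return open_items / fixes_per_week

# A 120-item bug log worked at 10 fixes/week means the average
# logged bug waits about 12 weeks; trimming the log to 30 cuts
# that to about 3 weeks without anyone fixing bugs any faster.
```

This is why "now or never" triage shortens lead time: it shrinks the queue itself rather than trying to manage a long one.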


Will Fintech Destroy The Banks?

Whether the conversation starts with a vague reference to bitcoin, blockchain or crowd-funding, we're increasingly hearing from clients who are curious or worried about the implications of "Fintech" on their business or investment portfolio, particularly as it pertains to the banks.  Although wrapped in jargon and buzz words, "fintech" or financial technology, is simply the application of technology to improve the efficiency or delivery of financial services, at scale. Put this way, the concept goes from complex and obscure to obvious and unavoidable. There are, however, different themes to the innovation that could be very disruptive for both financial institutions and their customers:


I'm Calling it: Social Networking is Over

Confusion about the difference between social networking and social media is why most people haven't noticed the decline of social networking. People don't stop to think about the difference. Social networking is personal content. Social media is professional content. The sharing of social media -- professionally produced videos, articles, podcasts and photos -- is gradually replacing the sharing of personal content about one's life. For example, as you read my column, this article is being shared on Facebook, Twitter, Google+ and other so-called "social networking" sites. But that isn't social networking; it's social media. Micro-blogging, micro-schmogging. No matter what you call it, Twitter is included in every roundup, comparison or article about social networking. It's universally included in the "social network" category.


Strengthening authentication through big data

The idea is to unobtrusively gather information from several sources, including user behavior and device usage, to create a profile that is unique to the account owner and cannot be stolen or replicated by fraudulent users. The next step would be to use the profile to detect activities that hint at malicious intent and only then initiate extra authentication steps to make sure the account hasn’t been hijacked or compromised. This model has many strengths. Unlike physical tokens, it’s not something you can lose; it doesn’t require extra memorization effort; unlike passcodes, or even fingerprint and retina scans, it can’t be stolen or replicated; and, above all, it’s not cumbersome and it doesn’t introduce extra complexity to the user experience.
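A toy sketch of this risk-based model: score each login event against a stored behavioural profile and demand step-up authentication only above a threshold. All field names, weights and thresholds here are invented for illustration; a real system would learn them from data:

```python
def risk_score(profile, event):
    """Weigh a login event against the account's behavioural profile.
    Every mismatch adds to the score; nothing is asked of the user yet."""
    score = 0
    if event["device_id"] not in profile["known_devices"]:
        score += 2  # unfamiliar device
    if event["country"] != profile["usual_country"]:
        score += 2  # unusual location
    start, end = profile["active_hours"]
    if not (start <= event["hour"] <= end):
        score += 1  # outside normal activity window
    return score

def needs_step_up(profile, event, threshold=3):
    """Trigger extra authentication only when the event looks anomalous."""
    return risk_score(profile, event) >= threshold
```

A routine login from a known phone scores zero and passes silently; a 3 a.m. login from a new device abroad crosses the threshold and triggers the extra check, which is the "unobtrusive by default" property the article describes.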


Forensics expert fights crime with digital weapons

As the volume of electronic data grows exponentially and the number and type of devices "owned" by people — such as smartphones and tablets — increases, the need to be able to identify, collect, consolidate, filter and analyse relevant data, compounded by "peripheral data" such as CCTV, physical access control logs, satnav or computer log files, becomes even more important. Historically, during an investigation, numerous techniques and tools would be used to attempt to piece together the various pieces of the puzzle, especially around chronology: for example, trying to link a call on a mobile phone with a person having just entered a secure office, an unauthorised logon to a computer, and the copying of files to a remote device.


The Rise of Threat Intelligence Gateways

Why is threat intelligence gaining momentum? Security professionals know that since they can’t block every conceivable cyber-attack, they need to collect, process, and analyze all types of internal and external security data to improve their incident detection and response capabilities. Many also want to use threat intelligence more proactively for threat prevention. In fact, 36% of enterprise cybersecurity professionals say that their organizations intend to use threat intelligence feeds to automate remediation actions over the next 24 months. ... When threat intelligence points to bad IP addresses, URLs, or DNS lookups, why not simply block them from the get-go? Unfortunately, this hasn’t always been easy in the past, as it involved normalizing disparate threat intelligence feeds, building custom dashboards and rule sets, integrating various network security devices, etc.
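The normalization problem mentioned at the end can be sketched simply: fold mixed feed entries (single IPs, CIDR ranges, domains) into uniform lookup structures, then check indicators against them. A minimal illustration using only the Python standard library; the feed format is an assumption, not taken from any particular product:

```python
import ipaddress

def normalize_feed(entries):
    """Normalize a mixed threat feed into two lookup structures:
    a list of IP networks and a set of blocked domains."""
    networks, domains = [], set()
    for raw in entries:
        item = raw.strip().lower()
        if not item or item.startswith("#"):
            continue  # skip blanks and feed comments
        try:
            # single IPs parse as /32 (or /128) networks
            networks.append(ipaddress.ip_network(item, strict=False))
        except ValueError:
            domains.add(item.lstrip("*."))  # treat non-IP entries as domains
    return networks, domains

def is_blocked(indicator, networks, domains):
    """True if an IP falls in a blocked range, or a hostname matches a
    blocked domain (including its subdomains)."""
    try:
        addr = ipaddress.ip_address(indicator)
        return any(addr in net for net in networks)
    except ValueError:
        host = indicator.lower().rstrip(".")
        return any(host == d or host.endswith("." + d) for d in domains)
```

A gateway doing this continuously across many feeds, with deduplication and aging-out of stale indicators, is essentially what the productized "threat intelligence gateway" category automates.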


Is There Really Such a Thing as a “Hybrid Agile” method?

Are there projects that don’t require business cases and annual budget planning? Probably. But not many in larger organisations. So finding a way of making the existing waterfall processes leaner will enable us to shift from “Hybrid Agile” to “real Agile”. The go-live preparation is different. I think there are many technologies for which we already have good answers that allow us to go live as required using Continuous Delivery practices. For other technologies (commercial off-the-shelf products come to mind), we will likely continue to see some waterfall validation and testing practices being used before we can go live, but as the technologies and tools evolve this final phase will become shorter and shorter until it disappears.


IT leaders pick productivity over security

Security is top of mind for IT leaders, especially as hacks become more frequent, sophisticated and malicious, but the report also uncovered some shocking truths about cybersecurity in the enterprise. The report showed major flaws in how businesses and IT leaders approach security, and it boils down to a lack of communication between the C-suite and IT leaders, as well as a general frustration with how security slows down overall productivity in the company. But just because security might bog down productivity, or IT leaders and executives suffer from a lack of communication, businesses still need to remain vigilant regarding security. Jack Danahy, CTO and co-founder of Barkly, says efficiency should be redefined: "Good security does not bog down efficiency."


The race to create smart homes is on

Two things are needed to make homes truly “smart.” First are sensors, actuators and appliances that obey commands and provide status information. There are already hundreds if not thousands of smart home products on the market. These have evolved in recent years beyond simple door sensors and light switches to smart thermostats such as Nest and voice command devices such as the Amazon Echo. Second are protocols and tools that enable all of these devices, regardless of vendor, to communicate with each other. However, this is a major undertaking and it won’t happen overnight. In the meantime, smartphone apps, communication hubs and cloud-based services are enabling practical solutions that can be implemented right now.


Enterprise UX: Past, present and future

Looking ahead, there are multiple technology developments underway that will affect how user experiences -- for both consumers and business users -- are created. Wearables, IoT devices, virtual and augmented reality, and increasingly sophisticated artificial intelligence (AI) will all profoundly change the way humans and computers interact with one another, and with the world around them. Gesture and voice control, for example, are set to play an increasingly important role. The emerging umbrella term for where all this is heading is the 'post-app' world of pervasive computing, where desktop WIMP and mobile touch-driven interfaces are augmented or superseded by more 'natural' methods of user interaction.



Quote for the day:


"I never look at the glass as half empty or half full. I look to see who is pouring the water and deal with them." -- Mark Cuban


May 02, 2016

The expanding landscape of exploit kits

“If you have systems and files being encrypted, or a file share becomes encrypted, that’s a huge impact. Dozens of hospitals have been attacked recently, and for some it has taken days to recover. That means massive downtime and rescheduling major surgeries. It’s literally putting lives at risk,” Williams said. Through their networks on the dark web, nefarious actors are informed when new exploits are seen in the wild, making them aware of even zero-day vulnerabilities before the general public. Leonard said, “Under responsible disclosure, a researcher will identify the use of a brand new exploit script to a vendor. The vendor then releases a patch that can be applied to the business.” Businesses, though, struggle to apply those patches expeditiously. The level of sophistication and the relative ease with which criminals can access exploit kits compromises business operations and has security teams on overdrive trying to expedite the patching process.


Ways to craft a better enterprise IT security roadmap

You need to be able to detect those threats and attacks. And detecting a threat, a vulnerability and an attack are three separate things, and that's important to understand. Lots of companies sell you vulnerability detection. Vulnerability detection is basically like telling you which doors you have unlocked. Attack detection is telling you when the burglar is coming through your door. And threat detection is, "Hey, the burglar has been seen on your street with a big bag of loot and he's heading for your house." So those are three separate things and, ideally, you want to know all three things. And that distinction is important because sometimes people say, "Well, I do vulnerability scanning so I'm covered." No, that just tells you which doors are unlocked. Maybe the burglars are getting smart enough to come in through the chimney.


Unified Storage That Can Sync and Share

Many siloed storage, data management, file sync and share, and security solutions exist to provide for these individual requirements, but they are typically cobbled together in costly, inefficient and unreliable ways. Nexsan UNITY addresses all of these requirements in a single unified solution which delivers high performance and multi-site collaboration at LAN speed to support business continuity and disaster recovery processes as well as mobile access to primary storage data. UNITY's patented technology is designed to support all devices – from smartphones and tablets to laptops and desktops running Android, iOS, Mac and Windows – and provides a secure connection to data stored and managed within the enterprise, totally eliminating the drudgery of using unpopular and aging VPN technologies.


Mobile Banking Trends Will Lead Change in Banking in 2016

Mobile banking offers many advantages: Users can authenticate their identity and open new accounts, sign up for direct deposit, pay bills, take out loans, and deposit checks by photographing them, all from their mobile devices. Mobile apps such as Venmo let users make and share payments instantly, and Quicken Loans’ Rocket Mortgage even offers a mortgage approval in eight minutes through a process that automatically collects pay and credit information and requires minimal typing by the user, letting them sign their name right from their mobile device. In the U.S., Simple and Moven are the leaders in developing banking apps that allow people to pay by mobile, track their expenditures, and save for future goals in electronic envelopes — whether it’s for large expenses such as vacations or a down payment on a house, or for smaller things like a tattoo or a bike tune up.


What's Wrong with Open Data Sites--and How We Can Fix Them

The second non-obvious design problem, which is probably the most important, is that most open data sites bury data in what is known as the deep web. The deep web is the fraction of the Internet that is not accessible to search engines, or that cannot be indexed properly. The surface of the web is made of text, pictures, and video, which search engines know how to index. But search engines are not good at knowing that the number that you are searching for is hidden in row 17,354 of a comma-separated file that is inside a zip file linked in a poorly described page of an open data site. In some cases, pressing a radio button and selecting options from a number of dropdown menus can get you the desired number, but this does not help search engines either, because crawlers cannot explore dropdown menus.
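Code, unlike a crawler, has no trouble with a CSV buried inside a zip file. A small sketch of pulling rows out of exactly that kind of packaging (the file and column names are hypothetical):

```python
import csv
import io
import zipfile

def read_csv_from_zip(zip_bytes, member_name):
    """Pull rows out of a CSV that is buried inside a zip archive --
    the kind of 'deep web' packaging a search crawler never indexes."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        with zf.open(member_name) as fh:
            text = io.TextIOWrapper(fh, encoding="utf-8")
            return list(csv.DictReader(text))
```

Publishing the same numbers on plain indexed HTML pages, rather than behind archives and dropdown menus, is what would make them findable without code like this.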


Here's why analytics is eating the supply chain

This is not to say that supply-chain professionals are newcomers to the world of analytics. On the contrary: Demand forecasting, for example, has "been around forever" and relied heavily on data, said Paul Myerson, an author and professor of practice in supply chain management at Lehigh University. What's new today are the tools. "Today we have very visual tools that are much quicker to run," Myerson explained. "What used to take overnight can now be done in minutes." New software is also enabling more collaboration among partners, including key customers and suppliers. Point-of-sale data provides better insight for everyone involved, leading to better forecasting decisions. "It's about agreeing on forecasts and collaborating on inventory throughout the supply chain," Myerson said. "It really improves efficiency, cost and quality, and not just for manufacturers."


Embracing Agile

When we ask executives what they know about agile, the response is usually an uneasy smile and a quip such as “Just enough to be dangerous.” They may throw around agile-related terms (“sprints,” “time boxes”) and claim that their companies are becoming more and more nimble. But because they haven’t gone through training, they don’t really understand the approach. Consequently, they unwittingly continue to manage in ways that run counter to agile principles and practices, undermining the effectiveness of agile teams in units that report to them. These executives launch countless initiatives with urgent deadlines rather than assign the highest priority to two or three. They spread themselves and their best people across too many projects.


Cloud Economics – Are You Getting the Bigger Picture?

Most enterprises have hardware utilization rates significantly below 20% because of the excess capacity required to handle peak demand. As such, many companies carry up to 5 times the required hardware, networking, and data center space during steady state business cycles. If their computing demand is spiky, utilization rates outside of peak cycles are commonly below 10%. As a result, enterprises are spending much more on compute and storage than is required. Figure 1 depicts the traditional model where cloud shifts fixed CapEx expenses to variable OpEx expenses. To understand the full value of cloud for your enterprise, you must look beyond the CapEx vs. OpEx benefits and assess the other value drivers at play.
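The utilization and CapEx/OpEx arithmetic behind these claims is simple enough to sketch; the numbers in the comments are illustrative, not from the article:

```python
def steady_state_utilization(avg_demand_units, provisioned_units):
    """Fraction of owned capacity doing useful work outside peak cycles."""
    return avg_demand_units / provisioned_units

def monthly_cost_owned(capex, amortization_months, opex_per_month):
    """Fixed model: hardware CapEx amortized over its life, plus run cost."""
    return capex / amortization_months + opex_per_month

def monthly_cost_on_demand(unit_hour_price, unit_hours_used):
    """Variable model: pay only for capacity actually consumed."""
    return unit_hour_price * unit_hours_used

# Sizing at 5x steady-state demand (e.g. 500 units owned for an average
# load of 100) yields the sub-20% utilization figure the article cites.
```

The article's larger point is that this CapEx-to-OpEx shift is only the first-order saving; the other value drivers (speed, elasticity, freed-up staff) sit on top of it.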


Zen and the art of big data digital IoT transformation

Despite the obvious levels of machine automation in the Internet of Things — and the machine-learning capabilities that some of these machines will benefit from — we must also keep the human factor top of mind. The first beneficiaries of efficient data management and effective data analysis will often be the worker-stakeholders within the business. When we empower employees with intelligence to be able to perform their jobs better (with better machines and processes around them), we ultimately derive greater business value at the end of the day. For want of more tangible examples here, if big data digital IoT transformation is focused on plant machinery, then we could see turbine sensors reporting performance statistics to enable more efficient predictive maintenance. Our business model states: less downtime + better serviced machines = greater business value.


PCI's new rules focus on the chiefs

Troy Leach, the chief technology officer for the PCI Security Standards Council, said in an interview that he finds this lack of involvement problematic and that he fought for the new rule. The rule itself sounds innocuous and possibly even obvious, but there's a lot more to it. The rule, within Requirement 12, mandates that "executive management establish responsibilities for the protection of cardholder data and a PCI DSS compliance program." To Leach's mind, that means that executives have to dig in, assume responsibility for payments security, and stop simply delegating it away. "The intent is that we at least push the visibility to the executive level," Leach said, referring to the full text of the new guidelines. "We need for there to be different C-levels aware of compliance responsibilities."



Quote for the day:


"An organization's ability to learn, and translate that learning into action rapidly, is the ultimate competitive advantage." -- Jack Welch


May 01, 2016

Blockchain Platform Emercoin is Moving Beyond Cryptocurrencies

Now, Emer is a platform that offers two primary product umbrellas: a blockchain-based platform for a variety of services including security, advertising, and legal; and a payment services unit, Emercoin, which also runs through the Emer platform. While the name Emercoin might be recognizable as a cryptocurrency used to send and receive payments, and is ultimately tradeable, it is just one piece of what makes the Emer platform so valuable in the services it offers and the problems it can solve. For developers seeking to build products or solutions based on the Emer platform, they’ve provided a quick-start guide for deploying an Emercoin wallet on an Ubuntu instance within Microsoft Azure.


IT leaders inundated with bimodal IT meme

At first glance, it certainly seems to make sense, and from my perspective bimodal is intellectually useful as a concept: it draws a dividing line between legacy technology efforts, which change much more slowly and must be managed more carefully, and high-velocity new digital projects, which must more faithfully match today's exponential rate of change in market conditions while also effectively applying the latest technologies and techniques. Where the concept breaks down, as I explored in my original critique of bimodal, is in actual execution. The real world of technology and the activities that make it bear fruit cannot be neatly compartmentalized into a dual structure. Not only do the actual needs and demands of individual IT initiatives vary widely, the team skills and processes on the ground are unique for nearly every project as well.


Quantifying Benefits of Network Virtualization in the Data Center

Network virtualization allows IT organizations to deploy their network resources whenever and wherever they need them. IT can rapidly add the capacity to make sure the network delivers the performance and reliability demanded by evolving data center environments. NV provides improved, centralized management and offers microsegmentation to improve data center security and increase compliance. For organizations facing network upgrades, the option to deploy network software on white-box switches can result in significant capex savings. NV deployment is becoming mainstream in leading data center deployments. Organizations are likely to see strong ROI benefits from operational efficiencies – although this ROI is challenging to quantify.


Data Tower: a Data Center for Saruman

As our society increasingly relies on digital services, the unique problems in data center design attract attention from designers outside of the data center world, who propose unorthodox design ideas in attempts to envision new, better ways to build data centers in the future. Another example of this trend is an Estonian startup called Project Rhizome, which is thinking of ways to better integrate data centers into densely populated urban areas. eVolo, an architecture and design journal, has held its futuristic skyscraper design competition since 2006. The other two winners in this year’s contest were a design that proposes a continuous horizontal skyscraper around New York City’s Central Park, which is sunken to create more space for housing with unobstructed views, and a vertical control terminal for drones that would provide services to New York residents.


Data Storytelling: The Essential Data Science Skill Everyone Needs

It’s important to understand how these different elements combine and work together in data storytelling. When narrative is coupled with data, it helps to explain to your audience what’s happening in the data and why a particular insight is important. Ample context and commentary is often needed to fully appreciate an insight. When visuals are applied to data, they can enlighten the audience to insights that they wouldn’t see without charts or graphs. Many interesting patterns and outliers in the data would remain hidden in the rows and columns of data tables without the help of data visualizations.


A Google executive creative director explains what he does for a living 

While Vranakis says there are many agencies out there that have taken a more modern working approach he thinks - without wanting to make sweeping statements about the industry - there are a couple of things that separate the Lab from a traditional agency setup. One is the way in which the creative directors and other experienced executives at agencies often take credit for a team effort. He points to the annual Gunn Report, which showcases the year's most successful campaigns. "I think there are still structures that incentivize the contributor as opposed to the group," Vranakis said. "If you look at Gunn Reports and things like that, they all name individuals. I know that it's just business. If you win an award and your name is attached to the award, you'll get a pay-rise as they'll need to keep you as you'll be headhunted."


Identity Management: Where Cloud Security Falls Short

It's obvious that thinking outside the traditional security perimeter is necessary. Less obvious is how much "controlling the access to data" will contribute to firms being able to adopt cloud services and technologies more safely, Yeoh and Baron continued. The survey identified seven types of perimeter-based security products, and asked respondents how many of them were in use in their organizations. As the table below shows, antivirus, anti-spam, and Virtual Private Networks were the top three solutions in use by respondents. ... When the question turns to which access measures are in place for partners, outsourced IT, and other third parties, the picture changes quite a bit. Only 62% of respondents said they had privileged access management in place for such users, 25% had application to application password management, and 32% had secure password storage.


Why image recognition is about to transform business

Not every company has the resources, or wants to invest in the resources, to build out a computer vision engineering team. Even if you’ve found the right team, it can be a lot of work to get it just right, which is where hosted API services come in. Carried out in the cloud, these solutions offer menus of out-of-the-box image recognition services that can be easily integrated with an existing app or used to build out a specific feature or an entire business. Say the Travel Channel needs “landmark detection” to show relevant photos on landing pages for specific landmarks, or eHarmony wants to filter out “unsafe” profile images uploaded by their users. Neither of these companies needs or wants to get into the deep learning image recognition development business, but can still benefit from its capabilities.


2016: The year of application layer security in public clouds

In 2016, private datacentres will reflect public cloud security realities and secure internal network traffic as well. Encrypted layers of security within a datacentre or public cloud network will help organisations control access and encryption to limit malicious east/west movement. This ‘application segmentation’ at the application layer will add security within the network to strengthen existing datacentre hardware and virtualisation layer security. Enterprise application owners will realise the value of true virtual networks in concept and in practice. No more will network operators believe a VLAN is actually virtual! The limitations of physical network architectures will be magnified once enterprises see the difference between an underlay for bulk transport and an overlay for application-specific use-case tuning. The glaring security holes in physical networks, once obfuscated, will reveal themselves.
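The application-segmentation idea can be sketched as a default-deny policy in which only declared app-to-app flows are permitted, blocking undeclared east/west movement. The application tags and flows below are invented for illustration.

```python
# Illustrative sketch of application-layer segmentation: traffic between
# workloads is denied unless an explicit app-to-app flow is declared.
ALLOWED_FLOWS = {
    ("web-frontend", "order-service"),
    ("order-service", "orders-db"),
}

def is_allowed(src_app, dst_app):
    """Default-deny: only declared flows may cross the virtual network."""
    return (src_app, dst_app) in ALLOWED_FLOWS

print(is_allowed("web-frontend", "order-service"))  # declared flow: permitted
print(is_allowed("web-frontend", "orders-db"))      # undeclared east/west hop: blocked
```

A compromised front end in this model cannot reach the database directly, which is exactly the lateral movement the article warns about.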


Controlling Hybrid Cloud Complexity with Containers: CoreOS, rkt, and Image Standards

CoreOS Linux is an interesting example of system architecture decisions informed by the reality of container clusters. If the container makes applications self-contained, portable, standard units, the operating system should adapt to empower this dynamic use case. In CoreOS, the operating system and basic userland utilities are stripped to their bare minimum and shipped as an integral unit, automatically updated across a cluster of machines. CoreOS may be thought of as a “hypervisor” for containers, that is itself packaged in a discrete and standard way. Utilizing this single image distribution, CoreOS foregoes individual package management in favor of frequent and coordinated updates of the entire system, protected by a dual-partition scheme providing instant and simple rollbacks in the event of trouble.
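The dual-partition update-and-rollback scheme can be modeled in a few lines. This is a toy simulation of the A/B pattern the article describes, not CoreOS's actual update mechanism; image names and the health check are invented.

```python
# Toy model of a dual-partition (A/B) update: the new image is written to
# the inactive partition, the system flips to it, and flips straight back
# if the new image fails its health check.
class DualPartitionSystem:
    def __init__(self, image):
        self.partitions = {"A": image, "B": None}
        self.active = "A"

    @property
    def inactive(self):
        return "B" if self.active == "A" else "A"

    def update(self, new_image, healthy):
        target = self.inactive
        self.partitions[target] = new_image
        previous = self.active
        self.active = target          # boot into the new image
        if not healthy(new_image):
            self.active = previous    # instant rollback: just flip back
            return False
        return True

machine = DualPartitionSystem("coreos-1010.5")
ok = machine.update("coreos-1068.0", healthy=lambda img: True)
bad = machine.update("coreos-broken", healthy=lambda img: False)
print(machine.partitions[machine.active])  # still the last healthy image
```

Because rollback is just switching the active partition pointer back, recovery is near-instant, which is what makes frequent whole-system updates tolerable.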



Quote for the day:


"When data disproves a theory, a good scientist discards the theory and a poor one discards the data." -- Will Spencer

April 30, 2016

Advice To Fintech Firms: How To Partner With Banks

Banks need help, and they have recognized that some of that help will come from Fintech firms. That is why so many banks have created incubators and accelerators. Banks and Fintech firms need each other. It took a while to sink in, but most players now agree. But Fintechs struggle with how to partner with banks, and vice versa. What will a good Fintech partner look like? Without a doubt, banks are looking for partners. They want companies that will share their business goals and understand their vision. They want partners who will measure success the way they do. But what partnering model are we talking about? There are several existing or emerging models.


The Open Group IT4IT Architecture Offers a New Direction

Within IT, there are places where we want to move in a more agile way -- where we want to move faster. There are also certain activities where waterfall is still an excellent methodology to drive the consistency and predictability that we need. A good example of that comes with large releases. We may develop changes or features in a very agile way, but as we move towards making large changes to the business that impact large business functions, we need to roll those changes out in a very controlled, scripted way. So, we take a little bit different look at Bimodal than some companies do. Your other question was on Shadow IT. One of the things that we have challenged a lot over the last year or so is this concept of the role of the IT organization relative to the rest of the enterprise.


The API Economy: A Big Ball of CRUD

Services and APIs are only loosely-coupled in conceptual architecture, but not in the real world, where people still have to manually discover and integrate them at the Application and Process layers. This is the reason why Service Orientation has yet to deliver “Business Agility”. From that high-level Enterprise Architecture perspective (as opposed to the bottom-up IT view) it’s not useful or scalable to be stuck manually considering individual APIs. It would be better to have an abstraction, a model for looking at all of these endpoints as objects that expose functionality, but hide their complexity and automate management of the end-to-end API contract lifecycle in order to advance the Service Orientation architectures.
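One minimal shape such an abstraction could take is a catalog where endpoints are modeled as objects that advertise capabilities while hiding transport details, so consumers discover by capability rather than by hand-picked URL. All service names and capabilities below are hypothetical.

```python
# Sketch of endpoints-as-objects: services register in one catalog and are
# discovered by the capability they expose, not integrated by hand.
class ServiceEndpoint:
    def __init__(self, name, version, capabilities):
        self.name = name
        self.version = version
        self.capabilities = set(capabilities)

class ServiceCatalog:
    def __init__(self):
        self._services = []

    def register(self, endpoint):
        self._services.append(endpoint)

    def find(self, capability):
        """Discover by capability rather than by individual API."""
        return [s for s in self._services if capability in s.capabilities]

catalog = ServiceCatalog()
catalog.register(ServiceEndpoint("billing", "2.1", {"create-invoice", "refund"}))
catalog.register(ServiceEndpoint("crm", "1.4", {"lookup-customer"}))
matches = catalog.find("refund")
print([s.name for s in matches])
```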


Key Differences Between A Point-To-Point Vs. Enterprise-Wide Data Integrity Solution

Extract, Transform and Load (ETL) systems commonly integrate data from multiple applications and systems and are typically developed and supported by different vendors or hosted on separate computer hardware. The disparate systems containing the original data are frequently managed and operated by different employees as well. It is likely you have found some point-to-point tools in your tool kit to help with this enormous data corralling challenge, but what about maintaining the integrity of the data as it moves away from your original system downstream? An even more elusive question to answer is do you need enterprise visibility into the overall integrity of your key business data?
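One simple way to maintain integrity as data moves downstream is to fingerprint it at the source and re-verify the fingerprint at the target, so silent loss or corruption in transit is detected. The sketch below is an illustrative toy, not any vendor's product; the row data is invented.

```python
# Fingerprint rows at the source (row count + checksum), then re-verify
# downstream after the data has moved through the pipeline.
import hashlib
import json

def fingerprint(rows):
    payload = json.dumps(rows, sort_keys=True).encode()
    return len(rows), hashlib.sha256(payload).hexdigest()

source_rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 25.5}]
count, digest = fingerprint(source_rows)

# ...rows travel through the ETL pipeline to the target system...
target_rows = list(source_rows)

intact = fingerprint(target_rows) == (count, digest)
print("integrity ok:", intact)
```

Enterprise-wide visibility is then a matter of collecting these checks from every hop into one place, rather than running them point-to-point in isolation.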


How artificial intelligence is changing the way illness is diagnosed and treated

The industry's interest in AI, Rajan says, has been driven both by rising costs and increasing volumes of data. "There isn't necessarily the capacity to capture and process and understand all of it. I think AI, particularly a lot of early solutions, are targeting those issues -- being able to take large volumes of data, put it through levels of processing that can allow some level of relevancy to crop up to support decision making and influence the course of care." The aim is for AI systems to do what doctors can't always: keep up on every detail of every patient's visit to every specialist or hospital, as well as each pertinent new piece of research, disease outbreak, and public health recommendation. The system must not only digest all that information, but also factor in the patient's symptoms and then recommend a diagnosis or course of treatment that takes all those elements into account.


Big Data & Logistics: 7 Current Trends to Watch

“Hey, ‘Big Data’ is just a big fuzzy word for me,” said a Vice President of an Innovation Center at a big logistics company back in early 2015. Just one year later, he not only has to admit that ‘Big Data’ is the next big revolution, but has already applied big data technology to dramatic effect, significantly growing the business and reducing costs. In fact, he’s been so successful that he’s been given the funding for a new Machine Learning department! More broadly, UPS’s $1 billion investment in big data blows the whistle for logistics companies around the globe to get very serious about becoming data-driven or risk being wiped out. The delta being created by those who have been quick to embrace big data is growing rapidly.


Phil Zimmermann speaks out on encryption, privacy, and avoiding a surveillance state

Talking of Snowden, Zimmermann notes with a certain amount of pride: "Snowden got his hands on some documents that showed some products that [the NSA] had broken the crypto [on]—and none of my stuff was on the list." Silent Circle's Blackphone device runs a security-toughened version of Android it calls PrivatOS. Calls are encrypted end-to-end, which means even the company itself can't hand over the details to anyone. "We have no access to it. None. We can't disclose what we don't have access to," the company says. Since the V&A exhibition opened, the Blackphone has been added to the collection of a second museum—the International Spy Museum in Washington DC. Its 'Weapons of Mass Disruption' gallery explores the challenges facing the intelligence community in the twenty-first century.


Reflecting software architecture evolution in your stack

While the disconnect between the conception of "business processes" as being "business process flows" and event architecture -- or microservices architecture -- seems obvious, it's not being adequately reflected in most enterprise architecture methodologies. The business process output of EA is often prestructured into business process flows and leans toward workflow thinking when translating EA requirements to IT requirements. It is possible to "retroject" an understanding of modern software architecture evolution and design approaches based on the cloud and microservices into today's EA methodologies, but this is difficult to do in a consistent and organized way.


International IT trade group urges firms to prepare for GDPR

“There are challenges ahead. A lot of companies will have their work cut out for them to be compliant in time,” said Bridget Treacy, partner at Hunton & Williams. “All organisations that have not done so already, really have to start thinking in very pragmatic terms about what the GDPR means for the business and how they are going to handle their data assets, because two years is not much time,” she said. The final alarm bell has been sounded, said Stewart Room, cyber security and data protection partner at PricewaterhouseCoopers (PwC). “There are no more alarm bells after this. There is no more pretending. All organisations that have not started preparing now need to start taking this seriously,” he said.


IT guru Batya Friedman talks tenets of value-sensitive design

There's nothing in value-sensitive design that's about a specific technology. It's about how do we foreground what's important to people in the tools and technologies and infrastructure we build. Most of my work has focused on information technology, but other people have applied it to wind turbines, to designing processes for customs in major ports, for transportation systems. ... When you're designing a system, who do you focus on? The language in the field is to talk about users and user-sensitive design. So when people design, they think about who is going to use the technology. We have methodologies for doing user testing, but we know that others are stakeholders, too. So one of the key changes is to bring other stakeholders in to make sure they're considered along with the users.



Quote for the day:


"Just because something doesn't do what you planned it to do doesn't mean it's useless." -- Thomas A. Edison


April 29, 2016

Cyber security in Belgium will gain prominence after terror attacks

There is some good news for the country’s cyber security. Belgium is one of the countries least affected by online banking trojans. And there is a very good reason for that, according to Eddy Willems, security evangelist at Gdata Software. “Most Belgian banks use advanced authentication systems, which makes it more difficult for cyber criminals to obtain the required authentication details to get entrance to the victim’s bank account,” he said. ... Not such good news for Belgium is that it and the other Benelux countries, the Netherlands and Luxembourg, have seen a dramatic increase in the number of ransomware incidents. More incidents were reported in February 2016 alone than in the last six months of 2015, according to research by security supplier Trend Micro.


In the digital enterprise, everyone is a security newb

In the digital enterprise, protecting critical data has changed. Communication is the missing ingredient: security teams and other business leaders, who are focused on different objectives like sales goals or the customer experience, aren't exchanging the information each side needs. "Those department heads are so concerned about keeping their own systems up so that they can continue bringing in revenue, that they overlook security. For example, the managers of a POS system do not want to have their IT guy take the system offline for an hour to fix a patch during Black Friday," Stolte said. In order to best defend against the threats of malicious actors, leaders across all departments need to become more security savvy. "Line of business and application owners, those who manage assets that contain valuable information, must first recognize that the information they manage is of high value and they must communicate with the security team," Stolte said.


IT performance management pegged for increase of virtualization tools

"There are [fewer] IT organizations using cloud services than everybody expects," said Edward Haletky, CEO and principal analyst for The Virtualization Practice LLC. There has been a large increase in shadow IT, wherein cloud services are purchased ad hoc by workers, but since IT pros are not involved with these unknown services, they aren't factored into decisions about what management tools to buy. Many companies have just started to branch out into the cloud. MetLife Insurance, for example, started with development platforms and has moved to putting new and some existing apps in the cloud, but most of its IT operations are still in owned and hosted data centers, according to Tony Granata, assistant vice president of capacity performance & monitoring engineering at the New York-based insurer.


Rip up the script when assembling a modern security team

Hiring analysts who’ve worked at the same companies or attended the same schools means you may end up with a team that approaches security issues in a similar manner. If they all think alike, they’ll probably miss the same security blind spots. ... look for people who have worked in different companies and industries and have experience fighting a variety of threat vectors. Ideally, your team will include someone with either a military or government background. They’ll have a completely different way of looking at security, forcing your company out of its comfort zone. Military personnel are often familiar with nation-state attacks and malicious intent and understand how complex offensive operations work. And with hackers launching advanced attacks against companies, people who have experience dealing with these threats can apply their knowledge to defend a business.


Don't overlook these two hidden risks to your corporate data

The data your SMB partners have in hand may seem minimal, but it's still critical corporate data. Contact information and services rendered may be valuable to an individual who hacks the SMB's network. As an InfoSec professional, you know what measures you have in place—but what about the SMB? The extent of its security depends upon available resources and what's affordable for it to implement. ... When it comes to internal colleagues exposing sensitive corporate data, it's all about timing and pulling the emotional strings. Along with having the technical skill set to spoof a business's email address, the attacker executed the data breach beautifully by understanding the time of year and knowing who to target at a busy time. The accountant's inbox was potentially flooded with deadline notifications and requests, which created a stressful environment. This makes for an easy target.


The Holistic Approach: Preventing Software Disasters

Understanding each kind of source code and script, interpreting the configuration files, evaluating the values of variables throughout the execution cycle, and finally piecing all these findings together to reverse-engineer the system blueprint gives CIOs an “X-Ray view” into the inner workings of their organization’s software systems and empowers the CIO to make data-informed decisions to fortify overall software quality. ... But looking at the unit-level source code is not everything; it’s just the beginning - specifically with modern architectures where loose coupling between the different layers is a must. Hence CIOs must also X-Ray the “glue” between software layers and components, which is sometimes defined in configuration, property files or annotations stored directly inside the source code files.


10 Free Tools For API Design, Development And Testing

The rise of RESTful APIs has been met by a rise in tools for creating, testing, and managing them. Whether you’re an API newbie or an expert on an intractable deadline, you have a gamut of services to help you get your API up and running quickly, and many of them won’t cost you a dime. Following is a sampling of free services for working with APIs: load testers, API designers, metrics collectors, and much more. Some are quick and dirty applications to ease the job of assembling an API. Others are entry-level tiers for full-blown professional API services, allowing you to get started on a trial basis and later graduate to a more professional level of (paid) service if and when you need it.


Is There a Need to Redesign Cyber Insurance?

As insurers increasingly focus on operational risk — that is, failure due to systems, processes, people and external events — as a key element of managing their capital adequacy and solvency, how will the regulators and insurance commissioners view the potential increase in the risk of someone infiltrating an insurer’s own site through some form of remote device? Overall, there seems to be agreement that prevention is better than cure, but where cyber crime happens, it is critical that companies carry appropriate insurance cover. Cyber insurance cover has been around for a decade or so, but as cyber crime has developed, then doesn’t insurance cover also need to mature? With policies provided by some major insurers giving cover to $100m, isn’t it time to think about whether this is enough?


You'll soon be using GPU as a Service

As an example of this new wave, AMD and AP are collaborating to bring immersive experience to news and storytelling. This can significantly enhance the ability to get information to content viewers, while also providing a more concise way to impart information, including the ability to see multiple perspectives, exhibit full dimensional accuracy, get a better sense of time, etc. Although still a niche market, this will help accelerate the adoption of VR clients. In addition to VR clients, the need to process immersive information means that there will be a significant need for high performance graphics processors -- not only at the individual server level, but available as an on-demand service based in the cloud. GPUs as a Service will expand greatly over the next 2-3 years, and will eclipse the PC GPU market in sheer numbers of units.


Production Like Performance Tests of Web-Services

Tests should always keep the end-user view in mind to ensure that the software wins acceptance from its users. But how to test web services which are not directly customer-facing, and in particular, how to performance test them in a meaningful way? This article outlines performance test approaches that we have developed and proven effective at HERE, a leading location cloud company. ... Tests should be created with knowledge regarding the end user so that they are effective and risk-based. Because of this factor, as well as release techniques such as canary releases and feature toggles, the line between tests run prior to the release and tests of the released software in production becomes blurred.
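A minimal latency-percentile harness illustrates the kind of measurement that matters for a non-customer-facing web service, where tail latency is usually more meaningful than an average. The `fake_service` function below is a stand-in for a real HTTP call; the latency range is invented.

```python
# Drive a service function repeatedly and report p50/p95 latency.
import random
import time

def fake_service():
    time.sleep(random.uniform(0.001, 0.003))  # simulated backend latency

def percentile(samples, pct):
    ordered = sorted(samples)
    index = min(len(ordered) - 1, int(pct / 100 * len(ordered)))
    return ordered[index]

latencies = []
for _ in range(50):
    start = time.perf_counter()
    fake_service()
    latencies.append(time.perf_counter() - start)

p50, p95 = percentile(latencies, 50), percentile(latencies, 95)
print(f"p50={p50 * 1000:.2f}ms p95={p95 * 1000:.2f}ms")
```

In a real harness the same loop would run against a staging or canary deployment, and the percentile thresholds would come from the service's risk-based requirements.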



Quote for the day:


"Fear causes hesitation and hesitation will cause your worst fears to come true." -- Patrick Swayze


April 28, 2016

Vulnerability in Java Reflection Library Still Present after 30 Months

The ability to load classes dynamically at runtime using custom ClassLoaders has created the opportunity for a number of applications that wouldn’t be possible otherwise, but unfortunately it has also created a number of security concerns, particularly around class impersonation. A developer could, in theory, create a custom ClassLoader that loads a compromising implementation of the primordial class java.lang.Object, and use this custom Object in a Java application. ... When the issue resurfaced in March 2016, the latest available version at the time, 8u74, proved to be vulnerable. Since then, Oracle has released three updates for Java, namely 8u77, 8u91, and 8u92. However, judging by their release notes, none of those seems to have addressed the problem.


Docker on Windows Server 2016 Technical Preview

To build and run your first Windows container, get Windows Server 2016 TP5 running, begin writing Dockerfiles for Windows, share images on Docker Hub and don’t hesitate to reach out with questions or feedback on the Docker Forums. ... Docker and Microsoft have come a long way since the 2014 partnership announcement of the Windows Server port of Docker engine, through the first publicly available version, up to today’s release. This journey also saw John Howard from Microsoft join the ranks of core Docker maintainers. We’re proud of the progress we’ve made to empower developers and ops teams using Windows with Docker’s proven tools and APIs for building, shipping and running containers, and that we can help bring together the Windows and Linux communities with a common toolset for shipping software.


Paying ransomware is what ills some hospitals

Data backups are the key to surviving ransomware attacks. But some hospitals and physician practices don’t back up their data at all. This lack of security awareness puzzles McMillan. “It’s possible that security is still not seen as a critical business function” in those organizations, he suggests. Even if a hospital or a physician group does back up its data, it might do so only on a nightly basis. So, if a ransomware attack occurs and the organization uses its data backup to continue operations, the database will be missing everything that has been entered into the system since the previous evening, notes Gibson. That’s much better than nothing, but it will still send clinicians scrambling. Many hospitals do near-real-time backups of data on mirrored servers, so if one server goes down, the other can take up the slack.


Technologically Constrained Banks Face A Challenge From Agile Fintech Firms

“Banks still struggle with legacy systems and with their culture.” One European bank partnered with a fintech firm on a project. Eighteen days later the technologists had an app and a proof of concept, while the bank was still struggling with deciding who should be in the room to meet with them. Interviews with bank CXOs revealed some startling contrasts, with some large banks realizing the need for change while some regionals and super-regionals don’t think fintech innovations will impact them. “Banks are underestimating the value fintech firms provide in delivering a good experience and efficient service, as well as their potential influence on all areas of banking,” the report said. “From the customers’ perspective, fintech firms have value in being easy to use (81.9 percent), offering faster service (81.4 percent), and providing a good experience (79.6 percent).”


Man jailed for failing to decrypt hard drives

"His confinement stems from an assertion of his Fifth Amendment privilege against self-incrimination," wrote the man's lawyer, Keith Donoghue. The US Constitution's Fifth Amendment is designed to protect people from being forced to testify and potentially incriminating themselves and states: "No person shall be... compelled in any criminal case to be a witness against himself." The Electronic Frontier Foundation, which campaigns for digital rights, said: "Compelled decryption is inherently testimonial because it compels a suspect to use the contents of their mind to translate unintelligible evidence into a form that can be used against them." The man's appeal also contends that he should not be forced to decrypt the hard drives because the investigators do not know for certain whether indecent images are stored on them.


Singapore Is Taking the ‘Smart City’ to a Whole New Level

“Singapore is doing it at a level of integration and scale that no one else has done yet,” says Guy Perry, an executive of the Los Angeles engineering design firm Aecom who studies “smart city” technologies. It helps in Singapore that government- or state-owned companies own or control many aspects of daily life, including public transport networks and housing. More than 80% of Singapore’s 5.5 million people live in government housing. And while Singapore is a democracy, it has always been dominated by a single party whose control of the system means it can move quickly. Leaders also see a chance to pioneer applications for export. The market for smart-city technology in Asia alone will reach US$1 trillion a year by 2025, according to IDC Government Insights, a unit of International Data Corp., the Framingham, Mass., research firm.


Why Won’t They Pair?

The organizational challenges continue from the physical equipment to how developers are rated for recognition, raises and promotions. If an organization stack ranks their employees, the chance of developers learning to pair effectively is severely hampered. In many cases, the developer wants to be seen as the super hero thereby raising their rank above their peers. Performance reviews are another blocker. Few companies recognize teamwork as a valued skill and instead look for the ‘super hero’ who can come in to save the day during a crisis. Further, organizations that consistently work in a tactical fire-fighting mode will struggle to see the value that comes from pairing where developers share knowledge of technical and domain expertise.


BIP 75 Simplifies Bitcoin Wallets for the Everyday User

One of the main downfalls of BIP 70 is that it doesn’t work well for P2P payments. While it gets the job done for transactions between a customer and a merchant, Bitcoin wallets are unable to receive payment requests when they’re offline. Store-and-forward servers can be used to forward new payment requests to wallets when they come online, but this setup creates new privacy and security concerns. BIP 75 is an attempt to solve this issue by encrypting all communication in the Payment Protocol end to end. “By adding encryption at the application layer we create secure private communications, even in the case where there is a store and forward server for mobile or desktop wallets,” Netki founder and CEO Justin Newton, who co-authored BIP 75, told Bitcoin Magazine.


Ransomware-as-a-service is exploding: Be ready to pay

It starts with a fast click on a link in a harmless-looking email. Then your PC slows to a crawl. A message suddenly pops up and takes over your screen. "Your files and hard drive have been locked by strong encryption. Pay us a fee in 12 hours, or we will delete everything." Then a bright red clock begins counting down. No antivirus will save your machine. Pay the fee or lose everything. You're the latest victim of a ransomware attack. The scary thing is, you're not alone. The ransomware market ballooned quickly, reported TechRepublic's Michael Kassner, from a $400,000 US annual haul in 2012, to nearly $18 million in 2015. The average ransom—the sweet spot of affordability for individuals and SMBs—is about $300, often paid in cash vouchers or Bitcoin.


Cyber Attacks on Small Businesses on the Rise

Almost half of cyber-attacks worldwide last year, 43%, were against small businesses with fewer than 250 workers, Symantec reports. The FBI reported last summer that more than 7,000 U.S. companies of all sizes were victims of cyber hacks via phishing email scams as of late 2013, the latest data available, with losses of more than $740 million. The cyber crooks steal small business information to do things like rob bank accounts via wire transfers; steal customers’ personal identity information; file for fraudulent tax refunds; commit health insurance or Medicare fraud; or even steal intellectual property. The criminals can also hijack a small business’s website to cyberhack other small businesses. “There are probably 20 different ways a bad guy can get into a website” run by a small business, Scott Mann, CEO of Orlando-based Highforge Solutions, has said.



Quote for the day:


"A business of high principle attracts high-caliber people more easily, thereby gaining a basic competitive and profit edge."-- Marvin Bower


April 27, 2016

A History of Containerology and the Birth of Microservices

Not only has Microsoft jumped on the container bandwagon, but they also shared the vision of Docker’s application-focused model for containers. Microsoft partnered with Docker, and as a result, one can run Linux or Windows containers with Docker. Being able to run applications in either Linux or Windows hosted containers will provide companies flexibility and reduce any refactoring costs associated with rewriting, tweaking or re-architecting existing applications. The bold new world that containerology will take us to is that of microservices. In my opinion, microservices (specifically as enabled by Docker) represent the first feasible step towards mechanized or industrialized applications. In the mechanical engineering world, complex systems were built by buying off-the-shelf components and widgets. In contrast, the software world was accustomed to fabricating every part needed to build complex applications.


API security: Key takeaways from recent breaches

A good practice in approaching API security is first and foremost to know your API assets. An API management suite can help identify the API and exact version, whether in development, QA or production, tracked by its internal registry. This is instrumental in controlling API sprawl. And in the event of a breach, knowing the exact variables in play at the time of the breach will help to expedite the solution. A second detection strategy is knowing your consumers and solidifying their authentication. While most companies may start out exposing their APIs publicly, allowing developers to freely build applications using the APIs, it may help to configure multilayer security elements right down to the API level so that API consumers are easily identifiable. This is also crucial as API providers rely on standards such as OpenID for single sign-on between different applications.
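The "know your API assets" advice amounts to keeping an internal registry of every API's exact version and lifecycle stage, so that during an incident the variables in play are immediately known. The sketch below is an illustrative toy, not a specific API management product; all API names are invented.

```python
# Minimal internal API registry: track each API's exact version and
# lifecycle stage so production assets are always enumerable.
class ApiRegistry:
    def __init__(self):
        self._entries = {}

    def register(self, name, version, stage):
        self._entries[(name, version)] = stage

    def in_production(self):
        """All API names currently serving production traffic."""
        return sorted(n for (n, v), s in self._entries.items() if s == "production")

registry = ApiRegistry()
registry.register("payments", "v2", "production")
registry.register("payments", "v3", "qa")
registry.register("accounts", "v1", "production")
print(registry.in_production())
```

A commercial API management suite adds discovery, policy enforcement, and consumer authentication on top, but the core inventory discipline is this simple.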


5 years into the ‘cloud-first policy’ CIOs still struggling

"The greatest challenge is not getting a contract in place, but what you find out is where those boundaries cross of who's now responsible because you're in a different infrastructure set-up, and what the cloud provider's going to do versus the contract staff, versus the application support staff versus the infrastructure staff," Andrews says. "So, that's the greatest challenge we're having now is defining roles and responsibilities and who's going to do what because the world has changed as we've known it, and we've been client-server for so many years that this is truly a different environment for us." Andrews recalls a recent meeting concerning the role of a cloud vendor and a somewhat tense discussion about "what does the word 'manage' mean in a cloud environment," and who has ownership over the systems and who bears responsibility for resolving the inevitable problems when they arise.


Third Generation Robo-Advisors Are Born

The application of machine learning to robo-advisory is still in inchoate stages, and only a few firms have stepped forward describing plans. Little-known Marstone (which focuses on business-to-business advice) has partnered with IBM Watson to deliver some form of cognitive-computing powered advice. It appears that Wealthfront will use artificial intelligence to provide more data-driven and personalized investment recommendations on its Dashboard. Personalization will be dynamic and driven by the client’s specific risk tolerance, financial profile, and investments as assessed across aggregated accounts. Machine learning in robo-advisory may also analyze, adapt to, and learn from investor behavior and correct for cognitive biases. As Wealthfront states, “observed behavior may reveal insights about ourselves that we aren’t even consciously aware of.”


Data Visualization Drives the Era of Information Activism

The information activism trend draws parallels to the printed word. From the invention of the Gutenberg printing press until the advent of the Internet, the ability to write and publish information was a highly technical skill, in the hands of a select few individuals. The arrival of blogging made the written word a mass activity, open to all. Similarly, people are now eager to express themselves using data visualization to tell engaging and visually stimulating stories without the need for a graphic artist or cartographer. They can just do it for themselves. ... Information activism is catalyzing a renaissance in the world of data, transforming the entire field of analytics. People are no longer mere data consumers, passively waiting for information.


MIT’s Teaching AI How to Help Stop Cyberattacks

A system called AI2, developed at MIT’s Computer Science and Artificial Intelligence Laboratory, reviews data from tens of millions of log lines each day and pinpoints anything suspicious. A human takes it from there, checking for signs of a breach. The one-two punch identifies 86 percent of attacks while sparing analysts the tedium of chasing bogus leads. ... Most of AI2‘s work helps a company determine what’s already happened so that it can respond appropriately. The system highlights any typical signifiers of an attack. An extreme uptick in log-in attempts on an e-commerce site, for instance, might mean someone attempted a brute-force password attack. A sudden spike in devices connected to a single IP address suggests credential theft.


Backlash against a bimodal IT strategy

The big problem with a bimodal IT strategy is that it doesn't go far enough, according to the authors. Rather than face digital business head on, bimodal IT is a more staggered introduction, giving CIOs a chance to continue clinging to the security and the stability of tradition rather than fully accept the unpredictability and even the riskiness that come with going fast. "Yes, it's a big transition, but if you only do it partway, you're going to make it so much harder on yourself," Sharyn Leaver, Forrester analyst and an author of the report, said during a recent webinar. One of the consequences of going digital "partway" is that it introduces complexity. Divvying up IT tasks can result in two separate technology stacks and two separate teams that develop different value systems, different cultures and are evaluated on different metrics -- all of which CIOs will eventually have to untangle if they want to fully align with the business and move at a faster pace, according to Leaver.


What The Google I/O Schedule Tells Us About The Future Of Android

Google has big ambitions in virtual reality. Cardboard is just the start, as there have been rumors of the company building its own VR headset and indications from Android N about how the operating system will give more native support to VR. So set your eyes on the VR at Google session on May 19, which is hosted by Clay Bavor, Google’s vice president of virtual reality (who also has a fascinating photography blog). Right now Facebook-owned Oculus is leading the VR game and Google’s frenemy Samsung makes the most popular consumer device in the Gear VR. So expect Google to invest heavily to ensure the company’s services are where the Internet is going. YouTube, as an example, recently added support for VR and 360-degree video.


Will Healthcare Data Encryption be Impacted by NIST Guide?

NIST produced a development process for cryptographic standards and guidelines based on nine principles, which are transparency, openness, balance, integrity, technical merit, global acceptability, usability, continuous improvement, and innovation and intellectual property. Notably, NIST added the global acceptability principle to the final draft after public comments suggested that the organization address the global nature of the current economy and exchange of information. The final document reiterates NIST’s intentions to fostering collaborations with all stakeholders, such as security professionals, researchers, standard developing organizations, and users, to establish strong encryption standards and processes. Stakeholders who contribute to the development process are also part of a variety of industries, including healthcare, academia, and government.


Null Object Design Pattern in Automated Testing

In object-oriented computer programming, a Null Object is an object with no referenced value or with defined neutral ("null") behavior. The Null Object Design Pattern describes the uses of such objects and their behavior (or lack thereof). ... The main idea is that sometimes we need to add promotional codes and then assert that the correct amounts are displayed or saved in the DB. As you can assume, there are various ways to accomplish that. One way is to use the UI directly and assert that the text is present in the labels. Another way is to access the DB directly, insert the promotional code, and then assert on the calculated entries saved in some of the DB's tables.
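A minimal sketch of the pattern in the promotional-code scenario might look like the following. The `PromotionalCode` abstraction and its implementations are hypothetical, invented for illustration rather than taken from the article's test suite.

```typescript
// A promotional code contributes a discount to an order total.
interface PromotionalCode {
  discount(total: number): number; // amount to subtract from the total
}

class PercentOffCode implements PromotionalCode {
  constructor(private percent: number) {}
  discount(total: number): number {
    return (total * this.percent) / 100;
  }
}

// The Null Object: same interface, deliberately neutral behavior.
// Callers never check "is there a code?" before using it.
class NullPromotionalCode implements PromotionalCode {
  discount(_total: number): number {
    return 0;
  }
}

// Production and test code stay free of null checks.
function finalAmount(total: number, code: PromotionalCode): number {
  return total - code.discount(total);
}
```

A test for the "no promotional code" path simply passes a `NullPromotionalCode` instead of `null`, so the assertions read identically in both cases.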



Quote for the day:


"When you do the common things in life in an uncommon way, you will command the attention of the world." -- George W. Carver


April 26, 2016

What’s eating your lunch? A tale of strategy and culture

“We can’t do what you’re suggesting,” the head of sales shouted at one of his colleagues. “Product development will never deliver on time and we will be stuck with a financial target that there is no way we can meet! They screwed us over last year and we’ve been racing to close the gap for the last 10 months. Our sales teams are spent and frustrated!” These leaders were part of a company that had grown from a young startup, full of energy and fresh ideas, to a billion-dollar firm with thousands of employees. Today, it bears little resemblance to the firm they had all joined years before, and the leaders were experiencing the frustration of navigating a bureaucracy that they themselves had a hand in creating.


Agile is Dead

Who said Agile is dead? The founders of Agile and its practitioners said it, not me. Don't go thinking I made this up. ... In the meantime when you say "Agile Software Development" everyone will know you are referring to just another methodology, one that failed to produce the promised results, one that was widely implemented inadequately, one no better than Waterfall or Spiral overall, one with certain relative strengths and even more weaknesses. No more magic dust. Several of the founders of Agile Software Development and many other influential developers have pronounced it dead. Only consultants and managers with a vested interest in the brand-name "Agile" still want it alive.


How a CIO can help the CEO drive business growth

CIOs are highly skilled at using technical expertise to "keep the IT engine" working 24/7 while simultaneously using creative skills to facilitate the innovative use of new technologies for growth and customer engagement. CIOs need to embrace this dual role, with emphasis placed on strategic business matters. In situations where the CEO and senior executives feel that their CIO is not sufficiently business-centric, a new trend of engaging a chief digital officer (CDO) is emerging to accelerate the flow of digital benefits into the "front office," or customer-facing areas. This may not be necessary if CIOs can redefine their role as business leaders responsible for leveraging technology advancements for business growth. They should take an ‘outside-in’ approach to their business rather than the traditional ‘inside-out’ approach.


Exclusive Q&A: IBM Security’s Marc van Zadelhoff 100 Days In

Customers are putting security controls in place, but they’re missing the big picture of a Big Data security platform and a team, a SOC (security operations center), that leverages Big Data analytics — our QRadar platform — and has the ability to hunt for the attacker as opposed to looking at historical data. We’re enabling them to transform their security operations with forward and predictive analytics around attacks, compliance and insiders. I think this year will be the year of the SOC transformation that’s going to be driven by the increase in ransomware, the increase in high-value data theft such as health care data. It’s ransomware, it’s the theft of high-value data, it’s the emergence of IoT (Internet of Things) and cloud — all these things mean you have to have a highly analytical SOC in place, and that’s what we’re helping customers to do.


FBI Says It Will Ignore Court Order If Told To Reveal Its Tor Browser Exploit

There are a bunch of different cases going on right now concerning the FBI secretly running a hidden Tor-based child porn site called Playpen for two weeks, and then hacking the users of the site with malware in order to identify them. The courts, so far, have been fine with the FBI's overall actions of running the site, but there are increasing questions about how it hacked the users. In FBI lingo, they used a "network investigative technique," or NIT, to hack into those computers, but the FBI really doesn't want to talk about the details. In one case, it was revealed that the warrant used by the FBI never mentions either hacking or malware, suggesting that the FBI actively misled the judge. In another one of the cases, a judge has declared the use of the NIT to be an illegal search, mainly based on jurisdictional questions.


Angular 2 and TypeScript - A High Level Overview

AngularJS is by far the most popular JavaScript framework available today for creating web applications. And now Angular 2 and TypeScript are bringing true object-oriented web development to the mainstream, in a syntax that is strikingly close to Java 8. According to Google engineering director Brad Green, 1.3 million developers use AngularJS and 300 thousand are already using the soon-to-be-released Angular 2. ... You can also develop Angular 2 apps in JavaScript (both ECMAScript 5 and 6) and in Dart. In addition, the Angular team integrated yet another Microsoft product - the RxJS library of reactive JavaScript extensions - into the Angular 2 framework. Angular 2 is not an MVC framework, but rather a component-based framework. In Angular 2 an application is a tree of loosely coupled components.
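The "tree of loosely coupled components" idea can be sketched without the framework itself. This is not Angular API (real Angular 2 components use `@Component` decorators and templates); the names below are invented to show only the composition structure.

```typescript
// Framework-free sketch: every component exposes the same small interface,
// and containers know their children only through that interface.
interface Component {
  render(): string;
}

// A leaf component knows only its own output.
class Label implements Component {
  constructor(private text: string) {}
  render(): string {
    return `<span>${this.text}</span>`;
  }
}

// A container component composes arbitrary children.
class Panel implements Component {
  constructor(private children: Component[]) {}
  render(): string {
    return `<div>${this.children.map((c) => c.render()).join("")}</div>`;
  }
}

// The application is simply the root of the component tree.
const app = new Panel([new Label("Hello"), new Panel([new Label("World")])]);
```

Because each node depends only on the `Component` interface, any subtree can be replaced or tested in isolation, which is the loose coupling the component model is after.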


New regulatory environment demands CCOs become ‘compliance technologists’

As companies attempt to take a global approach to compliance, 48% of symposium attendees reported that their organizations take a centralized approach to cross-border regulations, while some have run into issues scaling the compliance function due to the fragmented nature of local regulations. More than a third of respondents said their firms preferred a regional set-up over a more centralized approach. Beyond the teams themselves, an often overlooked area that CCOs need to consider is how their technology systems will evolve and adapt across the enterprise, particularly as rules are increased or changed in multiple countries and jurisdictions.


SWIFT warns customers of multiple cyber fraud cases

Monday's statement from SWIFT marked the first acknowledgement that the Bangladesh Bank attack was not an isolated incident but one of several recent criminal schemes that aimed to take advantage of the global messaging platform used by some 11,000 financial institutions. "SWIFT is aware of a number of recent cyber incidents in which malicious insiders or external attackers have managed to submit SWIFT messages from financial institutions' back-offices, PCs or workstations connected to their local interface to the SWIFT network," the group warned customers on Monday in a notice seen by Reuters. The warning, which SWIFT issued in a confidential alert sent over its network, did not name any victims or disclose the value of any losses from the previously undisclosed attacks. SWIFT confirmed to Reuters the authenticity of the notice.


How to prepare your business to benefit from AI

Both customer-centricity and the ability to act on that understanding of the customer ask a lot of these organizations. What we're seeing is that a lot of organizations are introducing chief digital officers or VPs of Digital who are responsible for the overarching customer experience, or for the overarching ability to understand the customer through data. ... For artificial intelligence to be truly useful and truly holistic, it needs to be connected across all these different functions, and organizations are going to have to think a lot differently about how they want to deploy technology like this to be able to take advantage of it. Ultimately, most organizations today aren't really structured to take advantage of being truly customer-centric and having the ability to act on that understanding with algorithms or insights or machine learning and so forth.


Juniper's New 100-Gbps Firewall Is 'Absolutely Ridiculous -- In A Good Way'

"A 100 Gbps virtual firewall sounds absolutely ridiculous -- in a good way," said Dominic Grillo, executive vice president of Atrion Communications, a Branchburg, N.J.-based solution provider and longtime Juniper partner. "That's really impressive. You're seeing more people looking towards protecting things east-west [server-to-server] internally, so the more you can enable in that virtual environment, the better. A 100-Gbps [firewall] would be a great new asset for us." The new cSRX is a software-defined networking (SDN) controlled firewall providing advanced layer 4 to layer 7 microservices that Juniper says is the industry's fastest virtual firewall. The cSRX includes content security, Juniper's application security suite and unified threat management for providing security as a service in large multi-tenant cloud networks.



Quote for the day:


"Leaders are visionaries with a poorly developed sense of fear and no concept of the odds against them." -- Robert Jarvik