September 06, 2015

C++ encapsulation for Data-Oriented Design: performance

To enable DOD for a particular class (like the particle we used in the previous entry), i.e., to distribute its different data members in separate memory locations, we change the class source code to turn it into a class template particle&lt;Access&gt;, where Access is a framework-provided entity in charge of granting access to the external data members with syntax similar to that of members of the class itself. Now, particle&lt;Access&gt; is no longer a regular class with value semantics, but a mere proxy to the external data without ownership of it. Importantly, it is the members and not the particle objects that are stored: particles are constructed on the fly when needed, using their interface to process the data.
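The original post works through the C++ template mechanics; as a language-neutral sketch of the underlying idea (structure-of-arrays storage plus an ownerless proxy constructed on the fly), here is a minimal Python version with invented names:

```python
import array

# Structure-of-arrays storage: each "member" lives in its own array,
# which the framework may place wherever it likes in memory.
xs  = array.array("d", [0.0, 1.0, 2.0])
vxs = array.array("d", [0.1, 0.2, 0.3])

class Particle:
    """Ownerless proxy: holds only an index into the external arrays."""
    __slots__ = ("i",)

    def __init__(self, i):
        self.i = i

    @property
    def x(self):
        return xs[self.i]

    def advance(self, dt):
        # Reads and writes go straight to the external storage.
        xs[self.i] += vxs[self.i] * dt

# Only the arrays persist; particles are constructed on the fly.
for i in range(len(xs)):
    Particle(i).advance(0.5)
```

Python cannot express the zero-overhead template indirection the post describes, but the shape is the same: the proxy owns nothing and merely routes member access to externally stored data.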


Why your big data strategy is a bust

"Hard." "Huge." "Dramatic." These are the words used to describe the kind of change required to truly become data-driven. Most companies simply aren’t up to the task. Have no fear, though: It’s the boss’s fault. If you ask business leaders to name their strategic challenges, “making fact-based business decisions based on data” tops the list (48 percent of respondents to the Forbes survey). But when you ask them how willing they are to trust that data, a less rational picture emerges. A Fortune survey of 720 senior business leaders revealed that 62 percent tend to trust their gut rather than data, and 61 percent indicated real-world insight tops hard analytics when making decisions. In other words, the problem with truly embracing big data starts at the top.


The Hierarchy of IoT “Thing” Needs

Finally, since the “Thing” is a thing, it must meet a specific need or bring a value to be useful. As such, we must also account for its ability to meet functional expectations as a core existence requirement. Once the core physical needs of “Things” are met, and before external connectivity is possible, security is needed. To be quite clear: Security is key for IoT adoption, and thus needs to be addressed for individual “Things” that can be externally accessible. Accessibility does not mean just connectivity. It also applies to things that can be physically “cracked” open, where lack of security could put stored data at risk. ... This truth really has to be faced early on in the creation of each IoT device. Every “Thing” in IoT requires a means to encode, encrypt and authenticate its data.


10 Reasons Why Digital Transformation Initiatives Fail

So the good news is you have taken the plunge, recognised that you and your organisation need to embrace the digital present, hired an appropriate partner to help with your transformation and are ready to get started. The bad news is that your chances of success aren't great, but there are plenty of things to do to help move the process along. Unless you are incredibly naive and believe that your transformation partner is just going to 'do' everything for you--like an agency--then you will appreciate that this is about changing from within with a little help from the outside. As such it's down to you to do everything you can to make it work, so here are some things to watch out for.


Designing Your Network Infrastructure For Disaster Recovery

One piece of feedback we've heard consistently is the need for prescriptive guidance on how to design the network infrastructure for disaster recovery. Such a design helps guarantee the best possible RTO by bringing replicated virtual machines online in either the secondary data center or Microsoft Azure. This whitepaper is directed to IT professionals who are responsible for architecting, implementing, and supporting business continuity and disaster recovery (BCDR) infrastructure, and who want to leverage Microsoft Azure Site Recovery (ASR) to support and enhance their BCDR services. This paper discusses practical considerations for System Center Virtual Machine Manager server deployment, the pros and cons of stretched subnets vs. subnet failover, and how to structure disaster recovery to virtual sites in Microsoft Azure.


Tablet shipments will fall by 15 percent this year

Anita Wang, TrendForce's notebook analyst, noted that "Tablets have yet to evolve beyond their main role as entertainment devices". However, Microsoft's Surface range "and Apple's upcoming 12.9-inch iPad change their functions depending on situations. They therefore can assist in the expansion of tablet applications by capturing a share of the business application market." Shipments of Microsoft Surfaces will grow from 1.5 million in the first half of 2015 to 2.6 million in the second half, according to TrendForce. It says: "The success of Surface 3 also proves that 2-in-1 PCs with better specs have the potential to expand into the business application market. Based on TrendForce's analysis, Microsoft's tablet shipments this year will soar 52 percent year on year and hit the four-million-unit mark."


Three Reasons Data Science Is The Job Factory Of Manufacturing

A Forbes review of numbers from Wanted Analytics showed manufacturers listed 15% of Big Data related job openings at mid-year, compared with "professional, scientific and technical services" (25%), Information Technologies (17%), finance and insurance (9%) and retail trade (8%). Put differently: outside of IT itself, manufacturing is hiring more data gurus than any other single industry. Big Data positions include data analysts and scientists, solution architects, data platform engineers and Linux/Java/Hadoop/SQL engineers. While pure size plays a role – one in six private-sector US employees works in manufacturing – there is no question that factories are punching above their historical weight when it comes to data.


Four ways your business can start using AI for automation

The Merriam-Webster dictionary defines artificial intelligence (AI) as "An area of computer science that deals with giving machines the ability to seem like they have human intelligence," and offers an alternate definition of "the power of a machine to copy intelligent human behavior." The first AI programs were developed in the 1950s and weren't commonly used in business. The big data analytics revolution has finally taken AI out of the halls of academia and research institutions, and has plugged it into commercial applications for business. Commercialized AI is still fundamentally new for many companies, though. The keys to business success with AI are to know how to use it and know what results to expect.


Why Big Data Alone Is An Inadequate Source Of Customer Intelligence

For many businesses, big data emerged in recent years with great expectations—that it could answer all questions about customer desires and behaviors. Today, however, many people believe big data alone can’t deliver what they want: actionable information with which they can make effective decisions that serve their customers and their bottom line. Many companies are trying to figure out what value big data can give them, and how to gather, mine and make sense of it. Unfortunately, big data can present several challenges within a company, which is why most big data projects fail. More importantly, however, smart companies are starting to figure out that big data alone isn’t a sufficient source of customer intelligence.


BGP for Humans: Making Sense of Border Gateway Protocol

You can think of an autonomous system in the computer world as a city with many streets. A network prefix is similar to one street with many houses. An IP address is like an address for a particular house in the real world, while a packet is the equivalent of a car travelling from one house to another using the best possible route. Taking this comparison to its logical conclusion, the BGP routing protocol is analogous to your trusty GPS navigator. Like Google's Waze application, the best route is determined by different factors, such as traffic congestion, roads temporarily closed for maintenance, etc. The path is calculated dynamically depending on the situation of the network nodes, which are like roads and junctions on a GPS map.
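The GPS analogy maps onto BGP's actual decision process, which compares route attributes step by step. A simplified sketch of the first two tie-breakers, with invented route values (real BGP applies several further tie-breakers after these):

```python
# Three candidate routes to the same prefix (all values invented).
routes = [
    {"next_hop": "10.0.0.1", "local_pref": 100, "as_path": [64500, 64510, 64520]},
    {"next_hop": "10.0.0.2", "local_pref": 100, "as_path": [64530, 64520]},
    {"next_hop": "10.0.0.3", "local_pref": 90,  "as_path": [64520]},
]

def best_route(candidates):
    # First two steps of BGP best-path selection: prefer the highest
    # LOCAL_PREF, then the shortest AS path (later tie-breakers omitted).
    return max(candidates, key=lambda r: (r["local_pref"], -len(r["as_path"])))
```

Here the route via 10.0.0.3 has the shortest path but loses on local preference, and among the remaining two the shorter AS path wins, much as a navigator prefers a short road unless policy (a road closure) rules it out.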



Quote for the day:

"The best minute you spend is the one you invest in people." -- Ken Blanchard

September 05, 2015

Finding a Single Version of Truth Within Big Data

“The ‘best’ data depends on its source and purpose,” Jonas writes. “While a company may have employee data in different systems, like IT, HR, Finance etc., the employee name and address maintained by the payroll system is probably the best one to use for tax filing.” That doesn’t mean Jonas thinks organizations should not try to reconcile data plurality. But instead of the traditional “merge-purge” technique that involves massive batch jobs that compare new data against the old data, Jonas thinks we are better off using an “entity resolution system.” “Entity resolution systems generally retain every record and attribute, each with its associated attribution,” he writes. “Because entity resolution systems have no data survivorship processing, there is no chance future relevant data will be prematurely discarded.”
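A minimal sketch of the entity-resolution idea Jonas describes, retaining every record and attribute with its attribution; the records and the email-based matching rule are invented for illustration:

```python
from collections import defaultdict

# Toy records; each keeps its source system, and nothing is merged away.
records = [
    {"source": "HR",      "name": "Ann Lee", "email": "ann@corp.com", "addr": "12 Oak St"},
    {"source": "Payroll", "name": "Ann Lee", "email": "ann@corp.com", "addr": "12 Oak Street"},
    {"source": "IT",      "name": "A. Lee",  "email": "ann@corp.com"},
]

# Hypothetical resolver: records sharing an email are one entity.
entities = defaultdict(list)
for rec in records:
    entities[rec["email"]].append(rec)

# Every record and attribute survives with attribution, so a consumer
# can pick the value that is "best" for its purpose -- e.g. Payroll's
# address for tax filing -- without discarding the others.
payroll_addr = next(r["addr"] for r in entities["ann@corp.com"]
                    if r["source"] == "Payroll")
```

Contrast with merge-purge, which would pick one surviving address at load time and throw the rest away; here the choice is deferred to query time, per purpose.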


Agile Introduction: Are You a Laggard?

While much has been written about the strengths and weaknesses of the technology, little data has been published to show how widely agile methods are used. This paper corrects that by providing data from our databases for public consumption. ... Some of these organizations are offshoots of the 120 firms and government organizations from which we have received data. Figure 2 summarizes which agile methodologies are in use by these organizations. As many said that they were using a hybrid approach, i.e., one that combined agile with traditional concepts, we have included their response and categorized them as either hybrid or hybrid/lean.


Data Visualizations: 11 Ways To Bring Analytics To Life

"The ability to slice and dice has been around for a while. It's more exploratory now. You have a lot of data sources, so finding a needle in a haystack boils down to being interactive," said George Ramonov, founder and CTO at meeting planner provider Qurious.io, in an interview. "Now that we have cross-functional teams, it's important to be able to share visualizations embedded in a website or app to allow sharing without all the extra time of putting together an email." Regardless of how large or small an audience is, good data visualizations speed understanding. Bad visualizations cloud the issues. Here are six ways to best leverage the graphical presentation of data.


Building a Cyber-Resilient Business

Unfortunately, blocking four out of five attacks still leaves open the possibility that a substantial number of attacks might succeed. And today, it’s more a matter of when rather than if you will, eventually, be successfully attacked. What happens then?  Even well prepared companies may not know immediately that they have been breached. But those that have prepared for such an event will be much better off than those that have not. Just as conducting fire drills can save lives in the event of a real fire, preparing for the aftermath of a cyber attack can make an enormous difference in how quickly your company gets back on its feet and how well officers and board members do in the limelight after a major breach becomes public.


How DevOps fits into the modern network

If Company A builds protocol Cat, and Company B builds protocol Dog, how do they get those protocols to talk? They can't! It's just like someone who speaks Japanese and someone else who speaks English would need a translator to communicate effectively. By embracing open standards, we can make pieces of network equipment talk to each other, servers, laptops, phones, etc. If we didn't promote open standards, we would be locked into solutions where everything is controlled by a single vendor from A to B. We have many customers at Cumulus Networks that run multi-vendor environments, and open standards are not just encouraged, they're crucial.


A Guide to Lean Healthcare Workflows

It describes each step in-depth and includes techniques, example worksheets, and materials that can be used during the overall analysis and implementation process. And it provides insights that are derived from the real-world experience of the authors. This paper is intended to serve as a guide for readers during a process-improvement project and is not necessarily intended to be read end-to-end in one sitting. It is written primarily for clinical practitioners to use as a step-by-step guide to lean out clinical workflows without having to rely on complex statistical hypothesis-testing tools. This guide can also be used by clinical or nonclinical practitioners in non-patient-centered workflows. The steps are based on a universal Lean language that uses industry-standard terms and techniques and, therefore, can be applied to almost any process.


How Can Healthcare Big Data Analytics Bust Data Silos?

Machine data, meanwhile, is a record of actions that have already occurred, such as call logs or EHR access time stamps. This data is more or less static, and while it may recount the activities of users, it is automatically created by IT systems without much human intervention. When subjected to more sophisticated analysis, the millions of data points in machine data can help a healthcare organization identify a possible breach or chart how long a clinician takes to see her patients, and even aid understanding of how patients flow into the emergency room or how often a nurse updates vital signs. Correctly and efficiently analyzing these types of big data is key for clinical and business intelligence activities, and can help healthcare organizations understand how their IT infrastructure can enable workflow improvements.


How to Manage Cloud Resources Wisely

The cloud isn’t perfect. There are still outages, challenges around replicating pieces of an environment, and even confusion around all the different kinds of services that cloud can provide today. Fortunately, the entire cloud model is becoming a bit easier to understand and deploy. Why? There are simply more use cases for such a powerful architecture. Businesses of all sizes are quickly realizing that their direct competitive advantage may very well revolve around the capabilities of the cloud. However, with that in mind, what should organizations of various sizes do about physical resource requirements? What about infrastructure expansion? Most of all, what are the limitations of your cloud?


How COSO Destroyed Risk Management

COSO’s failure is due primarily to its narrow focus on internal controls as a risk management tool. Internal controls should have been considered one leg of a four-pronged approach to a comprehensive risk management framework. Fundamentally, internal controls should be considered one of the foundational components of enterprise risk management. What is missing in COSO and broadly across risk management are the other tools needed to execute ERM. Risk management must include mechanisms to measure and quantify real risks. The rise of quantitative analysts is the recognition that risk management is measurable and not simply assessed through the qualitative assessments advocated in COSO.


How Different Team Topologies Influence DevOps Culture

It has become increasingly clear to me over the past few years working with many different organisations that the idea of a single, identifiable 'DevOps culture' is misplaced. What we've seen in organisations with effective or emerging DevOps practices is a variety of cultures to support DevOps. Of course, some aspects of these different cultures are essential: blameless incident post-mortems; team autonomy; respect for other people; and the desire and opportunity to improve continuously are all key components of a healthy DevOps culture. However, in some organisations certain teams collaborate much more than other teams, and the type and purpose of communication can be different to that in other organisations.



Quote for the day:

"Keep your fears to yourself, but share your courage with others." -- Robert Louis Stevenson

September 04, 2015

A degree in data science is in demand

The work of a data scientist is really two-fold. First, the data scientist must pull together all this data, which is often just a collection of garbled text or numbers, and clean it up to the point where it can be analyzed. Then, the data scientist has to know how to extract meaningful information from the cleaned-up data. “Big data represents one of the fastest growing areas of business, estimated to become a 17-trillion-dollar industry by 2020," wrote Becker College when it introduced its new data science program earlier this year.  Locally, Worcester Polytechnic Institute and Becker offer data science programs; both are convinced that data science is already a desired career path for their students.
WPI's data science program is entering its second year; it currently offers a two-year, graduate-level degree in data science, and this fall, is adding a doctorate-level degree.
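The "clean it up" step described above often amounts to coercing messy fields into typed values before any analysis can happen. A toy sketch (the raw values are invented):

```python
# A messy numeric field as it might arrive in a raw export:
raw = ["1,204", " 87 ", "n/a", "3.5e2", "", "42"]

def to_float(s):
    """Coerce a messy string to a float; flag unparseable values as None."""
    s = s.strip().replace(",", "")
    try:
        return float(s)
    except ValueError:
        return None  # better to flag than to guess

clean = [v for v in map(to_float, raw) if v is not None]
print(clean)  # [1204.0, 87.0, 350.0, 42.0]
```

Real pipelines face the same decisions at scale: which variants to normalize, and what to do with values that cannot be parsed.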


US Army’s Cyber War Strategy is Not Just for Military Use

Taking threat sensor data, removing noise and analyzing the data will provide decision makers with the ability to forecast, gain up-to-date battle damage assessments (BDA) and supply geolocation information of the enemy and the electronic signatures our own forces generate. Convergence is going to be achieved by consolidating the Army's cyber forces operating across multiple departments into single cross-operational units, removing impediments to information sharing. By fiscal year 2017, the U.S. Army Cyber Command (ARCYBER) will have 41 operationally capable Cyber Mission Forces. They will combine cybersecurity, electronic warfare and signal doctrine into single units. The units will use past lessons learned to develop new doctrines in cyber security.


How Edge Data Center Providers are Changing the Internet’s Geography

Ultimately, location is the main way for companies like EdgeConneX to differentiate from the big colo players like Equinix or Interxion. Edge data center providers are essentially building in tier-two markets what Equinix and its rivals have built in the big core markets: hubs where all the players in the long chain of delivering content or services to customers interconnect and exchange traffic. These hubs are where most of the internet has lived and grown for the bulk of its existence, and edge data center companies are building smaller hubs in places that don’t already have them but are becoming increasingly bandwidth-hungry.


What Do Marketers Really Want in Data and Technology?

You may have heard of Data-as-a-Service (DaaS). Companies are touting DaaS as the next big thing and as a solution that gives marketers an “unfair competitive advantage.” By linking data with technology, DaaS is completely changing the game through a new model of fast-moving and real-time data acquisition. As the name implies, Data-as-a-Service begins with the data. Specifically, a company’s internal data, third party data, real-time fast data, and unique and hard-to-find data (HTFD) sourced from the Big Data ecosystem. With technology, this data is structured to create insight into a company's best customers and ideal prospects. Real-time knowledge is also used to learn about who is actively in market for products and services, who is searching for competitors, or who is posting to social media for product recommendations.


Leveraging COBIT to Implement Information Security (Part 3)

In the context discussed here, it is envisaged that controls within the system are selected by management on a risk-assessed basis to address the perceived threats to the security of the organisation’s core business processes. Once selected, the ISMS is the basis for collecting evidence for operation and reviewing the efficacy of the implementation on an ongoing basis as part of the security forum. The forum is created by senior management, typically the chief executive officer (CEO), as a collaborative round table where managers from IT security, IT, human resources (HR) and major business functions can come together to make decisions on the basis of regular reporting from the system.


Disruptive tech and its impact on wireless protocols and networks

Internet of Things is not a new concept. It's been around for a long time. We used to call it telemetry or sensor-based computing. But the idea that we can do it today at a very low cost and that we can automate so many applications -- medical applications, security, energy management, all kinds of things like that -- means that there's going to be more and more happening on the network over time. And many of those applications will be mobile. (Not everything in IoT is mobile, but a lot of it will be.) So planning for that in terms of capacity, [security and cost is] made more complex. So, even though mobility opens up a lot of opportunities, it does come with a set of costs that we didn't have before.


Indoor positioning – Are we nearly there yet?

If the object you are locating and tracking happens to have a device with some unique identifier attached to it, like a tag or smart phone, things become significantly easier. Now you can have many fixed transmitters sending out pulses, getting received by the device that can then send out a “reply” rather than the reflected pulse that can also contain its unique identifier. The transmitters can be simple and omnidirectional, but then you need a few of them (remember each one defines a circle; in the plane, i.e., in 2D, at least 3 transmitters are needed to determine a unique position) – the determination of a location from measuring distances to a few fixed points is known as Trilateration (check out Multilateration while you’re at it).
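The trilateration step itself reduces to a small linear system: subtracting the three circle equations pairwise cancels the quadratic terms. A sketch under the 2-D, non-collinear-transmitter assumptions stated above:

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """2-D position from distances to three non-collinear fixed points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise cancels the quadratic
    # terms and leaves a 2x2 linear system in (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero when the three transmitters are collinear
    if abs(det) < 1e-12:
        raise ValueError("transmitters must not be collinear")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Device actually at (3, 4); three transmitters at known positions.
pos = trilaterate((0, 0), 5.0,
                  (10, 0), math.hypot(7, 4),
                  (0, 10), math.hypot(3, 6))
```

In practice the measured distances are noisy, so real systems use more than three transmitters and a least-squares fit rather than this exact solve.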


Don’t Let Cyberattacks Take A ‘Byte’ Out Of Your Bottom Line

Should a data breach occur, having an incident response plan in place can help ease the pressure in the heat of the moment. Affected systems should immediately be closed off from the remainder of the company’s infrastructure in order to pinpoint the root cause. When a data breach does occur, use it as a learning experience, extracting as much information as possible about how and why the incident occurred. That information can then be used to strengthen IT infrastructure by plugging holes and establishing improved monitoring programs to detect threats. Reaction plans should be tested and updated regularly to ensure any future threat responses are as effective and efficient as possible.



Why Optimization and WANOP for Your Cloud Is Now Easier than Ever

We’re now pushing down rich content, a variety of applications, and a lot of new use cases. The reality here is that cloud will continue to grow as more users and verticals adopt this very versatile platform. In fact, global spending on IaaS is expected to reach almost $16.5 billion in 2015, an increase of 32.8 percent from 2014, with a compound annual growth rate (CAGR) from 2014 to 2019 forecast at 29.1 percent, according to Gartner. The report goes on to state that over time, as a business becomes more comfortable with the use of IaaS, organizations, especially in the midmarket, will eventually migrate away from running their own data centers in favor of relying primarily on infrastructure in the cloud.


Resiliency Testing Best Practices - Report

Every organization must put a plan in place for recoverability after an outage, but testing your enterprise resilience without full business and IT validation is ineffective. Read the white paper to learn how to put a plan in place for full functional validation, and get details on the importance of validating resiliency in a live environment; learn why small-scale recovery “simulations” are inadequate and misleading; understand why validating resilience demands involvement from IT and the business; and get details on the checks and balances you need to maintain and validate business resilience.



Quote for the day:

"Let a man lose everything else in the world but his enthusiasm and he will come through again to success." -- H. W. Arnold

September 03, 2015

MySecureShell Documentation

MySecureShell is a solution which has been made to bring more features to the sftp/scp protocol provided by OpenSSH. By default, OpenSSH gives connected users a great deal of liberty, which implies trusting those users. The goal of MySecureShell is to offer the power and security of OpenSSH, with enhanced features (like ACLs) to restrict connected users. MySecureShell was created because of the lack of file transfer features in OpenSSH. OpenSSH was not designed as a file transfer solution; that’s why we made MySecureShell. MySecureShell is not a patch for OpenSSH, it’s a shell for users.


How big data is unfair

An immediate observation is that a learning algorithm is designed to pick up statistical patterns in training data. If the training data reflect existing social biases against a minority, the algorithm is likely to incorporate these biases. This can lead to less advantageous decisions for members of these minority groups. Some might object that the classifier couldn’t possibly be biased if nothing in the feature space speaks of the protected attribute, e.g., race. This argument is invalid. After all, the whole appeal of machine learning is that we can infer absent attributes from those that are present. Race and gender, for example, are typically redundantly encoded in any sufficiently rich feature space whether they are explicitly present or not.
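The redundant-encoding point is easy to demonstrate on synthetic data: even when the protected attribute is removed from the features, a correlated proxy recovers it well above chance. All numbers below are invented for illustration:

```python
import random
from collections import Counter

random.seed(0)

# Synthetic population. 'group' is the protected attribute; the
# "model" never sees it as a feature, only a correlated zip code.
def make_person():
    group = random.choice(["A", "B"])
    # Residence correlates with group: 80% of each group in "its" zip.
    if group == "A":
        zip_code = 1 if random.random() < 0.8 else 2
    else:
        zip_code = 2 if random.random() < 0.8 else 1
    return group, zip_code

people = [make_person() for _ in range(10_000)]

# Trivial "classifier": predict the majority group per zip code.
counts = {}
for group, z in people:
    counts.setdefault(z, Counter())[group] += 1
predict = {z: c.most_common(1)[0][0] for z, c in counts.items()}

# The proxy alone recovers the protected attribute ~80% of the time.
acc = sum(predict[z] == g for g, z in people) / len(people)
```

Dropping the sensitive column therefore does not, by itself, make a model blind to it; richer feature spaces only make the reconstruction easier.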


Your Smartphone Can Tell If You’re Bored

While using machine learning to infer your state of mind is tricky, doing so reliably via your smartphone could be powerful. For instance, if an app were able to predict that you’re bored, and also knew where you were, it could try to feed you content it thinks you’d like in that particular context. Already at least one startup is trying to do something similar to this: Triggerhood, which built software that lets apps collect data about how the phone is being used, determines when is the best time to send you a notification (see “Smarter Smartphone Alerts Come in When You Want Them”).


Learning to Trust in the Cloud

After the prominent security breaches in retail and the public sector over the last year, it’s clear that a strong security posture is a requirement, not an option, as no one wants to be the next headline. Reviews of these breaches show that they were the result of internal policy or system failures, not the result of any weakness of a cloud service.  ... Security is a shared responsibility with your cloud provider, and companies should consider implementing tools such as next-generation and application firewalls, intrusion detection and prevention, anti-virus software, encryption, identity and access management, visibility, log and big data analytics. This can help ensure internal security standards are as high as those set by cloud providers.


TGIF(P) – Thank god it’s fried phish

Spoiler: It commonly means “Thank god it’s Friday” and probably many working people will be able to appreciate such a feeling. On the other hand, while many offices may close down for the weekend, it’s the time for bad guys to boost their activity because they count on the fact that they may go unnoticed for some time, at least until the upcoming Monday morning. The IT community is working hard to find and take down malicious sites as soon as possible, but then … the weekend is the weekend for many. What happened just last Friday may be a good example of such malicious weekend activity. We received the following email to one of our inboxes:


The Problem with Corporate Innovation

Corporate innovation faces challenges that entrepreneurs can’t fathom. Entrepreneurs often wish they had the people and resources that larger organizations do, without realizing that all those people and resources are already spoken for. Larger organizations lack the freedom and agility that smaller organizations have. Larger organizations are very slow to recognize and respond to major seismic shifts, so comfortable are they in their day-to-day operating models. Industry conventions become first defensive barriers and then comfortable blankets, reassuring large organizations that they understand what the customer needs and what the industry will do. Corporate executives face a really difficult challenge: on one hand they must meet the quarterly numbers, or heads will roll.


Data Center Consolidation: a Manager’s Checklist

The reality in today’s very competitive data center and cloud market is the one who can run most optimally and cost-effectively while still delivering prime services is a leader in the market. To accomplish this goal, there are a few things to consider. First of all, getting ahead doesn’t always mean adding more gear. Smart data center and cloud providers learn to use what they have and make the absolute most out of every resource. There are new kinds of questions being asked when it comes to new data center efficiency concepts. Is there a new technology coming out that improves density? Does the ROI help improve long-term management costs? Does a new kind of platform allow me to achieve more while requiring less?


Beth Israel Launches Big Data Effort To Improve ICU Care

The way clinical care is documented can vary greatly; for example hypertension, high blood pressure and elevated blood pressure are three different terms that describe the same condition. “There has been a lot of data cleanup that needed to be done, and in the process, we’ve learned a lot about structured data, and quality of data,” says Folcarelli. She says it took at least a year to normalize the data and determine the data points that would work well in the model. Statisticians and analysts worked with clinicians and nurses during this process. The hospital’s IT team uses scripts that extract data from the transactional systems—the HIS, the clinical ICU systems, the HR systems—on a regular basis. The extracts are sent to the hospital’s clinical data warehouse, which is built with Microsoft SQL Server technologies.
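The normalization step described (collapsing variant documentation onto one concept) can be sketched as a synonym map; the map and terms below are illustrative, not Beth Israel's actual pipeline:

```python
# Hypothetical synonym map: documentation variants -> canonical concept.
CANONICAL = {
    "hypertension": "hypertension",
    "high blood pressure": "hypertension",
    "elevated blood pressure": "hypertension",
}

def normalize(term):
    """Map a documented term to its canonical form; pass unknowns through."""
    return CANONICAL.get(term.strip().lower(), term)

terms = ["Hypertension", "HIGH BLOOD PRESSURE", "elevated blood pressure"]
canonical = {normalize(t) for t in terms}  # all three collapse to one concept
```

Production systems typically map to standard vocabularies (e.g., SNOMED CT or ICD codes) rather than a hand-built dictionary, but the year of cleanup described in the article is largely about building and validating exactly this kind of mapping.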


Your Next Car Could Reveal More About You Than Your Facebook Profile

About 90 percent of new vehicles in western Europe will be able to send and receive data by 2020, compared with roughly one-third next year, Hitachi Ltd. estimates. Once hooked into the web, the car’s driving data could be coupled with information as detailed as a driver’s contact list, favorite routes to work and even financial information from mobile-payment systems. As cars get closer to driving themselves, their cameras and sensors will collect data about what happens in and around the vehicle and what passengers are doing. That prospect has created disputes about what data can be collected and who needs to agree to it. Rules in this area could hamper automakers from fully tapping their newfound gold mine.


Blythe Masters Tells Banks the Blockchain Changes Everything

In a matter of months, this word, blockchain, has gone viral on trading floors and in the executive suites of banks and brokerages on both sides of the Atlantic. You can’t attend a finance conference these days without hearing it mentioned on a panel or at a reception or even in the loo. ... Now, everyone’s trying to figure out whether the blockchain is just so much hype or if Masters’s firm and other startups are really going to change the systems that process trillions of dollars in securities trades. When investors buy and sell syndicated loans or derivatives or move money around the world, they must cope with opaque and clunky back-office processes that rely on negotiated contracts between buyers and sellers, lots of phone calls, lots of lawyers, and even the occasional fax. It still takes almost 20 days, on average, to settle syndicated loan trades.



Quote for the day:

"Everybody wants to do something to help, but nobody wants to be the first." -- Pearl Bailey

September 02, 2015

5 IT experts reveal their Windows 10 upgrade strategies

There are support costs, management issues, security problems and a host of other deployment snafus that can crop up.  Yet, the new OS is a major step forward. Microsoft resolved many of the troubling usability issues that plagued Windows 8, such as a confusing “tile” interface and hard-to-find settings. Many features – including a more streamlined update process that won’t interfere as much with daily work – are designed for the enterprise. It’s even easier to do “in place” upgrades.  To help put the finishing touches on your upgrade strategy, CIO.com talked to several experts (including those at Microsoft) about how to make a deployment as smooth as possible. We asked about general guidelines, security issues, usability, training and other considerations for enterprise users. Here’s what we found out.


Data virtualization tools move into strategic IT realm

There [are some] use cases for data virtualization [instead of traditional data integration]. One is [if] it's a new source of data. You may need at some time later on to integrate the data but you want to get to the data now to analyze and look at it, see how useful it is, and you haven't gotten to the point where you can invest in getting it integrated. That's one use case scenario: the precursor of integrating it. There are plenty of other use cases where you never integrate the data with your source of data; you may not own the data. There's social media data, there's Web data, there's data that you might be exchanging between prospects, suppliers, partners and so on, that you may never own or have the ability or desire to integrate with your data.


Of Black Hat and security awareness

Black Hat is a combination of in-depth, mostly hands-on training and briefings that tend to be presentations on various security topics, typically with a focus on security weaknesses. I am interested in briefings in which the presenters demonstrate a successful hack or compromise of something very interesting or familiar. This year’s quintessential Black Hat presentation demonstrated the ability to remotely control connected-car functions. It’s the sort of thing that really sets Black Hat apart. Of course, Black Hat also has the obligatory expo floor, and I enjoyed the opportunity to obtain demos from technology vendors that I currently use or am considering. It’s much easier to ask pressing questions in a venue like this than to schedule individual meetings and then sit through a bunch of marketing slides before getting to the real substance.


Why Startups Should Leverage Compliance

Though this particular measure focuses on payments, the same dynamic can be seen at play in other innovative sectors. During the last several months, Uber has been battling regulators both here in the United States and in many countries abroad, often because of agitation by taxi unions. What these incidents highlight is the unsurprising fact that if you want to eat the established players’ lunches, you probably have to take their pills too. Despite the many upstarts who decry the stifling effects of regulation, governments have signaled repeatedly that they have no intention of backing down. The proper response from the technology industry is not to bemoan the state of affairs, but to recognize the opportunity to leverage compliance against their competitors.


Lone Rangers of the Underground

The underground market for malware tools, vulnerabilities, exploit kits and every other criminal niche is fully mature. The barriers to entry have fallen away over the years: established criminal toolkits are available at little to no cost; former high-value malware such as ZeuS has become almost an open source project, spawning a variety of improvements and imitators; and basic tools such as keyloggers and system lockers are being combined to devastating effect. Take for example the Hawkeye attacks that affected small businesses on a global scale, from China through India and Europe all the way across to the United States. A simple $35 keylogger, Hawkeye, was used in sophisticated “change of supplier” fraud by two lone Nigerian criminals.


Bank-in-a-box: An innovative, easily deployable solution

The bank-in-a-box is an integrated solution set that supports the transformation of core banking operations using a service provider or third party developed interface. It is scalable and cost-effective, and includes internet and mobile banking, deposit and loan products, payment solutions, ATM and POS switching, regulatory and MIS reporting. The software can easily be used by non-IT specialists to develop new products. The suite acts as a complete technology solution spanning across multiple delivery channels, between front- and back-office, including reconciliation and settlement. Typically, the hosted core banking platform (based on the SaaS model) is provided by the application service provider. This could take the form of cloud-based hosting or on-premise hosting services.


Why you need to convert IT consumers into investment partners

Several years ago, Joe Spagnoletti, CIO of Campbell Soup Company, brought an investment management approach to IT spend. Today, he and his business partners look at four characteristics when making IT investment decisions: business outcome, operating performance, cost to serve, and risk. "We’ve educated our business leaders about how to think of an IT investment more broadly," he says. "We show them how their current portfolio is performing so they think, 'In a silo, this one investment looks good, but how does it look as a part of a collection?'" Stephen Gold, EVP of business and technology operations, and CIO, CVS Health, employs the "CIO theory of reciprocity." "Let's say the head of sales of a given company suggests, 'If I had a real-time inventory management system, I could increase revenue by $500M,'" says Gold.


Metadata-Driven Design: Building Web APIs for Dynamic Mobile Apps

For the sake of brevity, it can be summarized as an approach to software design and implementation where metadata can constitute and integrate both phases of development. ... While building these apps on iOS and Android, I took note of the additional time that was inherent to their development on a native level, especially when compared to normal desktop applications. Besides the unquantifiable test of an app’s user interactivity, a significant amount of time was required to organize the application’s flow of navigation when using a more complex framework (like Cocoa). Of course, there was also the time needed to submit the app for approval and then the subsequent effort to modify and/or tailor any aspects considered undesirable by the app store’s vendor and/or the app’s users.


Why Israel dominates in cyber security

“Connecting the talent pool coming out of defense organizations with the strong entrepreneurial spirit that exists here, and you get the perfect ingredient for a powerhouse, in terms of cyber security startups and technology companies,” says Mimran. And that connection has been making strides in digital security for decades. For instance, in 1993, Tel Aviv-based Check Point developed FireWall-1, one of the very first protection solutions for Internet-connected computers. The defensive software was developed by Israeli entrepreneur Gil Shwed, who served in the IDF’s Unit 8200—which is responsible for collecting signal intelligence—and grew the company into one of the country’s biggest tech giants. Check Point foresaw a need for protecting computer networks, and more importantly, filled that need before most people were even online.


Barclays Hacks Its Own Systems to Find Holes Before Criminals Do

Staying ahead of the bad guys requires resources, expertise and vigilance, and even that isn’t always enough. “They improve the ways to get in all the time,” said Oerting, 58. “The reality is that there are actually more cases than you read in the press.” Barclays is boosting spending by about 20 percent as part of its new cyber-defense strategy, Oerting said, declining to elaborate. Cyber risk is viewed as a key concern by almost a third of banks in the U.K., a survey by the Bank of England found in July. Two years ago, only 1 percent of those surveyed considered cyber attack a major risk. HSBC Holdings Plc, Lloyds Banking Group Plc and Royal Bank of Scotland Group Plc declined to discuss their efforts to fight computer crime.



Quote for the day:

"In order to succeed, your desire for success should be greater than your fear of failure." -- Bill Cosby

September 01, 2015

How Semantic Graph Techniques Ease Data Integration

Semantic Graph Databases are most valuable for complex metadata applications where the number of classes (i.e. types of objects) change daily, properties within classes change on-the-fly, and it is critical to have self-descriptions of data. Grounded in formal logic, semantic analytics can easily encompass associative and contextual concepts for richer data analysis, which provide a more expansive, exploratory querying experience. As noted in David S. Frankel’s article, “How Semantics Can Take Graph Databases to New Levels,” querying a database using formal semantics provides the ability to “infer logical consequences from a set of asserted facts or axioms … Reasoners grounded in formal semantics can be potent tools when managing large graph databases.”
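The "infer logical consequences from asserted facts" idea can be illustrated with one tiny rule. Real semantic stores use formal RDFS/OWL reasoners over millions of triples; this sketch only shows the principle, applying transitivity of a class hierarchy to hypothetical triples:

```python
# Minimal sketch of semantic inference: deriving facts that were never
# asserted by applying a transitivity rule to (subject, predicate,
# object) triples. The triples and the single rule are illustrative;
# production reasoners implement full RDFS/OWL entailment.

asserted = {
    ("Sedan", "subClassOf", "Car"),
    ("Car", "subClassOf", "Vehicle"),
}

def infer(triples):
    """Compute the transitive closure of subClassOf."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, _, b) in list(facts):
            for (c, _, d) in list(facts):
                if b == c and (a, "subClassOf", d) not in facts:
                    facts.add((a, "subClassOf", d))
                    changed = True
    return facts

# ("Sedan", "subClassOf", "Vehicle") was never asserted, only inferred.
print(("Sedan", "subClassOf", "Vehicle") in infer(asserted))  # → True
```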


Intel says GPU malware is no reason to panic, yet

While it's true that there is a shortage of tools to analyze code running inside GPUs from a malware forensics perspective, endpoint security products don't need such capabilities because they can detect the other indicators left by such attacks on the system. On one hand, moving malicious code inside the GPU and removing it from the host system makes it harder for security products to detect attacks. But on the other, the detection surface is not completely eliminated and there are trace elements of malicious activity that can be identified, the researchers said. Some of the defenses built by Microsoft against kernel-level rootkits, such as Patch Guard, driver signing enforcement, Early Launch Anti-Malware (ELAM) and Secure Boot, can also help prevent the installation of GPU threats.


Breaking the SQL Barrier: Google BigQuery User-Defined Functions

BigQuery UDFs are similar to map functions in MapReduce. They take one row of input and produce zero or more rows of output, potentially with a different schema. ... BigQuery UDFs are functions with two formal parameters. The first parameter is a variable to which each input row will be bound. The second parameter is an “emitter” function. Each time the emitter is invoked with a JavaScript object, that object will be returned as a row to the query. ... JavaScript UDFs are executed on instances of Google V8 running on Google servers. Your code runs close to your data in order to minimize added latency. You don’t have to worry about provisioning hardware or managing pipelines to deal with data import / export.
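BigQuery UDFs themselves are JavaScript, but the row-in, emit-out contract described above can be mimicked in a few lines of Python. The UDF below is hypothetical; the point is the shape: each input row is bound to the first parameter, and the function may call the emitter zero or more times, possibly with a different schema:

```python
# Python sketch of the BigQuery UDF contract (the real UDFs are
# JavaScript running on V8): one row in, zero or more rows out via an
# emitter callback. The split_tags UDF and its schema are hypothetical.

def split_tags(row, emit):
    """Fan one row out into one output row per non-empty tag."""
    for tag in row["tags"].split(","):
        if tag.strip():
            emit({"id": row["id"], "tag": tag.strip()})

def run_udf(udf, rows):
    """Apply a map-style UDF to every row, collecting emitted rows."""
    out = []
    for row in rows:
        udf(row, out.append)  # out.append plays the emitter role
    return out

rows = [{"id": 1, "tags": "gpu, malware"}, {"id": 2, "tags": ""}]
print(run_udf(split_tags, rows))
# → [{'id': 1, 'tag': 'gpu'}, {'id': 1, 'tag': 'malware'}]
```

Note that row 2 produces no output at all, which is exactly the "zero or more rows" freedom that distinguishes this from a plain scalar function.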


Are you a data hoarder? Hadoop offers little choice

There's a bit of absurdity here. If you throw it away, you can't get it back; if you keep it, you can eventually organize and purge what you don't need. Those who store data now while getting their governance in place are not automatically "data hoarders." This is a false dilemma. The idea that you need to come up with a perfect plan before keeping any data or bringing in any new sources is a little like saying we need perfect social justice for everyone before we can address police killings of African-Americans. Instead, get started now. Stop throwing out the baby with the bathwater and begin finding your use cases. Meanwhile, make data the point rather than a side effect of your processes and govern it accordingly. These aren't "steps," but initiatives you need to undertake, usually in parallel.


New Smartphone Attempts to Finally Solve the Storage Problem

The startup is trying to take better advantage of the increasing ubiquity of wireless networks that most of us are already using. Apps, photos, videos, and music can pile up and take up available space on your phone, and Nextbit thinks the solution is to use the Internet to unobtrusively back up and remotely store some of that stuff. By default, the phone does this when it’s plugged in and connected to Wi-Fi, though users can change this. Robin is slated to be generally available online in January or February and will include 32 gigabytes of storage on the phone and another 68 online. It will cost $399, and Nextbit has already raised $18 million in venture funding from Accel Partners for the phone’s development. In an effort to publicize its brand with consumers and drum up early sales ...


Revamping Master Data Management with Graphs

One of the more interesting aspects about utilizing graph databases with MDM is the role that Natural Language Processing (NLP) can play in the query process. Interestingly enough, the visual querying framework that Semantic graphs facilitate was described by Aasman as “even simpler than natural language”, especially because the former method does not involve code. Still, there are ways in which NLP can assist with the querying process for MDM systems augmented by graph databases. The most salient of these are when NLP is involved with certain definitions and descriptions of terms that are referred to with multiple spellings, nicknames, and perhaps even slang. One of the most cogent examples of this fact is found in a use case in which Franz partnered with Montefiore Medical Center to create a healthcare platform with instantaneous querying capabilities of vastly heterogeneous sources.


Six simple cybersecurity rules for all ages

Nowadays parents are getting more and more concerned about what you do on the Internet. They know that there are lots of creepy weirdos and malicious viruses on the Internet; they fear for your naivety, innocence and the potential of severe cyberbullying. Of course, sometimes they go overboard but you still need to deal with it. Do you have a smothering mother or father who wants to know what’s going on in your life both online and off? Sorry, but it’s just the way things are. If you want more freedom behave like any normal adult would do: show your parents that you can make deliberate decisions. You’ll benefit from it as well. Keeping your gaming and social accounts secured is a tangible bonus, isn’t it? As we’ve already written, cybercriminals would readily take over your Facebook page, infect your smartphone with a virus, or steal your gaming account.


New DOD cyber security regulation: is the cure worse than the disease?

In summary, this “interim rule” imposes on DOD contractors and subcontractors a contractual duty to provide “adequate security” from “unauthorized access and disclosure” for a broad array of unclassified information, including controlled technical information, export controlled information, critical information, and other information requiring protection by law, regulation or policy (protections for classified information continue to be provided under the National Industrial Security Program Operating Manual (NISPOM)). The interim rule also requires DOD contractors and subcontractors to report directly to the appropriate DOD office a “cyber incident” or “malicious software.”


Latency, Bandwidth, Disaster Recovery: Selecting the Right Data Center

In selecting the right type of data center colocation, administrators must thoroughly plan out their deployment and strategies. This means involving more than just facilities teams in the planning stages. The process to select a good data center has to involve not only the physical elements of the facility but the workload to be delivered as well. ... With the increase of traffic moving through the internet, there is a greater demand for more bandwidth and less latency. As discussed earlier, it’s important to have your data reside closer to your users as well as the applications or workloads which are being accessed. Where data demands may not have fluctuated much in the past, they are much different today.


How PMOs can balance time, cost and quality

Triple constraint – the balancing act that occurs between cost, quality and time – is a term often heard in the world of project management, but what does that mean when it comes to the success or failure of a project to meet organizational objectives? Project managers are tasked with ensuring that they successfully manage the scope of a project to keep it within the cost, quality and time parameters determined by organizations at the outset. So how do project managers balance these three factors? This can be a daunting task, considering there are various internal or external factors that can rapidly change, causing any one or more of the three constraints to shift in an undesirable way. In order to decrease this risk, there are some questions you need to address in the beginning stages. Here are six important ones that could have a significant impact on project scope.



Quote for the day:

"Continuous improvement is better than delayed perfection." -- Mark Twain

August 31, 2015

The Evolution of Cloud Connectivity

Today’s enterprise is a federation of companies with vast collections of dynamic services that are enabled/disabled frequently with ever-changing sets of authentication and access control. To survive in this environment, a modern enterprise needs to develop an intimate yet secure ecosystem of partners, suppliers and customers. So unlike the rudimentary connectivity case, the typical production application is composed of many dozens and perhaps hundreds of services, some internal to an enterprise and some residing in a collection of external cloud infrastructures or data centers. For example, the incredibly successful Amazon ecommerce website performs 100-150 internal service calls just to get data to build a personalized web experience.


Q&A on Scrum for Managers

Scrum teams deliver a working and tested result every Sprint. So they don't just deliver the end result after a year, but a small extra step every month or less. But remember, they only deliver results that are truly finished! This expands your ability to steer so greatly that using a process of control loses much of its importance. Therefore, this can, for the most part, be done by the Scrum teams themselves. At the same time, it's essential that teams continuously improve. This is the critically important role that the manager plays: Helping the team improve by removing obstacles for them. You help create an environment that the Scrum team can work in. This means you should hold yourself back from intervening too much.


Effective Navigation in Pair Programming

A trap many well-meaning but less experienced navigators fall into fairly often is to offer up advice as soon as that happens. Good navigators know when to wait a little bit before pointing out a missing semicolon somewhere, and will do it when there’s a natural pause in the driving. A very large number of interruptions rising from unfamiliarity of the driver might be a good indication that it’s time to swap roles, even if for a very short amount of time. For all the more interesting and more abstract issues, though, an experienced navigator is good at communicating intent – the what, not the how, and uses inclusive language (“us” and “we”, rather than “I” or “you”) as much as possible while at it, so the driver is invited to revisit some of the motivations behind intents they might not necessarily agree on.


Don’t Dive In: Knowing the Costs of the Cloud

Again, the majority of organisations cannot say for sure that adopting cloud services will result in savings because they don't know how much those services cost to run in-house. At best, the organisation might know the total cost of IT infrastructure, software and skills, and be able to roughly split that between the services provided. The decision to move to the cloud will therefore be based on this estimate; yet this ignores the fact both that the real costs are far more complex and that, by moving services to the cloud, you are not removing costs, but changing them. For instance, an organisation with its CRM service presently hosted in a data centre may decide to move CRM to the cloud.


Reaching App Nirvana with IoT

While there isn’t much difference in how you develop the application itself, supporting IoT devices does require that software engineers become proficient with device-level application programming interfaces (APIs). IoT integration is all about APIs, the logical connectors that allow applications to communicate with each manufacturer’s IoT devices. APIs expose data that enables those devices to transmit data to your applications, acting as a data interface. Or, they can allow your application to control the device and serve as a function interface. While device manufacturers are taking steps to ensure that their APIs are well defined, developers must learn how to use IoT device interfaces effectively. Fortunately, third-party providers are also producing tools that make using each IoT device manufacturer’s APIs easier for developers.
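The two API roles distinguished above — a data interface through which the device transmits readings, and a function interface through which the application controls the device — can be sketched with a hypothetical device class. Real IoT APIs differ per manufacturer; this is only the conceptual split:

```python
# Sketch of the two IoT API roles described above, using a
# hypothetical thermostat. Manufacturer APIs vary widely; the class,
# method names, and units here are illustrative assumptions.

class Thermostat:
    def __init__(self):
        self.target_c = 20.0
        self.readings = []

    # Data interface: the device transmits data to the application.
    def report(self, temperature_c: float) -> None:
        self.readings.append(temperature_c)

    # Function interface: the application controls the device.
    def set_target(self, temperature_c: float) -> None:
        self.target_c = temperature_c

device = Thermostat()
device.report(21.5)      # data flowing from device to application
device.set_target(18.0)  # application issuing a control command
```

Keeping the two roles distinct matters in practice: read-only data interfaces can often be exposed broadly, while function interfaces need much stricter authentication.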


Platform business model enters online food ordering fray

Ongoing innovation and continuous feature updates are hallmarks of the platform business model. In The Cookie Dining case, the platform is expanding on a number of fronts. A feedback manager, which will let customers rate their food and delivery experience, is scheduled for release by the end of September. Integration with Yelp, which posts customer reviews of restaurants and other businesses, is also slated for September. Cookie is also at work on a point-of-sale (POS) system for in-store sales, Manojlovic said. Cookie POS v1.0 should be available for beta testing in December, he noted, adding that the idea is to unify "the whole sales experience for the restaurant."


Is the Internet of Things creating new software vendors?

The issue is that many makers of "things" still apply a traditional "box" mentality to products and do not consider the extra revenue opportunities of licensing-controlled embedded software and applications. Most of these companies are first-time software providers, mainly device manufacturers and OEMs that can now monetise their software as well as the devices via the IoT. For these companies, the IoT represents a significant market opportunity. “By monetising the software on their devices, these vendors will be able to increase and drive recurring revenue streams, creating billions of dollars of additional value,” Wurster adds. ... For the foreseeable future, Wurster believes the IoT will drive business transformation for many device manufacturers, enabling them to use software on the device to differentiate product and solution offerings.


The Algebra of Data Says “Hello World”

The point about data algebra is that it genuinely represents data in a software compatible manner – any data. There is a back story to why this algebra was created. It was not a small effort, and it was years in gestation. In fact, Algebraix Data Corporation, founded by software engineers who believed a mathematical approach to data was possible, spent over six years creating, enriching and proving data algebra’s applicability. This was an extensive research activity that primarily involved using data algebra directly in a variety of data management activities: defining data, organizing data, querying data and optimizing the queries for performance. This was the focus of the research partly because it was decided that the best area to prove data algebra was in using it to manipulate and transform data in applications that did little else: database optimizers for data in both tables and graphs.


To Avoid Poor Data Quality, Start with a Business Question

When the first system of record’s data meets the organization’s quality standards for that data type, the organization should build a real-time data quality firewall around it. With a data quality firewall, no matter where the data is coming from (online customers, a merchant, etc.), the firewall intercepts the data, cleanses it, and only then allows the data to enter the system of record. ... Profiling is both a technical challenge and management challenge. Questions will remain: How much more money should we dedicate to cleansing data? When is it clean enough? What return on investment do we need to make this particular cleansing process worthwhile? Again, these are strategic questions for the organization to evaluate as they weigh the importance of data sets.
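The "data quality firewall" pattern described above — intercept every inbound record, cleanse it, and only then admit it to the system of record — can be sketched in a few lines. The field names and cleansing rules here are illustrative, not a prescription:

```python
# Minimal sketch of a data quality firewall: no record reaches the
# system of record without passing through cleansing first. The fields
# and rules are hypothetical.

system_of_record = []

def cleanse(record: dict) -> dict:
    """Normalize fields; reject records missing a customer id."""
    if not record.get("customer_id", "").strip():
        raise ValueError("rejected: no customer_id")
    return {
        "customer_id": record["customer_id"].strip().upper(),
        "email": record.get("email", "").strip().lower(),
    }

def firewall_ingest(record: dict) -> None:
    # Only cleansed data is allowed to enter the system of record.
    system_of_record.append(cleanse(record))

firewall_ingest({"customer_id": " c42 ", "email": "Ann@Example.COM "})
print(system_of_record)
# → [{'customer_id': 'C42', 'email': 'ann@example.com'}]
```

The strategic questions in the excerpt map directly onto this sketch: how strict `cleanse` should be, and what rejection costs, are business decisions rather than technical ones.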


FDIC on Why Banks Need a Disaster Plan for Cyber Threats

"We have always expected business continuity and disaster recovery considerations to be incorporated in an institution's business model," the report states. "However, in addition to preparing for natural disasters and other physical threats, continuity now also means preserving access to customer data and the integrity and security of that data in the face of cyber-attacks." That's why FDIC says it "encourages banks to practice responses to cyber-risk as part of their regular disaster-planning and business-continuity exercises." The FDIC suggests that community bank directors use the cyber challenge program to openly discuss operational risks with their peers and employees and review the potential impact of cyber-attacks and other technology disruptions on their customers and operations.



Quote for the day:

"Reduce the layers of management. They put distance between the top of an organization and the customers." -- Donald Rumsfeld

August 30, 2015

Will stock market instability incite IT strain?

"Even the deals that do come will be smaller," wrote Ray Hennessey, editorial director of Entrepreneur.com. "Private-company valuations generally follow public-company ones. … If tech companies on the Nasdaq suffer a Black Monday, it will be a Grey Tuesday for private companies seeking venture money." Another factor that may impact businesses in the wake of this week's stock market instability is that in times of market uncertainty, people tend to cut back their spending, according to Hennessey. The connection between those factors and IT budgets? If your company is in the midst of raising funds and has to pay more to borrow money, it ends up in a price war with competitors; if your customers start curbing spending, cuts to company spending could be made, and it might be your 2016 IT budget that's on the list.


Mirantis CEO Says OpenStack Is Getting Real

Increasingly, OpenStack is brought in for “onboarding a first software initiative or a particular business unit,” he said. “We see fewer and fewer people doing just experiments.” That’s not to say OpenStack has taken the world by storm. “Big rollouts require some serious spine from executives,” Ionel said, noting that OpenStack implementation is far from “frictionless.” The complexity of the framework is why Intel spearheaded the $100 million funding that Mirantis announced earlier this week — a follow-up to the other $100 million round announced last year. Intel wants to make OpenStack easier for the everyday enterprise to adopt, and it plans to collaborate with Mirantis on the necessary engineering.


Programming and prejudice: Computer scientists discover how to find bias in algorithms

Many companies have been using algorithms in software programs to help filter out job applicants in the hiring process, typically because it can be overwhelming to sort through the applications manually if many apply for the same job. A program can do that instead by scanning resumes and searching for keywords or numbers (such as school grade point averages) and then assigning an overall score to the applicant. These programs also can learn as they analyze more data. Known as machine-learning algorithms, they can change and adapt like humans so they can better predict outcomes. Amazon uses similar algorithms so they can learn the buying habits of customers or more accurately target ads, and Netflix uses them so they can learn the movie tastes of users when recommending new viewing choices.
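One common way bias in such a screening algorithm is surfaced is to compare the selection rates it produces for different groups; the "four-fifths rule" from US employment guidelines is one widely used threshold. The data and threshold below are illustrative, and this is only one of several bias measures researchers use:

```python
# Hypothetical sketch of a disparate-impact check on an algorithmic
# screen: flag the algorithm if one group's selection rate falls below
# 80% of another's (the "four-fifths rule"). Data is made up.

def selection_rate(decisions):
    """Fraction of candidates the algorithm advanced (1) vs filtered (0)."""
    return sum(decisions) / len(decisions)

def disparate_impact(group_a, group_b, threshold=0.8):
    """True if the lower selection rate is below `threshold`
    times the higher one."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    return min(ra, rb) < threshold * max(ra, rb)

# 1 = advanced by the screening algorithm, 0 = filtered out
group_a = [1, 1, 1, 0, 1, 1, 0, 1]  # 75% advance
group_b = [1, 0, 0, 0, 1, 0, 0, 1]  # 37.5% advance
print(disparate_impact(group_a, group_b))  # → True: possible bias
```

Because machine-learning screens adapt as they ingest more data, a check like this has to be rerun continuously, not just at deployment time.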


Big Data and the Future of Business

The point of Big Data is that we can do novel things. One of the most promising ways the data is being put to use is in an area called “machine learning.” It is a branch of artificial intelligence, which is a branch of computer science—but with a healthy dose of math. The idea, simply, is to throw a lot of data at a computer and have it identify patterns that humans wouldn’t see, or make decisions based on probabilities at a scale that humans can do well but machines couldn’t until now, or perhaps someday at a scale that humans can never attain. It’s basically a way of getting a computer to do things not by explicitly teaching it what to do, but having the machine figure things out for itself based on massive quantities of information.


How to create a physical space for innovation

Every company embracing innovation does so in its own way. Johnson & Johnson, for instance, maintains several “innovation hubs” around the world, while Eli Lilly has endowed its own venture capital fund to fuel innovation efforts. The single quality these and other companies share is that they have created physical spaces in which to nurture new ideas. If innovation is the application of unorthodox thinking to business opportunities, the innovation lab is where that thinking evolves into new products, services, process efficiencies, partnerships, or business models. The hallmark of the innovation lab is that it is a space set apart—sometimes even isolated—from the rest of the company.


Stephen Hawking's answer to a 40-year-old paradox about black holes

Two fertile decades of debate followed, giving rise along the way to entirely new conceptions of how the universe is built (black holes, it seems, are pretty fundamental components of it). As a new branch of physics called string theory found its feet, it turned out to be good at explaining the rules of order and disorder within the event horizon. And a consensus emerged that while his "Hawking radiation" story of evaporating black holes was correct, Dr Hawking's supposition about the loss of information was not. By 2004, he was forced to concede a bet on the outcome (the winner was to receive an encyclopedia, "from which information can be retrieved at will"). Information was saved. But how? It is that question that has preoccupied theorists, not least Dr Hawking himself, since then.


Biometrics: The password you cannot change

Turner suggested that biometrics should only be used as an authentication for local devices, which he said makes Apple's Touch ID unique and the "perfect way" of using biometrics. He said when a person's fingerprints are checked by the cryptographic chip on the Apple device, the information becomes linked to a person's Apple ID, but that information stays only on that particular device. According to Turner, this means if a person loses their Apple device, no one else can use the saved credentials from a different device. Turner made this observation in a discussion paper titled Consumerisation of biometrics will result in obsolescence, highlighting that most biometric deployments will "not be well executed, and the failures of these systems will impact the feasibility of biometrics as a means of authentication".


'Experiment, incubate, pivot': Making the most of disruptive technology

"The risk of being disrupted has never been higher and the time it takes for disruption to happen is shorter than ever," says Cox. "Organisations, therefore, need to be proactively disrupting themselves; challenging their business models, developing technology-enabled enhancements and alternatives to their products and services." ... In fact, such is the power of disruption that Richard Norris, head of IT and business change at Reliance Mutual Insurance Society Limited, says all CIOs must help their businesses to identify opportunities for innovation-led change. Norris implemented a digital innovation group at Reliance about three months ago. Drawing people from across the business, the learning group analyses how digital disruption can affect how services are taken to market



9 NoSQL Pioneers Who Modernized Data Management

Instead of precision with defined schemas, NoSQL pioneers sought an ability to handle information at high volume and high speed. Instead of getting one transaction exactly right, they wanted to deal with a million users at once. NoSQL offered the sort of approach that a Twitter or Facebook might appreciate. And, in fact, those organizations quickly became big NoSQL users. Avinash Lakshman at Facebook was a pioneer involved in the formation of two NoSQL systems: Dynamo, during a prior stint at Amazon, and Cassandra at Facebook. For companies with robust public-facing Internet operations – such as social media, financial institutions, and retailers – customer service is a primary business driver for deploying NoSQL systems.


3 takeaways for CIOs from Facebook CIO Tim Campos

“At Facebook, culture is everything and it’s an incredible timesaver,” Campos said. Culture allows Facebook to cut through bureaucracy, he said. Among the ways Facebook emphasizes its culture is through its now well-known posters that say things like: "Fail harder;" "Move fast and break things;" and, "What would you do if you weren’t afraid?" Facebook also reinforces its culture through storytelling, like the “will you resign” email example he shared with the audience. “It was an incredibly powerful message,” Campos explained. “Everybody at the company read this email and had the exact same takeaway and perspective that I did; they all thought it was directly addressed to them.”


Quote for the day:

"Successful people are interdependent, not independent" -- Richard Weylman

August 29, 2015

Automate the Boring But Essential Parts of Your Data Warehouse

Until recently, DWA was associated mostly with automating ETL development – such as generating SSIS packages in the Microsoft environment. Today, however, it covers all the major components of data warehousing from design, development and testing to deployment, operations and change management. It also covers advanced functionality like support for slowly changing dimensions and change data capture. In our experience, DWA delivers up to 80% improvements in the cost-effectiveness of building and running a data warehouse. And, just as important, DWA is far better aligned with modern agile development practices because it encourages a rapid, iterative approach to design.
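The "advanced functionality" mentioned above, such as Type 2 slowly changing dimensions, is exactly the kind of repetitive logic DWA tools generate. As a minimal sketch of what that generated logic does (table layout, field names, and in-memory representation are all hypothetical, not taken from the excerpt): when a tracked attribute changes, the current dimension row is expired and a new dated version is appended, preserving history.

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply a Type 2 slowly-changing-dimension update.

    dimension: list of row dicts with 'key', 'attrs', 'valid_from',
               and 'valid_to' (None marks the current version).
    incoming:  dict mapping business key -> latest attribute dict.
    """
    today = today or date.today()
    # Index the current (unexpired) version of each business key.
    current = {row["key"]: row for row in dimension if row["valid_to"] is None}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is not None and row["attrs"] == attrs:
            continue  # unchanged: keep the existing current version
        if row is not None:
            row["valid_to"] = today  # expire the superseded version
        # Append a new current version, effective from today.
        dimension.append({"key": key, "attrs": attrs,
                          "valid_from": today, "valid_to": None})
    return dimension
```

A DWA tool would emit the equivalent merge logic as SQL or an SSIS package rather than application code, which is why automating it pays off across many dimensions at once.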


Women give boards real depth

A recent meta-analytic study investigated the relationship between women on boards and performance, finding that women can make more of a difference in some countries. In countries with better shareholder protection, female board representation is positively related to profitability; in such contexts, greater gender diversity on boards ensures that the directors bring different knowledge, experience and values. In countries with greater gender parity, female board representation is positively linked to market performance. This relationship between female directorships and market performance is negative in countries with low gender parity.


7 real NASA technologies in sci-fi movie The Martian

The movie, which will be released Oct. 2, merges science fiction with actual science about Mars, technology that NASA is working on and the space agency's plans to send astronauts to the Red Planet in the 2030s. Jim Adams, NASA's deputy chief technologist, who has read the book, said he was impressed with the way the author represented the science and means of survival on Mars. "It stimulated a lot of my thinking about what we are doing and our plans on getting to Mars in the 2030s with humans." NASA scientists say they are already developing many of the technologies that appear in the film. Here's a look at some of them.


The Impacts Of Big Data That You May Not Have Heard Of

Historically, data was used as an ancillary to core business and was gathered for specific purposes. Retailers recorded sales for accounting. Manufacturers recorded raw materials for quality management. The number of mouse clicks on advertising banners was collected for calculating advertisement revenue. But as the demand for Big Data analytics emerged, data no longer serves only its initial purpose. Companies able to access huge amounts of data possess a valuable asset that, when combined with the ability to analyze it, has created a whole new industry. ITA Software is a private company that gathers flight price data from almost all major carriers, with the exception of JetBlue and Southwest, and sells that information to travel agents and websites.


What exactly is social engineering?

The problem is… that email wasn’t from your bank, and the link did not take you to your banking page. It took you to a fake website mimicking the real website’s look and feel, and you just gave the fraudsters the login details for your online banking. You did it because it looked real and you were scared that someone was going to take your money – but instead you walked straight into a trap. Sometimes the emails come with a phone number to call that leads you to an interactive voice system, just like your bank’s. You are asked to enter your bank account number and your sort code, and to divulge digits of your access code – little realising that you are giving this information straight to the criminals.


How Savvy Businesses Tackle Change Management

We spend hours looking at data about companies that are successful when they try to do major software implementation. The first thing is: it’s not for the faint of heart. There are so many things to consider. One of the interesting things we’re finding in the HR space is more and more human capital and HR departments actually running the software implementations. It used to be IT always bought the software. But now that Software as a Service (SaaS) provides different price points, we are seeing HR directors or professionals who are much more involved in running new HR software initiatives. Not only do we have more HR people, but we also have people who may not have ever managed an implementation before.


Why virtual reality could finally mend its broken promise

VR hasn't been completely dormant. "The folks who are just entering the field and are excited by the Oculus and the related technology product development are mistaken in thinking that what they're doing is new," said Linda Jacobson. Jacobson is the author of Garage Virtual Reality, a 1994 book outlining the past, present, and future of VR. She was one of the founding contributing editors of Wired Magazine, and a former virtual reality evangelist for Silicon Graphics. "What's new is this particular set of products at a new price point, as well as the availability of new people and new talent who are looking at it," she said. During those seemingly quiet years for VR, car manufacturers started using it to design cars and test user experience.


Agile Goal Setting with OKR - Objectives and Key Results

OKR (Objectives and Key Results) is a goal setting framework created by Intel and adopted by several Silicon Valley companies. Google is the most famous case, having adopted OKR in its first year. Twitter, LinkedIn, Dropbox and Oracle are among other adopters. ... The main objective of OKR is to create alignment in the organization. In order to do so, transparency is key. OKRs are public to all company levels — everyone has access to everyone else's OKRs. All OKRs, including the CEO's, are usually available on the Intranet. OKRs exist to set clear priorities and to focus the organization. In order to do that, you should have few OKRs.


IT unions: The wrong approach to achieving a noble goal

This may sound like a "let the markets decide" argument against unionization, but rather than the markets, it's up to the individual IT worker to ensure his or her interests are well represented and accounted for. Neither an uncaring marketplace nor a collective-oriented union can fully represent one's individual, rational self-interest. While there may be a line of people waiting to take my position, I maintain control over my skills and qualifications, and I will happily say "no" to an unreasonable demand as long as my skills and capabilities are appropriate for my job, and my performance is more attractive than that of the nearest competitor.


Agile, TOGAF and Enterprise Architecture: Will They Blend?

Enterprise architecture provides an Agile project with a vision in the form of principles and models. Agile provides Enterprise Architecture with a good set of principles, showing that a multidisciplinary way of working is key. Also, we can learn from the success of Agile and Scrum. If you look at them as architectures, they can even help improve the enterprise architecture profession. Organizations do need to ask themselves whether all architects they currently have will remain relevant. Some of what architects currently do (this holds especially true for solution architects) is now the responsibility of Agile teams. So what is the impact of this from a training and consulting perspective? The first thing is that both enterprise architecture and Agile remain relevant and people and organizations will require training and consulting in both.



Quote for the day:

"It's not about how smart you are--it's about capturing minds." -- Richie Norton

August 28, 2015

Is a flat organization the key to IT agility?

The power of collective intelligence is that you get to these optimal solutions fast. When we first started holding these two-day sessions, the most common comment on the evaluations was, 'I cannot believe how much work we did in so short a period of time.' That's the function of having the network in the room. Nothing is as powerful as getting the whole system in the room because, as issues come up, you can say, how will this affect you? Even if the representatives are not the leaders of the group, it doesn't matter. As long as the voice is there, it seemed to work. By having them there, we could say, 'We can't stop until these four people are all comfortable with what we're going to do because all four people are impacted.' In hierarchies, you don't realize who is impacted until sometimes you're halfway through the project.


Why Big Data Alone Is An Inadequate Source Of Customer Intelligence

One reason that companies are unable to benefit fully from their investments in big data is that “management practices haven’t caught up with their technology platforms,” according to Ross and Quaadgras. For example, companies that have installed digital platforms, such as enterprise resource planning (ERP) systems and customer relationship management (CRM) systems over the past 10 to 15 years, haven’t yet taken full advantage of the information they make available. A cultural change is needed within companies so that “all decision makers have performance data at their fingertips every day,” Ross and Quaadgras write.


Culture impacts strategy and corporate governance

One key aspect of creating a conducive culture within an organisation is to be overseen by a board of directors that comes from a diverse background. By introducing multiple perspectives into the mix, dangers like ‘group think’, where one kind of personality or way of looking at the world comes to rule the corporate culture, can be avoided. As a result, bringing in diversity, for instance wider female participation (with two thirds of companies actively seeking to introduce more women to the board), cultural diversity and other forms of diversity like social background, is growing in importance in boardrooms. In terms of female diversity, Eastern Europe comes out on top with nearly a quarter of executives being women, followed by Latin America.


Simon Wardley's 100-day Corporate get fit plan

Understanding context is key to applying these ideas but such situational awareness is a rarity in corporates. The lack of this causes visible symptoms such as poor communication, misapplication of doctrine (e.g. agile everywhere or six sigma everywhere), massive cost overruns in contracts, silos, duplication, constant reinventing of the wheel and a long list of other undesirable effects. I did want to write a post on the 61 different forms of strategic play and how to manipulate an economic environment but given the responses I've received from the Wardley mapping post, it seems something more basic is required.


IT must map its way to visibility

One action that I advocate for IT leaders is to create the technology maps that their enterprises will need to negotiate today’s marketplace. Modern executives should never be surprised by technology. They might be disappointed by technology. Frequently they should be ashamed at their ham-handed, small-minded attitudes toward the adoption and deployment of technology. Some should be flogged publicly for their bordering-on-malfeasance inability to make money with the technology cornucopia that defines modern existence. But they should never be surprised by technology. Technology futures are knowable. Technology futures and possible technology opportunities need to be mapped.


Taming today's cyberthreat landscape: A CIO checklist

Could this be true? What about the other 19%? Feeling a bit skeptical about what I was reading, I checked the research methodology, in particular, the demographics of the respondents: 814 IT security decision makers and practitioners, all from organizations with more than 500 employees. The respondents represented seven countries in North America and Europe and 19 industries. Seems pretty comprehensive. Another study performed earlier this year by Accenture titled Business Resilience in the Face of Cyber Risk, reported that: 66% of executives experience significant attacks on their IT systems on a daily or weekly basis; yet only 9% of executives run ongoing security penetration or continuity of business/disaster recovery tests on their systems.


The Looming Problem That Could Kill Bitcoin

Andresen’s gloomy prediction stems from the fact that Bitcoin can’t process more than seven transactions a second. That’s a tiny volume compared to the tens of thousands per second that payment systems like Visa can handle—and a limit he expects to start crippling Bitcoin early in 2016. It stems from the maximum size of the “blocks” that are added to the digital ledger of Bitcoin transactions, the blockchain, by people dubbed miners who run software that confirms Bitcoin transactions and creates new Bitcoin. Andresen’s proposed solution triggered an uproar among people who use or work with Bitcoin when he introduced it two weeks ago, rather than continuing to work with the developers who maintain Bitcoin’s code ...
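The seven-transactions-per-second ceiling follows directly from the protocol's 1 MB block-size cap and roughly ten-minute block interval. Assuming an average transaction size of about 250 bytes (a commonly cited ballpark, not a figure from the excerpt), the arithmetic works out as:

```python
BLOCK_SIZE_BYTES = 1000000      # protocol cap per block (1 MB)
AVG_TX_BYTES = 250              # assumed average transaction size (ballpark)
BLOCK_INTERVAL_SECONDS = 600    # one block roughly every ten minutes

txs_per_block = BLOCK_SIZE_BYTES / AVG_TX_BYTES          # ~4000 transactions
txs_per_second = txs_per_block / BLOCK_INTERVAL_SECONDS  # ~6.7 tx/s

print(round(txs_per_second, 1))
```

With smaller average transactions the ceiling rises a little, but it stays within the single digits, which is why raising the block-size cap was the lever Andresen proposed.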


Artificial Intelligence, Legal Responsibility And Civil Rights

What happens if an AI machine commits a crime? Who is responsible for the actions taken? This may sound like science fiction, but it has already happened. A Swiss art group created an automated shopping robot with the purpose of committing random Darknet purchases. The robot managed to purchase several items, including a Hungarian passport and some Ecstasy pills, before it was “arrested” by Swiss police. The aftermath resulted in no charges against the robot nor the artists behind the robot. How should an AI machine be regulated when it is acting on its own, outside the control of humans? There have already been several regulatory problems identified for controlling and regulating artificial intelligence.


6 SMB Data Security Myths and Misconceptions

Today, cybercrime costs companies more than $300 billion worldwide, and nearly all of it is due to someone trying to steal credit cards, identity information, trade secrets, etc. Today’s hackers are all grown up and take the form of transnational organized crime rings, terrorist cells, hacking co-ops and groups and even nation-states and foreign intelligence services. According to Marc Goodman in Future Crimes, “The defender must build a perfect wall to keep out all intruders, while the offense need find only one chink in the armor through which to attack.” Make no mistake, these people are serious, they’re in it for the money, they’re organized and well-funded, they’re highly skilled, and they will find you.


Cyber security culture is a collective effort

The socialization of cyber threats among all levels of a company’s workforce reinforces the concept that cyber security is a shared endeavor. For example, social engineering and spearphishing e-mails that target one class of worker may not target another; yet it is imperative that everyone be cognizant of what they entail, how suspicious e-mails can be checked, and what should be done if they are received. This instills the knowledge that each employee has a vested interest in safeguarding the organization by ensuring its sensitive information and accesses are preserved and maintained. It’s imperative that accountability and responsibility not be projected as burdens that punish employees or risk impeding business operations for the sake of compliance.



Quote for the day:

“Ultimately, the only thing that matters is what we do for other people.” -- Daniel Vasella