September 10, 2015

Microsoft turns to Dell to push the Surface Pro into enterprise

"In some respects, it gives Microsoft a missing piece of what they needed to move more Surface Pro devices," countered Moorhead, sticking to the interpretation of a distribution deal. "There's a lot more to it than just having a really good product when you're dealing with a Fortune 500 company." Neither Microsoft nor Dell said whether the deal is an exclusive, but Moorhead believes it is, at least a time-constrained one, pointing out that Microsoft posted a video clip of Dell CEO Michael Dell promoting the partnership. "Innovation isn't just about great devices. It's about partnerships that bring together products, software and services to deliver extraordinary customer value," said Dell.


Cybercriminal Gang Extorts Businesses Via DDoS Attacks

"Your site is going under attack unless you pay 25 Bitcoin," one email stated. "Please note that it will not be easy to mitigate our attack, because our current UDP flood power is 400-500 Gbps, so don’t even bother." The email goes on to inform the target that a low-level DDoS attack was being launched against it to demonstrate the seriousness of the threat. The attackers promise never to threaten the victim again if the ransom is paid up: "We do bad things, but we keep our word." Subsequent emails warn the victim against ignoring the ransom demand. "And you are ignoring us. Probably because you don’t want to pay extortionists. And you believe that after sometime we will give up. But we never give up," the follow-up messages read.


6 Incredible Ways Big Data Is Used by the US Government

Big Data use in Government certainly presents big challenges – officials and politicians have a fine line to tread if they do not want to come across as attempting to implement a real-life version of Orwell’s Big Brother. It is certainly terrifying to think of the uses a modern-day Hitler or Stalin could find for the data and technology we have available today. After all, if the US Government can use it then so can any ruling administration – many of which are subject to even less regulation, and their citizens less free to scrutinize and hold them to account. However, with the right balances in place – such as robust regulation and protection of “whistle-blowers” – I believe it can be used for great positive social change, as demonstrated by the projects I’ve mentioned in this article.


How Big Data May Bring Some Sanity to the Holiday Shopping Rush

“The technology requirements are much greater than consumer travel,” Bob Mylod, former head of worldwide strategy and planning at Priceline, said in an interview last year. “In some ways, we’re talking about a different industry, but the transactional dynamic is the same.” Mylod is the managing partner of Annox Capital, which is an investor in Freightos. While the upstarts are leading the innovation race, industry giants aren't ignoring the trend. Deutsche Post has invested millions upgrading its freight-forwarding business, though a planned rollout of SAP software was shelved because of a negative impact on earnings. Flexport's chief executive officer, Ryan Petersen, says the company is using money from a recent $22 million funding round to boost head count.


Data Security: Hunkering Down at NYU Langone

Hospitals are merging, and depending on the size of the merger, they aren’t integrating technologies so much as creating interfaces between them. It is hard to standardize across the board, and it is very difficult to implement. It takes a long time. I compare it to where I came from, NewYork-Presbyterian Hospital and University of Medicine and Dentistry of New Jersey. Every organization has pluses and minuses. IT is centralized at Langone, so that’s a plus. Other organizations are completely decentralized or just starting to centralize their environments. Centralization makes it easy for us to administer the systems we have in place. Leadership is also relatively new. At some hospitals, people have been there for 30 or 40 years, and it’s hard to get that culture change. But at Langone, there is a security focus and it helps us get stuff done.


New Teams, New Tools — Why Compliance Must Collaborate

The trick is to find tools that meet these varying needs — and that work together in lockstep, creating a single integrated and user-driven framework. Compliance and legal must be able to utilize these tools without relying on IT, and they must be capable of retrieving and analyzing multiple sets of data previously stored in various locations under varying terms. Having a single, secure platform is preferable since it allows data to be gathered, consolidated and integrated across servers, systems and users — and also enables greater collaboration both within and across functions. The new solution is one that embraces new processes and best practices and breaks down the silos between Compliance, Legal and IT, thus fostering greater collaboration between the functions.


Breaking Down Data Silos with Foreign Data Wrappers

The digital revolution is wreaking havoc on data management systems. The rapid growth of data has made it more difficult than ever for companies to store, manage and make sense of the information they collect. At the same time, as data becomes more varied, enterprises are harnessing not only massive amounts of structured data from a growing network of connected devices but also semi-structured and unstructured data. As a result, the need for solutions that can support multiple data types has turned the typical data center into a patchwork of data management technologies used to handle the volume, velocity and variety of big data. These include relational databases, standalone NoSQL solutions and specialized extensions to handle geographic data, to name a few.
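The foreign-data-wrapper idea – one query interface spanning heterogeneous stores – can be sketched in a few lines. The following is an illustrative Python analogue, not PostgreSQL's actual FDW API; the backends, table names and fields are all invented:

```python
# Sketch of the foreign-data-wrapper pattern: one query interface
# over heterogeneous backends. The backends here are hypothetical
# in-memory stand-ins for a relational table and a document store.

class ForeignTable:
    """Adapter giving uniform row-oriented access to any backend."""
    def rows(self):
        raise NotImplementedError

class RelationalBackend(ForeignTable):
    def __init__(self, tuples, columns):
        self.tuples, self.columns = tuples, columns
    def rows(self):
        # Zip each stored tuple with the column names into a row dict.
        for t in self.tuples:
            yield dict(zip(self.columns, t))

class DocumentBackend(ForeignTable):
    def __init__(self, documents):
        self.documents = documents
    def rows(self):
        # Documents are yielded as-is; ragged keys are the consumer's concern.
        yield from self.documents

def select(table, where):
    """A minimal backend-agnostic 'SELECT * WHERE'."""
    return [r for r in table.rows() if where(r)]

orders = RelationalBackend([(1, "widget"), (2, "gadget")], ["id", "item"])
events = DocumentBackend([{"id": 1, "kind": "click"}, {"id": 3, "kind": "view"}])

print(select(orders, lambda r: r["id"] == 1))      # [{'id': 1, 'item': 'widget'}]
print(select(events, lambda r: r["kind"] == "view"))  # [{'id': 3, 'kind': 'view'}]
```

In a real FDW the `select` step is pushed down to the remote store where possible; the sketch only shows the unifying interface.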


A day in the life of a social CIO

Chou takes social seriously, but he doesn't feel the need to post, or check in, every hour. "People always believe that people who are extremely engaged in social are on it 24/7," he says. "If I have a five-minute or 10-minute gap, I will take a look at what's going on, I'll check my notifications, but I'm not constantly on my phone checking the social stream." ... During the past year, Chou started blogging, and he tries to regularly share ideas on LinkedIn's publishing platform. He enjoys the feedback he gets on industry-specific topics and leadership, and thinks it helps him grow personally. "Part of the reason [I blog on LinkedIn] is it forces me to really get deep into a topic and try to research it and learn as much as I can so I can really write about it," Chou says. "It's something to force myself to dig a few levels deeper on a topic of interest."


The Difference Between Business Intelligence and Real Data Science

BI requires analysts to look backwards at the data – namely, historical data – so their analysis is retrospective. It demands that the data be absolutely accurate, since it is based on what actually occurred in the past. For example, the quarterly results of a company are generated from actual data reported for business done over the last three months. There is no scope for error, as the reporting is descriptive without being judgmental. Data science, by contrast, requires data scientists to make use of predictive and prescriptive analyses. They have to come up with reasonably accurate predictions about what is likely to happen in the future, using probabilities and confidence levels.


SDN 101: An Introduction to Software Defined Networking

Part of the confusion that surrounds SDN is that many vendors don’t buy in totally to the ONF definition of SDN. For example, while some vendors view OpenFlow as a foundational element of their SDN solutions, other vendors are taking a wait-and-see approach to OpenFlow. Another source of confusion is disagreement about what constitutes the infrastructure layer. To the ONF, the infrastructure layer is a broad range of physical and virtual switches and routers. As described below, one of the current approaches to implementing network virtualization relies on an architecture that looks similar to the one shown in Figure 1, but which only includes virtual switches and routers.
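The control-plane/data-plane split at the heart of the ONF model can be sketched minimally: a controller with a global view installs match-to-action rules southbound, and switches forward purely by flow-table lookup. The class names, topology, and addresses below are invented for illustration:

```python
# Minimal sketch of the SDN split: a controller computes forwarding
# decisions and installs match->action rules; switches only consult
# their flow tables. A table miss is punted back to the controller.

class Switch:
    def __init__(self, name):
        self.name, self.flow_table = name, {}
    def install(self, match, action):
        self.flow_table[match] = action        # southbound: controller -> switch
    def forward(self, dst):
        return self.flow_table.get(dst, "send-to-controller")  # table miss

class Controller:
    def __init__(self, topology):
        self.topology = topology               # global network view
    def program(self, switch):
        # Push one rule per known destination reachable from this switch.
        for dst, port in self.topology.get(switch.name, {}).items():
            switch.install(dst, f"out:{port}")

s1 = Switch("s1")
Controller({"s1": {"10.0.0.2": 1, "10.0.0.3": 2}}).program(s1)
print(s1.forward("10.0.0.2"))  # out:1
print(s1.forward("10.0.0.9"))  # send-to-controller
```

OpenFlow is one concrete protocol for the `install` step; the vendor disagreements the article describes are largely about whether that southbound interface must be OpenFlow at all.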



Quote for the day:

"Organizations are most vulnerable when they are at the peak of their success." -- R.T. Lenz

September 09, 2015

Cloudera Aims to Replace MapReduce With Spark as Default Hadoop Framework

Brandwein noted that there are at least 50 percent more active Spark projects than there are Hadoop projects. The One Platform Initiative would in effect formalize what is already rapidly becoming a de facto standard approach to building analytics applications on Hadoop. “We want to unify Apache Spark and Hadoop,” he said. “We already have over 200 customers running Apache Spark on Hadoop.” ... The long-term goal, said Brandwein, is to make it possible for Spark jobs to scale simultaneously across multi-tenant clusters with over 10,000 nodes, which will require significant improvements in Spark reliability, stability, and performance.
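The style Spark popularized – chained transformations held in memory between steps, rather than MapReduce's disk-bound stages – can be sketched without a cluster. This plain-Python word count mirrors the flatMap/map/reduceByKey shape; it illustrates the pattern only and uses no Spark API:

```python
# The transformation style Spark generalizes, sketched with plain
# Python iterators: the flatMap / map / reduceByKey shape of a word
# count, with intermediate results kept in memory between steps.
from collections import Counter
from itertools import chain

lines = ["spark on hadoop", "spark replaces mapreduce", "hadoop runs spark"]

# flatMap: split each line into words, flattening into one stream.
words = chain.from_iterable(line.split() for line in lines)

# map + reduceByKey: implicit (word, 1) pairs folded into counts.
counts = Counter(words)

print(counts.most_common(2))  # [('spark', 3), ('hadoop', 2)]
```

In real Spark each step would be a lazy RDD or DataFrame transformation distributed across executors; the chaining shape, however, is the same.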


2015 reality check: IT spending, confidence on an upswing

Unlike in years past, IT also has a better handle on organizational objectives, helping it focus on the initiatives most closely aligned to business needs as opposed to chasing new technologies for the sake of staying current. "We're seeing a better relationship between IT and the business -- they are engaging with IT on the front end and working as partners," notes Jason Hayman, research manager for TEKsystems, a provider of IT staffing solutions, IT talent management and IT services. "Because of that, there's no secret as to what the organizational priorities are. That gives IT comfort -- it's the devil you know." Read on for some statistics and insight on 2015 mid-year tech spending and IT hiring trends along with some perspective from IT leaders.


Wearables And Nanotech: The Future Of Healthcare

While applications and tools are enabling self-monitoring of health, more sophisticated devices and technologies are also capable of delivering the data generated to healthcare professionals, who can process it to predict and prevent bigger health concerns in the future. Wearable devices are playing a major role in transferring actionable data from patients to doctors and caregivers, even employers. As a recent example, Google X Lab has partnered with Novartis to design contact lenses that track glucose levels in the wearer’s tears and transfer that information to a mobile device that the doctor uses for monitoring. ... The way disruptive technologies are creating seismic shifts in the healthcare landscape, it’s not hard to predict that we are in the midst of a healthcare revolution that empowers us more than ever to manage our lives to perform better on the field, at work, or in our home.


TomTom Spark Hands On – Best Choice For Fitness Enthusiasts

Moving beyond the music feature, the TomTom Spark Cardio + Music is a solid fitness tracker. It has GPS capability for plotting your runs or cycles on a map and it accurately tracks your distance covered, steps taken, calories burned, minutes of activity and sleep. The watch picks up the intensity of your sessions too, thanks to the built-in heart rate sensor. Image-wise, the TomTom Spark Cardio + Music looks sporty – sporty to the extreme. The smartwatch is quite solid, with a monochrome display, and there’s a large square on the strap under the screen, which houses the GPS unit. It is deliberately placed there, as you tend to have that part of your wrist pointed skyward, and its size allows easy access and control even in the middle of a run – quick navigation, even with sweaty fingers.


Security vulnerability management more than patching, warns Secunia

“You cannot predict what products will be making your infrastructure vulnerable next month, based on what made it vulnerable this month,” said Kasper Lindgaard, Secunia director of research and security. “You should not assume patching the top 10 high-profile software names means you are all set and secure,” he said. According to Lindgaard, keeping track of what makes an IT environment vulnerable is an ongoing and complex task. “It requires a combination of vulnerability intelligence and visibility of applications, devices and business critical data in your systems,” he said.


Delivering Scalable, Maintainable Objects with Domain-Driven Design

In many ways a DTO used this way mimics what DDD calls an aggregate: a single object that encompasses several other objects and contains all the content for a single transaction. However, DDD also has some rules for creating aggregates that are designed to keep your applications simple, maintainable, responsive and scalable. As I discussed in an earlier column, these are the design goals for avoiding the CRAP cycle that leads to unmaintainable applications that have to be replaced rather than enhanced. For my example in this column, I'll use a SalesOrder object used in a company's Billing system to calculate a sales order's price. In this column, I'm going to start filling in the details of that SalesOrder object in a way that meets the "rules of DDD aggregates."


Kaspersky And FireEye Security Products Cracked By Researchers

Tavis Ormandy, a security researcher at Google, made public the fact he had cracked Kaspersky’s anti-virus product before revealing the details to the Russian company. Ormandy has been criticized within the cybersecurity industry for his practice of disclosing vulnerabilities publicly rather than informing the company first and giving them time to fix the flaw. ... Los Angeles-based researcher Hermansen claims he has discovered at least four flaws within FireEye’s core security product -- revealing details of one and offering the other three for sale to the highest bidder. Hermansen posted details of how to trigger the remote file disclosure vulnerability as well as details of a file that is used to keep track of every registered user that has access to a particular system.


Netflix thinks its customers are too dumb to download video

Amazon was the first major streaming-video provider to allow video downloads on iOS and Android devices, for Amazon Prime customers ($99 a year). Before that, there was really no affordable way to watch the movie or TV show of your choice while sitting in an airplane unless you went old school and purchased a DVD or digital download, or transferred saved content from a computer. "There's no doubt that the way people watch entertainment is changing — anytime, anywhere viewing is important," Michael Paull, vice president of digital video at Amazon, said this month in a press release announcing the service. Amazon Prime's full catalog is not available for download, but the selection is fairly large and likely to grow in the future.


Israel is number two in cybersecurity behind the U.S.

Israel is a nation where every citizen faces mandatory military service. Cybersecurity plays an increasingly important role in today's modern warfare. More and more Israeli military men and women are gaining experience with cybersecurity technology. This carries over to their post-military careers, and has led to a disproportionately high number of cybersecurity startups compared to other nations. ... In a recent VentureBeat article, Jerusalem Venture Partners (JVP) stated that the last couple of years have demonstrated that significant public companies are being created in the cybersecurity space in Israel, from Imperva to Varonis to the most successful Israeli IPO of 2014 – CyberArk.


EBags adopts mobile-first strategy with innovation lab to drive growth

EBags’ Innovation Lab will consist of a team of people, spanning various locations, working to bring the best mobile tools and ideas to its consumers. The retailer came to this decision after seeing 78 percent year-over-year growth through smartphone devices, and will now focus on developing practices for mobile first and then expanding to desktop from there. “The Innovation lab concept we have just launched is interesting because it is not a group of mad scientists in a physical lab,” said Peter Cobb, co-founder and executive vice president at eBags. “There is a team of people in India, Ukraine, Silicon Valley, and Denver all working together virtually on the latest innovative thinking and mobile-first strategies.”



Quote for the day:

"It is not fair to ask of others what you are not willing to do yourself." -- Eleanor Roosevelt

September 08, 2015

Clouds ahead: What an IT career will look like five years out

IT pros who don't take the time to lift their heads and assess the likely IT landscape five years out may be asking for career trouble. ... "The IT department isn't going away, and the role of the CIO isn't going to be marginalized. But as more workloads shift to the cloud, the construction of the IT department, by necessity, must change away from traditional roles to those more focused on vendor, business, security, and service management," Quin says. "This doesn't mean that development and administration jobs go away, just that there are fewer of them." The jobs that remain, Quin says, will focus on what he calls the "shim" layer that integrates different public cloud services with a few applications that must remain in-house. These could include highly sensitive corporate (or scientific) data or medical records and images, for example.


The myth of the cybersecurity skills shortage

In any case, security positions are not entry-level positions, and if you treat them as such, you will have terrible security. The best security practitioners have experience in the technology and processes that they are supposed to secure. If you are not an experienced developer, you do not have the standing to tell people how to secure the code they write. If you have no experience as a system administrator, you cannot maintain the security of a system. If you have no experience as a database administrator, you cannot secure a database. If you have no experience in designing a network, you cannot competently design a secure network.


Why We Should Continuously Break Everything

Continuous delivery teaches us that small, frequent changes are easier to manage, test, and fix than large, infrequent ones. In the words of Jez Humble, “…continuous delivery becomes even more important when you’re risk averse. Big-bang releases are horribly risky.” Continuous change may seem to cause continuous failure; counterintuitively, though, it actually reduces the overall cost of failure. Systems rot over time, even when they sit unchanged. Rot can arise due to human forgetfulness, or due to drift between a past decision and that decision’s appropriateness for current conditions. Exposing rot is no different than doing integration testing. The more you try to do at once, the more complex it is to understand and repair the problems you find.


A Cloud Foundry Story - Idea to Production in 90 Minutes

The immediate fix was to wipe the database, and this was easy as the application has a microservice architecture, with a specific service for administrative functions called scaler – admin processes being one of the 12 factors that make cloud-native applications successful. So, once the database was wiped of any concerning images (using the /clear REST endpoint) and had newly collected a few innocuous images, the next step was to shut down collection of new images, so that the demo would stay somewhat static, but remain available. Again, because of the microservices architecture, it was a simple matter of stopping the watcher service with one simple command (cf stop watcher). From there we were safe to show the demo again, and I had time to play with solutions to the problem.


Larry Wall's programmer 'vices' as IT virtues

Most folks see impatience as impetuousness, but Wall's definition is about having a sense of drive. Put another way: "Patience is also synonymous with inaction," Brian said. As businesses embrace agile practices where speed and continuous improvement are standards, impatience might just become an advantage for IT organizations. And why shouldn't it? The business doesn't want to wait weeks and months for systems to be built and questions to be answered, nor should it have to. "I talk to a lot of teams who struggle with agility and, oftentimes, I think it has to do with delivery teams who don't fully appreciate the urgency of the job and the fact that things have to get done right now -- in the best possible way, of course," he said.


How to Overcome Toughest IoT Challenges

“Customers are skeptical when a solution provider they think of as their ‘HP VAR’ or ‘Microsoft reseller’ claims to be able to implement IoT solutions,” DeSarbo said. "IoT is where the cloud was five years ago; there’s a lot of hype and a lot of skepticism.” VARs can overcome that skepticism by clearly documenting and communicating the value and ROI of every IoT solution they present to clients. “The absolute wrong thing to do is to say you’re into IoT without being able to back up the claim with actual implementations. There’s a lot of rolling of eyes now at IoT,” DeSarbo said. “Solution providers need to tell more stories about their applications that leveraged IoT. They have to tie those in with real business impact and economic value.”


Cyber security is now the biggest risk worrying Australian insurers

Concerns about technology were a key focus in Australia, with distribution channels, change management, and product development featured among the top seven ‘banana skins’. The pace of change is a source of anxiety for insurers concerned about whether existing business models can remain viable in the face of disruptive technology. “The industry is on the precipice of an enormous amount of change, largely being driven by digital innovation. The impact of wearable devices and connected cars will be significant – once consumers get a level of comfort around sharing data with insurers, the expectations of more personalised and customised products and premiums will quickly follow,” Fergusson said.


As the U.S. government faces cyberattack, 'there's no playbook' for fighting back

Robert Knake, former head of cybersecurity policy at the National Security Council, said those advocating for hacking back are overreacting. “It’s bad. But it’s not devastating,” said Knake of the confidential data exposed by the breach. “The reason it’s not devastating is that we know about it.” Speaking at a recent Atlantic Council panel debating the consequences of cyber revenge, Knake said identifying the breach offers the opportunity to mitigate the damage. Once armed with this knowledge, the government can use the hack to its advantage, he argued. For example, in the event that a nation uses information gleaned from the breach to identify Americans involved in sensitive activities, Knake said the U.S. could respond with misdirection by changing personnel.


How Many Types of KPIs Are There?

Performance measures reported in scorecards and dashboards are one of the core components of integrated enterprise and corporate performance management (EPM/CPM), rivaling in importance other improvement methods such as customer relationship management and managerial accounting. Regardless of the type of KPI, analytics (such as segmentation, correlation, regression, forecasting, and clustering) should ideally be embedded in each method, and they are critical for employees to achieve and exceed KPI targets. ... Brett describes different types of KPIs in an article titled “Five Distinct Views of Scorecards – and Their Implications.” With Mr. Knowles’ permission, here they are, abbreviated with my minor edits.


Sylvia Isler on Migrating to and Operating Microservices

Microservices are an architectural technique – a tool that we have in our bag of tricks as software engineers, not an end in itself. If you have some bottleneck, or if you have some set of algorithms that can be subdivided and provided as individual services to the consumers of those services, then it may make sense to decouple them from the monolith and deploy them as microservices. But if you have a monolithic architecture and have not thought through how its components work together and the interfaces between them, then going to microservices without doing your homework – thinking through the data flows across your architecture, the pain points, the potential bottlenecks and the performance issues – may be a mistake.



Quote for the day:

"Just because we have the intelligence to stop every intrusion doesn't mean we should." -- Matthew Wong

September 07, 2015

4 new cybercrime trends threaten your business

Hackers aren't sending attachments to everyone, though. The difference in this reincarnation of a tried-and-true tactic is that cybercriminals are targeting businesses, and sometimes masking as requests or files coming from within the company. They’re even sending them at a time when you'd expect to receive such a missive. "We see the highest point of entry on Tuesday at 10 a.m. local time, when everyone is really busy," Epstein says. Clay Calvert, director of cybersecurity for MetroStar Systems, says that hackers are often searching for the names of comptrollers or CFOs from company websites – typically available on "about us" pages – and then sending them emails pretending to be from a higher up in the company. They're the targets because they control the money.


Apple and Cisco partner to bolster iOS in the enterprise

"The corporate market is one in which the Apple brand still has a strong pull with employees. It also allows them to sustain a price premium that has become a little harder to sustain in the consumer market" because of competition from more inexpensive Android devices, he said, adding that strong employee demand for the iPad, in particular, is a market Apple wants to preserve. Cisco, which in turn benefits from its association with a popular name brand, can help do that. "It is a good partnership for both companies and helps Apple gain more credibility in the enterprise," agreed Gartner's Baker.


Who Will Own the Robots?

Those who are inventing the technologies can play an important role in easing the effects. “Our way of thinking as engineers has always been about automation,” says Hod Lipson, the AI researcher. “We wanted to get machines to do as much work as possible. We always wanted to increase productivity; to solve engineering problems in the factory and other job-related challenges is to make things more productive. It never occurred to us that isn’t a good thing.” Now, suggests Lipson, engineers need to rethink their objectives. “The solution is not to hold back on innovation, but we have a new problem to innovate around: how do you keep people engaged when AI can do most things better than most people? I don’t know what the solution is, but it’s a new kind of grand challenge for engineers.”


Is IT service continuity only for the rich?

Start on your IT continuity plan by creating an asset database of the enterprise's applications. For most organizations, continuity doesn't mean mirroring all the same applications with the same user experience as the primary infrastructure. Instead, the business needs to be able to continue with core processes until the main data center is back on line. A mission critical application running on a physical server must continue operating despite an outage, but it may not need to be replicated as a physical system. Running the app as a virtual machine allows IT to spin up the image rapidly when needed and provide a good-enough user experience as a stop-gap measure. A workload that is not deemed mission critical, for example a payroll or purchasing program, may be disregarded during outages.


Connectedness for the mainframe in the application economy: blessing or curse?

While this is simply one vector into a system, it’s possible to create a product (or put it into an existing product such as CA Auditor for z/OS) that can scan for these vulnerabilities on a system, plug them and report on the number of times these attempts were blocked. Last but not least, such news about technical exploits helps, but there is a huge cultural and communication barrier for mainframe security professionals in getting the broader organization and the rest of the security community to understand the risk. There is still a culture of denial or, “Wait, my mainframe has never been compromised.” This is why we believe the mainframe reframed discussion is a timely and thoughtful conversation we need to have as a community.


The Internet of Things comes to the NFL

"Every NFL stadium is connected to a command center here in San Jose," Stelfox says. "That command center has to operate as sort of a central command of all the data. When the data is collected in the stadium, it's sent in the stadium to the broadcaster in the stadium — it never leaves the stadium from a broadcaster perspective — but it's also distributed out to the NFL cloud." All that happens in under a couple of seconds. "The command center is our point of clarity," she says. "We can see every tag on every player from San Jose when the game is live. If there's something that goes wrong, we know about it very quickly and we have dual recovery. All of that is controlled from a single point of coverage in San Jose."


10 ways IT can use self service

Like user ID issuance and renewals, data retention is another area where policies are manually executed. Decisions on how long to keep accounting, HR, manufacturing, sales, and other data are made in separate meetings between IT and these areas' managers—and the meetings can be long and tedious. A self-service approach to data retention could eliminate these one-on-one meetings. IT would send out an annual update screen to each area end-user manager that lists the area's data resources and current data retention policies and ask managers to either sign off on existing policy to continue it or to make changes. This self-service update could then be sent to the IT data administrator. The transaction log from data retention reviews could be stored for auditors to review when they check on data governance.
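The review workflow described above is straightforward to sketch: each manager's sign-off or amendment lands in both the policy store and an audit log that data-governance auditors can later inspect. The area names and retention periods below are hypothetical:

```python
# Sketch of the self-service retention review: each area manager
# confirms or amends the existing policy, and every decision is
# logged for auditors. Areas and retention months are invented.
import datetime

policies = {"accounting": 84, "HR": 60, "sales": 36}  # retention in months
audit_log = []

def review(area, manager, new_months=None):
    """Record a manager's sign-off (no change) or policy amendment."""
    old = policies[area]
    if new_months is not None:
        policies[area] = new_months
    audit_log.append({
        "area": area, "manager": manager, "old": old,
        "new": policies[area], "when": datetime.date.today().isoformat(),
    })

review("accounting", "j.smith")              # sign-off, no change
review("sales", "a.jones", new_months=48)    # amended policy

print(policies["sales"])   # 48
print(len(audit_log))      # 2
```

The point of the pattern is that the one-on-one meetings disappear: the annual update screen drives `review`, and the log replaces minutes.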


Enterprise data architecture strategy and the big data lake

The data lake takes a fundamentally different approach to data storage than the conventional data acquisition and ingestion method. The traditional method seeks to make the data conform to a predefined data model to create a uniform data asset that is shared by all data consumers. By normalizing the data into a single defined format, this approach, called schema-on-write, can limit the ways the data can be analyzed downstream. The approach that is typically applied for data stored in a data lake is called schema-on-read, meaning there are no predefined constraints for how the data is stored, but that it is the consumer's responsibility to apply the rules for rendering the accessed data in a way that is suited to each user's needs.
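The distinction can be shown in a few lines: schema-on-write discards whatever falls outside the predefined model at load time, while schema-on-read keeps raw records and lets each consumer apply its own projection. The records and field names here are invented:

```python
# Schema-on-write vs. schema-on-read, sketched with plain dicts.
import json

raw_events = [
    '{"user": "ann", "page": "/home", "ms": 120}',
    '{"user": "bob", "page": "/buy", "ms": 340, "referrer": "ad-7"}',
]

# Schema-on-write: force every record into one fixed shape at load
# time; fields outside the schema are dropped for good.
SCHEMA = ("user", "page")
warehouse = [{k: json.loads(e).get(k) for k in SCHEMA} for e in raw_events]

# Schema-on-read: the lake keeps the raw records; each consumer
# applies its own projection when it reads.
lake = [json.loads(e) for e in raw_events]
marketing_view = [e.get("referrer") for e in lake]  # field the warehouse lost

print(warehouse[1])     # {'user': 'bob', 'page': '/buy'}
print(marketing_view)   # [None, 'ad-7']
```

The trade-off the article describes falls out directly: the warehouse rows are uniform and easy to share, but the `referrer` field is recoverable only from the lake.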


Case study: How Ebury took a cloud-first approach to delivering financial services

“We’re very aggressive in terms of adding value as fast as possible to our customers, and we would experience friction with them if we weren’t able to quickly make the decisions we need to or we would fail fast in terms of trying things out if we were slowed down by having to provision additional servers and on-premise hardware,” he says. It is this kind of attitude to business agility that has shaped the firm’s cloud-first approach to IT, which has markedly accelerated since Young joined the firm a year ago. “When I joined, we had most of our kit running in Rackspace, but there was no cloud approach at all regarding the desktop or other applications that don’t necessarily sit in the datacentre,” he says.


Q&A on the Book Agile Impressions

There are so many ways of people working well together that it's easier to tell when they're not working well together. The most common symptom I see is what you asked about previously: does each party make themselves readily available to work on the other's issues? If not, they're not even working together, so they're clearly not working well together. Do they know each other's names? They don't need to be best buddies, but they must treat each other with respect. When they're meeting together, do most questions get answered? The answers don't have to be what the questioner wanted to hear, but are their questions responded to, not ignored? Those are the signs I see most often that two parties are not working well together.



Quote for the day:

"Leadership cannot just go along to get along. Leadership must meet the moral challenge of the day." -- Jesse Jackson

September 06, 2015

C++ encapsulation for Data-Oriented Design: performance

To enable DOD for a particular class (like the particle we used in the previous entry), i.e., to distribute its different data members in separate memory locations, we change the class source code to turn it into a class template particle<Access>, where Access is a framework-provided entity in charge of granting access to the external data members with a similar syntax as if they were an integral part of the class itself. Now, particle<Access> is no longer a regular class with value semantics, but a mere proxy to the external data without ownership of it. Importantly, it is the members and not the particle objects that are stored: particles are constructed on the fly when needed, using their interface to process the data.
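
The article's code is C++ templates; the same storage idea, members held in external arrays and particle objects constructed on the fly as non-owning proxies, can be sketched in Python (names illustrative, not the author's actual framework):

```python
# Structure-of-arrays storage: the members live in separate sequences,
# not inside particle objects.
xs = [0.0, 1.0, 2.0]   # positions
vs = [0.1, 0.2, 0.3]   # velocities

class ParticleProxy:
    """A proxy with no ownership: it only indexes the external arrays."""
    def __init__(self, i):
        self.i = i

    @property
    def x(self):
        return xs[self.i]

    @x.setter
    def x(self, value):
        xs[self.i] = value

    def advance(self, dt):
        # Same interface a value-semantics particle would expose.
        xs[self.i] += vs[self.i] * dt

# Particles are constructed on the fly only to run the interface:
for i in range(len(xs)):
    ParticleProxy(i).advance(1.0)
```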


Why your big data strategy is a bust

"Hard." "Huge." "Dramatic." These are the words used to describe the kind of change required to truly become data-driven. Most companies simply aren’t up to the task. Have no fear, though: It’s the boss’s fault. If you ask business leaders to name their strategic challenges, “making fact-based business decisions based on data” tops the list (48 percent of respondents to the Forbes survey). But when you ask them how willing they are to trust that data, a less rational picture emerges. A Fortune survey of 720 senior business leaders revealed that 62 percent tend to trust their gut rather than data, and 61 percent indicated real-world insight tops hard analytics when making decisions. In other words, the problem with truly embracing big data starts at the top.


The Hierarchy of IoT “Thing” Needs

Finally, since the “Thing” is a thing, it must meet a specific need or bring a value to be useful. As such, we must also account for its ability to meet functional expectations as a core existence requirement. Once the core physical needs of “Things” are met, and before external connectivity is possible, security is needed. To be quite clear: Security is key for IoT adoption, and thus needs to be addressed for individual “Things” that can be externally accessible. Accessibility does not mean just connectivity. It also applies to things that can be physically “cracked” open, where lack of security could put stored data at risk. ... This truth really has to be faced early on in the creation of each IoT device. Every “Thing” in IoT requires a means to encode, encrypt and authenticate its data.
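
As one hedged illustration of the "encode and authenticate" requirement, using only Python's standard library: a device-held key and an HMAC tag let a receiver verify that a reading is intact and came from the keyed device. A real design would also encrypt, e.g. with AES-GCM; all names here are invented:

```python
import hmac, hashlib, json

DEVICE_KEY = b"per-device-secret"   # provisioned at manufacture (illustrative)

def seal_reading(payload: dict) -> bytes:
    # Encode: a stable byte representation of the reading.
    encoded = json.dumps(payload, sort_keys=True).encode()
    # Authenticate: an HMAC tag proves the bytes came from this device
    # and were not altered in transit.
    tag = hmac.new(DEVICE_KEY, encoded, hashlib.sha256).hexdigest()
    return encoded + b"." + tag.encode()

def verify_reading(message: bytes) -> bool:
    # The tag is hex, so the last "." always separates payload from tag.
    encoded, _, tag = message.rpartition(b".")
    expected = hmac.new(DEVICE_KEY, encoded, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected.encode(), tag)

msg = seal_reading({"sensor": "temp", "value": 21.4})
```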


10 Reasons Why Digital Transformation Initiatives Fail

So the good news is you have taken the plunge, recognised that you and your organisation need to embrace the digital present, hired an appropriate partner to help with your transformation and are ready to get started. The bad news is that your chances of success aren't great, but there are plenty of things to do to help move the process along. Unless you are incredibly naive and believe that your transformation partner is just going to 'do' everything for you--like an agency--then you will appreciate that this is about changing from within with a little help from the outside. As such it's down to you to do everything you can to make it work, so here are some things to watch out for.


Designing Your Network Infrastructure For Disaster Recovery

One piece of feedback we've heard consistently is the need for prescriptive guidance on how to design the network infrastructure for disaster recovery. A sound design helps guarantee the best possible RTO by bringing online the replicated virtual machines located in either the secondary data center or Microsoft Azure. This whitepaper is directed to IT professionals who are responsible for architecting, implementing, and supporting business continuity and disaster recovery (BCDR) infrastructure, and who want to leverage Microsoft Azure Site Recovery (ASR) to support and enhance their BCDR services. This paper discusses practical considerations for System Center Virtual Machine Manager server deployment, the pros and cons of stretched subnets vs. subnet failover, and how to structure disaster recovery to virtual sites in Microsoft Azure.


Tablet shipments will fall by 15 percent this year

Anita Wang, TrendForce's notebook analyst, noted that "Tablets have yet to evolve beyond their main role as entertainment devices". However, Microsoft's Surface range "and Apple's upcoming 12.9-inch iPad change their functions depending on situations. They therefore can assist in the expansion of tablet applications by capturing a share of the business application market." Shipments of Microsoft Surfaces will grow from 1.5 million in the first half of 2015 to 2.6 million in the second half, according to TrendForce. It says: "The success of Surface 3 also proves that 2-in-1 PCs with better specs have the potential to expand into the business application market. Based on TrendForce's analysis, Microsoft's tablet shipments this year will soar 52 percent year on year and hit the four-million-unit mark."


Three Reasons Data Science Is The Job Factory Of Manufacturing

A Forbes review of numbers from Wanted Analytics showed manufacturers listed 15% of Big Data-related job openings at mid-year, compared with "professional, scientific and technical services" (25%), Information Technologies (17%), finance and insurance (9%) and retail trade (8%). Put differently: outside of IT itself, manufacturing is hiring more data gurus than any other single industry. Big Data positions include data analysts and scientists, solution architects, data platform engineers and Linux/Java/Hadoop/SQL engineers. While pure size plays a role – one in six private-sector US employees works in manufacturing – there is no question that factories are punching above their historical weight when it comes to data.


Four ways your business can start using AI for automation

The Merriam-Webster dictionary defines artificial intelligence (AI) as "An area of computer science that deals with giving machines the ability to seem like they have human intelligence," and offers an alternate definition of "the power of a machine to copy intelligent human behavior." The first AI programs were developed in the 1950s and weren't commonly used in business. The big data analytics revolution has finally taken AI out of the halls of academia and research institutions, and has plugged it into commercial applications for business. Commercialized AI is still fundamentally new for many companies, though. The keys to business success with AI are to know how to use it and know what results to expect.


Why Big Data Alone Is An Inadequate Source Of Customer Intelligence

For many businesses, big data emerged in recent years with great expectations—that it could answer all questions about customer desires and behaviors. Today, however, many people believe big data alone can’t deliver what they want: actionable information with which they can make effective decisions that serve their customers and their bottom line. Many companies are trying to figure out what value big data can give them, and how to gather, mine and make sense of it. Unfortunately, big data can present several challenges within a company, which is why most big data projects fail. More importantly, however, smart companies are starting to figure out that big data alone isn’t a sufficient source of customer intelligence.


BGP for Humans: Making Sense of Border Gateway Protocol

You can think of an autonomous system in the computer world as a city with many streets. A network prefix is similar to one street with many houses. An IP address is like an address for a particular house in the real world, while a packet is the equivalent of a car travelling from one house to another using the best possible route. Taking this comparison to its logical conclusion, the BGP routing protocol is analogous to your trusty GPS navigator. Like Google's Waze application, the best route is determined by different factors, such as traffic congestion, roads temporarily closed for maintenance, etc. The path is calculated dynamically depending on the situation of the network nodes, which are like roads and junctions on a GPS map.
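
BGP is a path-vector protocol rather than a shortest-path one, but the GPS analogy, recomputing the best route as conditions change, can be sketched with a simple cost-based search (topology and costs invented):

```python
import heapq

def best_path(links, src, dst):
    """Dijkstra over the current link costs -- a stand-in for how a
    routing protocol re-evaluates paths as conditions change."""
    dist, prev = {src: 0}, {}
    queue = [(0, src)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue   # stale queue entry
        for nbr, cost in links.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(queue, (nd, nbr))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

links = {"A": {"B": 1, "C": 5}, "B": {"D": 1}, "C": {"D": 1}}
assert best_path(links, "A", "D") == ["A", "B", "D"]
# "Road closed for maintenance": the B-D link cost spikes, route changes.
links["B"]["D"] = 10
assert best_path(links, "A", "D") == ["A", "C", "D"]
```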



Quote for the day:

"The best minute you spend is the one you invest in people." -- Ken Blanchard

September 05, 2015

Finding a Single Version of Truth Within Big Data

“The ‘best’ data depends on its source and purpose,” Jonas writes. “While a company may have employee data in different systems, like IT, HR, Finance etc., the employee name and address maintained by the payroll system is probably the best one to use for tax filing.” That doesn’t mean Jonas thinks organizations should not try to reconcile data plurality. But instead of the traditional “merge-purge” technique that involves massive batch jobs that compare new data against the old data, Jonas thinks we are better off using an “entity resolution system.” “Entity resolution systems generally retain every record and attribute, each with its associated attribution,” he writes. “Because entity resolution systems have no data survivorship processing, there is no chance future relevant data will be prematurely discarded.”
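
A toy sketch of the contrast (names and sources invented): instead of surviving one merged record, every record is kept with its source attribution, linked to one entity, and "best" is chosen per purpose at read time:

```python
# Merge-purge picks one "surviving" record; entity resolution keeps them
# all, each with its attribution (source system), linked to one entity.
entities = {}   # entity_id -> list of (source, record)

def resolve(entity_id, source, record):
    # No survivorship processing: every record and attribute is retained.
    entities.setdefault(entity_id, []).append((source, record))

resolve("emp-42", "HR",      {"name": "J. Doe",   "address": "12 Oak St"})
resolve("emp-42", "Payroll", {"name": "Jane Doe", "address": "12 Oak Street"})

def best_for(entity_id, purpose_source):
    # "Best" depends on purpose: e.g. prefer Payroll's address for tax filing.
    for source, record in entities[entity_id]:
        if source == purpose_source:
            return record
    return None

tax_record = best_for("emp-42", "Payroll")
```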


Agile Introduction: Are You a Laggard?

While much has been written about the strengths and weaknesses of the technology, little data has been published to show how widely agile methods are used. This paper corrects that by providing data from our databases for public consumption. ... Some of these organizations are offshoots of the 120 firms and government organizations from which we have received data. Figure 2 summarizes which agile methodologies are in use by these organizations. As many said that they were using a hybrid approach, i.e., one that combined agile with traditional concepts, we have included their responses and categorized them as either hybrid or hybrid/lean.


Data Visualizations: 11 Ways To Bring Analytics To Life

"The ability to slice and dice has been around for a while. It's more exploratory now. You have a lot of data sources, so finding a needle in a haystack boils down to being interactive," said George Ramonov, founder and CTO at meeting planner provider Qurious.io, in an interview. "Now that we have cross-functional teams, it's important to be able to share visualizations embedded in a website or app to allow sharing without all the extra time of putting together an email." Regardless of how large or small an audience is, good data visualizations speed understanding. Bad visualizations cloud the issues. Here are six ways to best leverage the graphical presentation of data.


Building a Cyber-Resilient Business

Unfortunately, blocking four out of five attacks still leaves open the possibility that a substantial number of attacks might succeed. And today, it’s more a matter of when rather than if you will, eventually, be successfully attacked. What happens then? Even well-prepared companies may not know immediately that they have been breached. But those that have prepared for such an event will be much better off than those that have not. Just as conducting fire drills can save lives in the event of a real fire, preparing for the aftermath of a cyber attack can make an enormous difference in how quickly your company gets back on its feet and how well officers and board members do in the limelight after a major breach becomes public.


How DevOps fits into the modern network

If Company A builds protocol Cat, and Company B builds protocol Dog, how do they get those protocols to talk? They can't! It's just like someone who speaks Japanese and someone else who speaks English would need a translator to communicate effectively. By embracing open standards, we can make pieces of network equipment talk to each other, servers, laptops, phones, etc. If we didn't promote open standards, we would be locked into solutions where everything is controlled by a single vendor from A to B. We have many customers at Cumulus Networks that run multi-vendor environments, and open standards are not just encouraged, they're crucial.


A Guide to Lean Healthcare Workflows

It describes each step in-depth and includes techniques, example worksheets, and materials that can be used during the overall analysis and implementation process. And it provides insights that are derived from the real-world experience of the authors. This paper is intended to serve as a guide for readers during a process-improvement project and is not necessarily intended to be read end-to-end in one sitting. It is written primarily for clinical practitioners to use as a step-by-step guide to lean out clinical workflows without having to rely on complex statistical hypothesis-testing tools. This guide can also be used by clinical or nonclinical practitioners in non-patient-centered workflows. The steps are based on a universal Lean language that uses industry-standard terms and techniques and, therefore, can be applied to almost any process.


How Can Healthcare Big Data Analytics Bust Data Silos?

Machine data, meanwhile, is a record of actions that have already occurred, such as call logs or EHR access time stamps. This data is more or less static, and while it may recount the activities of users, it is automatically created by IT systems without much human intervention. When subjected to more sophisticated analysis, the millions of data points in machine data can help a healthcare organization identify a possible breach or chart how long a clinician takes to see her patients, and even aid understanding of how patients flow into the emergency room or how often a nurse updates vital signs. Correctly and efficiently analyzing these types of big data is key for clinical and business intelligence activities, and can help healthcare organizations understand how their IT infrastructure can enable workflow improvements.
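
For instance, charting how long a clinician spends per chart from EHR access time stamps reduces to differencing consecutive log entries; a minimal sketch with invented data:

```python
from datetime import datetime

# EHR access log (machine data): clinician, chart-open time stamp.
log = [
    ("dr_lee", "2015-09-05T09:00:00"),
    ("dr_lee", "2015-09-05T09:20:00"),
    ("dr_lee", "2015-09-05T09:55:00"),
]

def visit_lengths_minutes(entries):
    # Minutes between consecutive chart opens for one clinician.
    times = [datetime.fromisoformat(t) for _, t in entries]
    return [(b - a).total_seconds() / 60 for a, b in zip(times, times[1:])]

gaps = visit_lengths_minutes(log)   # [20.0, 35.0]
```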


How to Manage Cloud Resources Wisely

The cloud isn’t perfect. There are still outages, challenges around replicating pieces of an environment, and even confusion around all the different kinds of services that cloud can provide today. Fortunately, the entire cloud model is becoming a bit easier to understand and deploy. Why? There are simply more use cases for such a powerful architecture. Businesses of all sizes are quickly realizing that their direct competitive advantage may very well revolve around the capabilities of the cloud. However, with that in mind, what should organizations of various sizes do about physical resource requirements? What about infrastructure expansion? Most of all, what are the limitations of your cloud?


How COSO Destroyed Risk Management

COSO’s failure is due primarily to its narrow focus on internal controls as a risk management tool. Internal controls should have been considered one leg of a four-pronged approach to a comprehensive risk management framework. Fundamentally, internal controls should be considered one of the foundational components of enterprise risk management. What is missing in COSO and broadly across risk management are the other tools needed to execute ERM. Risk management must include mechanisms to measure and quantify real risks. The rise of quantitative analysts is the recognition that risk management is measurable and not simply assessed through the qualitative assessments advocated in COSO.


How Different Team Topologies Influence DevOps Culture

It has become increasingly clear to me over the past few years working with many different organisations that the idea of a single, identifiable 'DevOps culture' is misplaced. What we've seen in organisations with effective or emerging DevOps practices is a variety of cultures to support DevOps. Of course, some aspects of these different cultures are essential: blameless incident post-mortems; team autonomy; respect for other people; and the desire and opportunity to improve continuously are all key components of a healthy DevOps culture. However, in some organisations certain teams collaborate much more than other teams, and the type and purpose of communication can be different to that in other organisations.



Quote for the day:

"Keep your fears to yourself, but share your courage with others." -- Robert Louis Stevenson

September 04, 2015

A degree in data science is in demand

The work of a data scientist is really two-fold. First, the data scientist must pull together all this data, which is often just a collection of garbled text or numbers, and clean it up to the point where it can be analyzed. Then, the data scientist has to know how to extract meaningful information from the cleaned-up data. “Big data represents one of the fastest growing areas of business, estimated to become a 17-trillion-dollar industry by 2020," wrote Becker College when it introduced its new data science program earlier this year. Locally, Worcester Polytechnic Institute and Becker offer data science programs; both are convinced that data science is already a desired career path for their students.
WPI's data science program is entering its second year; it currently offers a two-year, graduate-level degree in data science, and this fall, is adding a doctorate-level degree.


US Army’s Cyber War Strategy is Not Just for Military Use

Taking threat sensor data, removing noise and analyzing the data will provide decision makers with the ability to forecast, gain up-to-date battle damage assessments (BDA) and supply geolocation information on both the enemy and the electronic signatures our own forces generate. Convergence is going to be achieved by consolidating the Army's cyber forces, which operate across multiple departments, into single cross-operational units, removing impediments to information sharing. By fiscal year 2017, the U.S. Army Cyber Command (ARCYBER) will have 41 operationally capable Cyber Mission Forces. They will combine cybersecurity, electronic warfare and signal doctrine into single units. The units will use past lessons learned to develop new doctrines in cyber security.


How Edge Data Center Providers are Changing the Internet’s Geography

Ultimately, location is the main way for companies like EdgeConneX to differentiate from the big colo players like Equinix or Interxion. Edge data center providers are essentially building in tier-two markets what Equinix and its rivals have built in the big core markets: hubs where all the players in the long chain of delivering content or services to customers interconnect and exchange traffic. These hubs are where most of the internet has lived and grown for the bulk of its existence, and edge data center companies are building smaller hubs in places that don’t already have them but are becoming increasingly bandwidth-hungry.


What Do Marketers Really Want in Data and Technology?

You may have heard of Data-as-a-Service (DaaS). Companies are touting DaaS as the next big thing and as a solution that gives marketers an “unfair competitive advantage.” By linking data with technology, DaaS is completely changing the game through a new model of fast-moving and real-time data acquisition. As the name implies, Data-as-a-Service begins with the data. Specifically, a company’s internal data, third-party data, real-time fast data, and unique and hard-to-find data (HTFD) sourced from the Big Data ecosystem. With technology, this data is structured to create insight into a company’s best customers and ideal prospects. Real-time knowledge is also used to learn who is actively in the market for products and services, who is searching for competitors, and who is posting to social media for product recommendations.


Leveraging COBIT to Implement Information Security (Part 3)

In the context discussed here, it is envisaged that controls within the system are selected by management on a risk-assessed basis to address the perceived threats to the security of the organisation’s core business processes. Once selected, the ISMS is the basis for collecting evidence for operation and reviewing the efficacy of the implementation on an ongoing basis as part of the security forum. The forum is created by senior management, typically the chief executive officer (CEO), as a collaborative round table where managers from IT security, IT, human resources (HR) and major business functions can come together to make decisions on the basis of regular reporting from the system.


Disruptive tech and its impact on wireless protocols and networks

Internet of Things is not a new concept. It's been around for a long time. We used to call it telemetry or sensor-based computing. But the idea that we can do it today at a very low cost and that we can automate so many applications -- medical applications, security, energy management, all kinds of things like that -- means that there's going to be more and more happening on the network over time. And many of those applications will be mobile. (Not everything in IoT is mobile, but a lot of it will be.) So planning for that in terms of capacity, [security and cost is] made more complex. So, even though mobility opens up a lot of opportunities, it does come with a set of costs that we didn't have before.


Indoor positioning – Are we nearly there yet?

If the object you are locating and tracking happens to have a device with some unique identifier attached to it, like a tag or smart phone, things become significantly easier. Now you can have many fixed transmitters sending out pulses that are received by the device, which can then send out a “reply,” rather than the reflected pulse, that also contains its unique identifier. The transmitters can be simple and omnidirectional, but then you need a few of them (remember each one defines a circle; in the plane, i.e., in 2D, at least 3 transmitters are needed to determine a unique position) – the determination of a location from measuring distances to a few fixed points is known as trilateration (check out multilateration while you’re at it).
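
For the 2D case the paragraph describes, three anchor-to-device distances pin down the position; subtracting the circle equations pairwise cancels the quadratic terms and leaves a 2x2 linear system (the anchors must not be collinear). A sketch with invented anchor positions:

```python
# 2D trilateration: solve for (x, y) from three anchors and three ranges.
def trilaterate(p1, r1, p2, r2, p3, r3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Circle 1 minus circle 2, and circle 2 minus circle 3:
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    # Solve the 2x2 linear system by Cramer's rule.
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# A device exactly 5 units from anchors at (0,0), (6,0) and (0,8):
pos = trilaterate((0, 0), 5, (6, 0), 5, (0, 8), 5)   # (3.0, 4.0)
```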


Don’t Let Cyberattacks Take A ‘Byte’ Out Of Your Bottom Line

Should a data breach occur, having an incident response plan in place can help ease the pressure in the heat of the moment. Affected systems should immediately be closed off from the remainder of the company’s infrastructure in order to pinpoint the root cause. When a data breach does occur, use it as a learning experience, extracting as much information as possible about how and why the incident occurred. That information can then be used to strengthen IT infrastructure by plugging holes and establishing improved monitoring programs to detect threats. Reaction plans should be tested and updated regularly to ensure any future threat responses are as effective and efficient as possible.



Why Optimization and WANOP for Your Cloud Is Now Easier than Ever

We’re now pushing down rich content, a variety of applications, and a lot of new use cases. The reality here is that cloud will continue to grow as more users and verticals adopt this very versatile platform. In fact, global spending on IaaS is expected to reach almost $16.5 billion in 2015, an increase of 32.8 percent from 2014, with a compound annual growth rate (CAGR) from 2014 to 2019 forecast at 29.1 percent, according to Gartner. The report goes on to state that over time, as a business becomes more comfortable with the use of IaaS, organizations, especially in the midmarket, will eventually migrate away from running their own data centers in favor of relying primarily on infrastructure in the cloud.


Resiliency Testing Best Practices - Report

Every organization must put a plan in place for recoverability after an outage, but testing your enterprise resilience without full business and IT validation is ineffective. Read the white paper to learn how to put a plan in place for full functional validation, and get details on the importance of validating resiliency in a live environment; learn why small-scale recovery “simulations” are inadequate and misleading; understand why validating resilience demands involvement from IT and the business; and get details on the checks and balances you need to maintain and validate business resilience.



Quote for the day:

"Let a man lose everything else in the world but his enthusiasm and he will come through again to success." -- H. W. Arnold

September 03, 2015

MySecureShell Documentation

MySecureShell is a solution made to bring more features to the sftp/scp protocol provided by OpenSSH. By default, OpenSSH grants connected users a great deal of freedom, which implies trusting those users. The goal of MySecureShell is to offer the power and security of OpenSSH with enhanced features (like ACLs) to restrict connected users. MySecureShell was created because OpenSSH, which was never designed as a file transfer solution, lacks file transfer features. MySecureShell is not a patch for OpenSSH; it is a shell for users.


How big data is unfair

An immediate observation is that a learning algorithm is designed to pick up statistical patterns in training data. If the training data reflect existing social biases against a minority, the algorithm is likely to incorporate these biases. This can lead to less advantageous decisions for members of these minority groups. Some might object that the classifier couldn’t possibly be biased if nothing in the feature space speaks of the protected attribute, e.g., race. This argument is invalid. After all, the whole appeal of machine learning is that we can infer absent attributes from those that are present. Race and gender, for example, are typically redundantly encoded in any sufficiently rich feature space, whether they are explicitly present or not.
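
A toy illustration of redundant encoding (all numbers invented): even with the protected attribute removed from the features, a correlated proxy such as neighborhood lets a learner reconstruct it most of the time:

```python
from collections import Counter

# Toy training data: no explicit "group" feature is given to the model,
# but neighborhood is a near-proxy for it.
rows = [
    ("north", "A"), ("north", "A"), ("north", "A"), ("north", "B"),
    ("south", "B"), ("south", "B"), ("south", "B"), ("south", "A"),
]

# A learner only needs the co-occurrence counts to "infer" the absent
# attribute from the present one:
def most_likely_group(neighborhood):
    counts = Counter(g for n, g in rows if n == neighborhood)
    return counts.most_common(1)[0][0]

guess = most_likely_group("north")   # "A", right 3 times out of 4
```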


Your Smartphone Can Tell If You’re Bored

While using machine learning to infer your state of mind is tricky, doing so reliably via your smartphone could be powerful. For instance, if an app were able to predict that you’re bored, and also knew where you were, it could try to feed you content it thinks you’d like in that particular context. Already at least one startup is trying to do something similar to this: Triggerhood, which built software that lets apps collect data about how the phone is being used, determines when is the best time to send you a notification (see “Smarter Smartphone Alerts Come in When You Want Them”).


Learning to Trust in the Cloud

After the prominent security breaches in retail and the public sector over the last year, it’s clear that a strong security posture is a requirement, not an option, as no one wants to be the next headline. Reviews of these breaches show that they were the result of internal policy or system failures, not the result of any weakness of a cloud service.  ... Security is a shared responsibility with your cloud provider, and companies should consider implementing tools such as next-generation and application firewalls, intrusion detection and prevention, anti-virus software, encryption, identity and access management, visibility, log and big data analytics. This can help ensure internal security standards are as high as those set by cloud providers.


TGIF(P) – Thank god it’s fried phish

Spoiler: It commonly means “Thank god it’s Friday” and probably many working people will be able to appreciate such a feeling. On the other hand, while many offices may close down for the weekend, it’s the time for bad guys to boost their activity because they count on the fact that they may go unnoticed for some time, at least until the upcoming Monday morning. The IT community is working hard to find and take down malicious sites as soon as possible, but then … the weekend is the weekend for many. What happened just last Friday may be a good example of such malicious weekend activity. We received the following email to one of our inboxes:


The Problem with Corporate Innovation

Corporate innovation faces challenges that entrepreneurs can’t fathom. Entrepreneurs often wish they had the people and resources that larger organizations do, without realizing that all those people and resources are already spoken for. Larger organizations lack the freedom and agility that smaller organizations have. Larger organizations are very slow to recognize and respond to major seismic shifts, so comfortable in their day to day operating models. Industry conventions become first defensive barriers and then comfortable blankets, reassuring large organizations that they understand what the customer needs and what the industry will do. Corporate executives face a really difficult challenge: on one hand they must meet the quarterly numbers, or heads will roll.


Data Center Consolidation: a Manager’s Checklist

The reality in today’s very competitive data center and cloud market is that the provider who can run most optimally and cost-effectively while still delivering prime services is a leader in the market. To accomplish this goal, there are a few things to consider. First of all, getting ahead doesn’t always mean adding more gear. Smart data center and cloud providers learn to use what they have and make the absolute most out of every resource. There are new kinds of questions being asked when it comes to new data center efficiency concepts. Is there a new technology coming out that improves density? Does the ROI help improve long-term management costs? Does a new kind of platform allow me to achieve more while requiring less?


Beth Israel Launches Big Data Effort To Improve ICU Care

The way clinical care is documented can vary greatly; for example, hypertension, high blood pressure and elevated blood pressure are three different terms that describe the same condition. “There has been a lot of data cleanup that needed to be done, and in the process, we’ve learned a lot about structured data, and quality of data,” says Folcarelli. She says it took at least a year to normalize the data and determine the data points that would work well in the model. Statisticians and analysts worked with clinicians and nurses during this process. The hospital’s IT team uses scripts that extract data from the transactional systems—the HIS, the clinical ICU systems, the HR systems—on a regular basis. The extracts are sent to the hospital’s clinical data warehouse, which is built with Microsoft SQL Server technologies.
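
The normalization step described above can be as simple as a synonym table mapping variant terms to one canonical concept; a minimal sketch (terms illustrative; the hospital's actual pipeline is not described in this detail, and real projects typically use vocabularies such as SNOMED CT):

```python
# Minimal normalization table: variant clinical terms -> canonical concept.
CANONICAL = {
    "hypertension": "hypertension",
    "high blood pressure": "hypertension",
    "elevated blood pressure": "hypertension",
}

def normalize(term):
    # Unknown terms pass through unchanged for later review.
    return CANONICAL.get(term.strip().lower(), term)

records = ["Hypertension", "high blood pressure", "asthma"]
normalized = [normalize(t) for t in records]
```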


Your Next Car Could Reveal More About You Than Your Facebook Profile

About 90 percent of new vehicles in western Europe will be able to send and receive data by 2020, compared with roughly one-third next year, Hitachi Ltd. estimates. Once hooked into the web, the car’s driving data could be coupled with information as detailed as a driver’s contact list, favorite routes to work and even financial information from mobile-payment systems. As cars get closer to driving themselves, their cameras and sensors will collect data about what happens in and around the vehicle and what passengers are doing. That prospect has created disputes about what data can be collected and who needs to agree to it. Rules in this area could hamper automakers from fully tapping their newfound gold mine.


Blythe Masters Tells Banks the Blockchain Changes Everything

In a matter of months, this word, blockchain, has gone viral on trading floors and in the executive suites of banks and brokerages on both sides of the Atlantic. You can’t attend a finance conference these days without hearing it mentioned on a panel or at a reception or even in the loo. ... Now, everyone’s trying to figure out whether the blockchain is just so much hype or if Masters’s firm and other startups are really going to change the systems that process trillions of dollars in securities trades. When investors buy and sell syndicated loans or derivatives or move money around the world, they must cope with opaque and clunky back-office processes that rely on negotiated contracts between buyers and sellers, lots of phone calls, lots of lawyers, and even the occasional fax. It still takes almost 20 days, on average, to settle syndicated loan trades.



Quote for the day:

"Everybody wants to do something to help, but nobody wants to be the first." -- Pearl Bailey

September 02, 2015

5 IT experts reveal their Windows 10 upgrade strategies

There are support costs, management issues, security problems and a host of other deployment snafus that can crop up.  Yet, the new OS is a major step forward. Microsoft resolved many of the troubling usability issues that plagued Windows 8, such as a confusing “tile” interface and hard-to-find settings. Many features – including a more streamlined update process that won’t interfere as much with daily work – are designed for the enterprise. It’s even easier to do “in place” upgrades.  To help put the finishing touches on your upgrade strategy, CIO.com talked to several experts (including those at Microsoft) about how to make a deployment as smooth as possible. We asked about general guidelines, security issues, usability, training and other considerations for enterprise users. Here’s what we found out.


Data virtualization tools move into strategic IT realm

There [are some] use cases for data virtualization [instead of traditional data integration]. One is [if] it's a new source of data. You may need at some time later on to integrate the data but you want to get to the data now to analyze and look at it, see how useful it is, and you haven't gotten to the point where you can invest in getting it integrated. That's one use case scenario: the precursor of integrating it. There are plenty of other use cases where you never integrate the data with your source of data; you may not own the data. There's social media data, there's Web data, there's data that you might be exchanging between prospects, suppliers, partners and so on, that you may never own or have the ability or desire to integrate with your data.


Of Black Hat and security awareness

Black Hat is a combination of in-depth, mostly hands-on training and briefings that tend to be presentations on various security topics, typically with a focus on security weaknesses. I am interested in briefings in which the presenters demonstrate a successful hack or compromise of something very interesting or familiar. This year’s quintessential Black Hat presentation demonstrated the ability to remotely control connected-car functions. It’s the sort of thing that really sets Black Hat apart. Of course, Black Hat also has the obligatory expo floor, and I enjoyed the opportunity to obtain demos from technology vendors that I currently use or am considering. It’s much easier to ask pressing questions in a venue like this than to schedule individual meetings and then sit through a bunch of marketing slides before getting to the real substance.


Why Startups Should Leverage Compliance

Though this particular measure focuses on payments, the same dynamic can be seen at play in other innovative sectors. During the last several months, Uber has been battling regulators both here in the United States and in many countries abroad, often because of agitation by taxi unions. What these incidents highlight is the unsurprising fact that if you want to eat the established players’ lunches, you probably have to take their pills too. Despite the many upstarts who decry the stifling effects of regulation, governments have signaled repeatedly that they have no intention of backing down. The proper response from the technology industry is not to bemoan the state of affairs, but to recognize the opportunity to leverage compliance against their competitors.


Lone Rangers of the Underground

The underground market for malware tools, vulnerabilities, exploit kits and every other criminal niche is fully mature. The barriers to entry into the market have fallen away over the years: established criminal toolkits are available at low to no cost; former high-value malware such as ZeuS has become almost an open source project, spawning a variety of improvements and imitators; and basic tools such as keyloggers or system lockers are being combined to devastating effect. Take for example the Hawkeye attacks that affected small businesses on a global scale, from China through India and Europe all the way across to the United States. A simple $35 keylogger, Hawkeye, was used in sophisticated “change of supplier” fraud by two lone Nigerian criminals.


Bank-in-a-box: An innovative, easily deployable solution

The bank-in-a-box is an integrated solution set that supports the transformation of core banking operations using a service provider or third-party developed interface. It is scalable and cost-effective, and includes internet and mobile banking, deposit and loan products, payment solutions, ATM and POS switching, regulatory and MIS reporting. The software can easily be used by non-IT specialists to develop new products. The suite acts as a complete technology solution spanning multiple delivery channels, between front- and back-office, including reconciliation and settlement. Typically, the hosted core banking platform (based on the SaaS model) is provided by the application service provider. This could take the form of cloud-based hosting or on-premise hosting services.


Why you need to convert IT consumers into investment partners

Several years ago, Joe Spagnoletti, CIO of Campbell Soup Company, brought an investment management approach to IT spend. Today, he and his business partners look at four characteristics when making IT investment decisions: business outcome, operating performance, cost to serve, and risk. "We’ve educated our business leaders about how to think of an IT investment more broadly," he says. "We show them how their current portfolio is performing so they think, 'In a silo, this one investment looks good, but how does it look as a part of a collection?'" Stephen Gold, EVP of business and technology operations, and CIO, CVS Health, employs the "CIO theory of reciprocity." "Let's say the head of sales of a given company suggests, 'If I had a real-time inventory management system, I could increase revenue by $500M,'" says Gold.


Metadata-Driven Design: Building Web APIs for Dynamic Mobile Apps

For the sake of brevity, it can be summarized as an approach to software design and implementation where metadata can constitute and integrate both phases of development. ... While building these apps on iOS and Android, I took note of the additional time that was inherent to their development on a native level, especially when compared to normal desktop applications. Besides the unquantifiable test of an app’s user interactivity, a significant amount of time was required to organize the application’s flow of navigation when using a more complex framework (like Cocoa). Of course, there was also the time needed to submit the app for approval and then the subsequent effort to modify and/or tailor any aspects considered undesirable by the app store’s vendor and/or the app’s users.
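
The core idea above — letting metadata, rather than hard-coded logic, define an app's flow of navigation — can be sketched in a few lines. Everything here (the screen names, the metadata shape, the function) is a hypothetical illustration, not the API described in the article.

```python
# Hypothetical metadata describing an app's screens and navigation.
# The flow lives in data served by a web API, not in native code, so it
# can change without resubmitting the app for store approval.
APP_METADATA = {
    "screens": {
        "login":  {"title": "Sign In",     "next": "home"},
        "home":   {"title": "Dashboard",   "next": "detail"},
        "detail": {"title": "Item Detail", "next": None},
    },
    "start": "login",
}

def navigation_path(metadata: dict) -> list:
    """Walk the declared flow from the start screen to the end."""
    path, screen = [], metadata["start"]
    while screen is not None:
        path.append(screen)
        screen = metadata["screens"][screen]["next"]
    return path

print(navigation_path(APP_METADATA))  # ['login', 'home', 'detail']
```

A native client would fetch this metadata at startup and render whatever screens and transitions it declares, which is what makes the resulting mobile app "dynamic."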


Why Israel dominates in cyber security

“Connecting the talent pool coming out of defense organizations with the strong entrepreneurial spirit that exists here, and you get the perfect ingredient for a powerhouse, in terms of cyber security startups and technology companies,” says Mimran. And that connection has been making strides in digital security for decades. For instance, in 1993, Tel Aviv-based Check Point developed FireWall-1, one of the very first protection solutions for Internet-connected computers. The defensive software was developed by Israeli entrepreneur Gil Shwed, who served in the IDF’s Unit 8200—which is responsible for collecting signal intelligence—and grew the company into one of the country’s biggest tech giants. Check Point foresaw a need for protecting computer networks, and more importantly, filled that need before most people were even online.


Barclays Hacks Its Own Systems to Find Holes Before Criminals Do

Staying ahead of the bad guys requires resources, expertise and vigilance, and even that isn’t always enough. “They improve the ways to get in all the time,” said Oerting, 58. “The reality is that there are actually more cases than you read in the press.” Barclays is boosting spending by about 20 percent as part of its new cyber-defense strategy, Oerting said, declining to elaborate. Cyber risk is viewed as a key concern by almost a third of banks in the U.K., a survey by the Bank of England found in July. Two years ago, only 1 percent of those surveyed considered cyber attack a major risk. HSBC Holdings Plc, Lloyds Banking Group Plc and Royal Bank of Scotland Group Plc declined to discuss their efforts to fight computer crime.



Quote for the day:

"In order to succeed, your desire for success should be greater than your fear of failure." -- Bill Cosby

September 01, 2015

How Semantic Graph Techniques Ease Data Integration

Semantic Graph Databases are most valuable for complex metadata applications where the number of classes (i.e. types of objects) changes daily, properties within classes change on-the-fly, and it is critical to have self-descriptions of data. Grounded in formal logic, semantic analytics can easily encompass associative and contextual concepts for richer data analysis, which provides a more expansive, exploratory querying experience. As noted in David S. Frankel’s article, “How Semantics Can Take Graph Databases to New Levels,” querying a database using formal semantics provides the ability to “infer logical consequences from a set of asserted facts or axioms … Reasoners grounded in formal semantics can be potent tools when managing large graph databases.”
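
The "infer logical consequences from asserted facts" idea can be made concrete with a toy example. Real semantic graph stores use RDF/OWL reasoners; the sketch below only applies one rule — transitivity of a subClassOf-style relation — over hypothetical triples, to show what inference adds beyond the stored facts.

```python
# Two asserted facts; the reasoner derives a third that was never stored.
facts = {
    ("SystolicPressure", "subClassOf", "BloodPressure"),
    ("BloodPressure", "subClassOf", "VitalSign"),
}

def infer(triples):
    """Apply the transitivity rule repeatedly until no new facts appear."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for a, _, b in list(inferred):
            for c, _, d in list(inferred):
                if b == c and (a, "subClassOf", d) not in inferred:
                    inferred.add((a, "subClassOf", d))
                    changed = True
    return inferred

closure = infer(facts)
print(("SystolicPressure", "subClassOf", "VitalSign") in closure)  # True
```

A query for "all VitalSign subclasses" against the closure finds SystolicPressure even though no one asserted that link — which is the richer, exploratory querying the article describes.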


Intel says GPU malware is no reason to panic, yet

While it's true that there is a shortage of tools to analyze code running inside GPUs from a malware forensics perspective, endpoint security products don't need such capabilities because they can detect the other indicators left by such attacks on the system. On one hand, moving malicious code inside the GPU and removing it from the host system makes it harder for security products to detect attacks. But on the other, the detection surface is not completely eliminated and there are trace elements of malicious activity that can be identified, the researchers said. Some of the defenses built by Microsoft against kernel-level rootkits, such as Patch Guard, driver signing enforcement, Early Launch Anti-Malware (ELAM) and Secure Boot, can also help prevent the installation of GPU threats.


Breaking the SQL Barrier: Google BigQuery User-Defined Functions

BigQuery UDFs are similar to map functions in MapReduce. They take one row of input and produce zero or more rows of output, potentially with a different schema. ... BigQuery UDFs are functions with two formal parameters. The first parameter is a variable to which each input row will be bound. The second parameter is an “emitter” function. Each time the emitter is invoked with a JavaScript object, that object will be returned as a row to the query. ... JavaScript UDFs are executed on instances of Google V8 running on Google servers. Your code runs close to your data in order to minimize added latency. You don’t have to worry about provisioning hardware or managing pipelines to deal with data import / export.


Are you a data hoarder? Hadoop offers little choice

There's a bit of absurdity here. If you throw it away, you can't get it back; if you keep it, you can eventually organize and purge what you don't need. Those who store data now while getting their governance in place are not automatically "data hoarders." This is a false dilemma. The idea that you need to come up with a perfect plan before keeping any data or bringing in any new sources is a little like saying we need perfect social justice for everyone before we can address police killings of African-Americans. Instead, get started now. Stop throwing out the baby with the bathwater and begin finding your use cases. Meanwhile, make data the point rather than a side effect of your processes and govern it accordingly. These aren't "steps," but initiatives you need to undertake, usually in parallel.


New Smartphone Attempts to Finally Solve the Storage Problem

The startup is trying to take better advantage of the increasing ubiquity of wireless networks that most of us are already using. Apps, photos, videos, and music can pile up and take up available space on your phone, and Nextbit thinks the solution is to use the Internet to unobtrusively back up and remotely store some of that stuff. By default, the phone does this when it’s plugged in and connected to Wi-Fi, though users can change this. Robin is slated to be generally available online in January or February and will include 32 gigabytes of storage on the phone and another 68 online. It will cost $399, and Nextbit has already raised $18 million in venture funding from Accel Partners for the phone’s development. In an effort to publicize its brand with consumers and drum up early sales, ...


Revamping Master Data Management with Graphs

One of the more interesting aspects of utilizing graph databases with MDM is the role that Natural Language Processing (NLP) can play in the query process. Interestingly enough, the visual querying framework that semantic graphs facilitate was described by Aasman as “even simpler than natural language”, especially because the former method does not involve code. Still, there are ways in which NLP can assist with the querying process for MDM systems augmented by graph databases. The most salient of these are when NLP is involved with definitions and descriptions of terms that are referred to with multiple spellings, nicknames, and perhaps even slang. A cogent example is a use case in which Franz partnered with Montefiore Medical Center to create a healthcare platform with instantaneous querying capabilities across vastly heterogeneous sources.
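
Resolving multiple spellings and nicknames to one master record is the classic MDM matching problem. A minimal sketch using only the standard library's `difflib` is below; the alias table, names, and `resolve` function are hypothetical, and production MDM matching is far more sophisticated.

```python
import difflib

# Hypothetical master data: canonical entity names plus known aliases.
MASTER_NAMES = ["Montefiore Medical Center", "Beth Israel Deaconess"]
ALIASES = {"montefiore": "Montefiore Medical Center"}

def resolve(name):
    """Map a variant spelling or nickname to its canonical master record."""
    key = name.strip().lower()
    if key in ALIASES:            # exact nickname hit
        return ALIASES[key]
    # Fall back to fuzzy matching for misspellings.
    match = difflib.get_close_matches(name, MASTER_NAMES, n=1, cutoff=0.8)
    return match[0] if match else None

print(resolve("Montefiore Medicl Center"))  # Montefiore Medical Center
```

Layering a graph on top of such matching lets every resolved variant become an edge pointing at the same master node, which is what keeps downstream queries consistent.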


Six simple cybersecurity rules for all ages

Nowadays parents are getting more and more concerned about what you do on the Internet. They know that there are lots of creepy weirdos and malicious viruses on the Internet; they fear for your naivety, innocence and the potential of severe cyberbullying. Of course, sometimes they go overboard, but you still need to deal with it. Do you have a smothering mother or father who wants to know what’s going on in your life both online and off? Sorry, but it’s just the way things are. If you want more freedom, behave as any adult would: show your parents that you can make deliberate decisions. You’ll benefit from it as well. Keeping your gaming and social accounts secured is a tangible bonus, isn’t it? As we’ve already written, cybercriminals would readily take over your Facebook page, infect your smartphone with a virus, or steal your gaming account.


New DOD cyber security regulation: is the cure worse than the disease?

In summary, this “interim rule” imposes on DOD contractors and subcontractors a contractual duty to provide “adequate security” from “unauthorized access and disclosure” for a broad array of unclassified information, including controlled technical information, export controlled information, critical information, and other information requiring protection by law, regulation or policy (protections for classified information continue to be provided for under the National Industrial Security Operating Manual (NISPOM)). The interim rule also requires DOD contractors and subcontractors to report directly to the appropriate DOD office a “cyber incident” or “malicious software.”


Latency, Bandwidth, Disaster Recovery: Selecting the Right Data Center

In selecting the right type of data center colocation, administrators must thoroughly plan out their deployment and strategies. This means involving more than just facilities teams in the planning stages. The process to select a good data center has to involve not only the physical elements of the facility but the workload to be delivered as well. ... With the increase of traffic moving through the internet, there is a greater demand for more bandwidth and less latency. As discussed earlier, it’s important to have your data reside closer to your users as well as the applications or workloads being accessed. Where data demands may not have fluctuated much in the past, they are far more dynamic today.


How PMOs can balance time, cost and quality

Triple constraint – the balancing act that occurs between cost, quality and time – is a term often heard in the world of project management, but what does that mean when it comes to the success or failure of a project to meet organizational objectives? Project managers are tasked with ensuring that they successfully manage the scope of a project to keep it within the cost, quality and time parameters determined by organizations at the onset. So how do project managers balance these three factors? This can be an onerous task, considering there are various internal or external factors that can rapidly change, causing any one or more of the three constraints to shift in an undesirable way. In order to decrease this risk, there are some questions you need to address in the beginning stages. Here are six important ones that could have a significant impact on project scope.



Quote for the day:

"Continuous improvement is better than delayed perfection." -- Mark Twain