March 10, 2016

Designing a modern enterprise architecture

The reason enterprise architectures must change is the confluence of high-speed connectivity and decades of exponential Moore's Law improvements in computing power. This has enabled cheap smartphones to saturate the market and utility-scale IT service providers to create cloud services. Together, these technologies have catalyzed dramatic changes in business. Whether you call it the New Economics of Connections (Gartner) or the Unbounded Enterprise (AT&T Bell Labs), it means businesses, and consequently IT systems and applications, will increasingly interact not just with people, but devices, virtual objects, and other software in the form of automated business processes and intelligent devices.


Biggest-Ever Blockchain Trial is Only the Beginning

Grant described the trial in similarly ambitious terms, indicating that R3 sent specifications for the test to four technology providers – Chain, IBM, Intel and Eris (which delivered versions of the concept on both its own platform and Ethereum) – including design specs for three specific trading scenarios. "We had [banks] issuing, trading and redeeming commercial paper, and we had every one of those banks do that in the platform," Grant said. He explained that all banks were encouraged to transact with at least one other bank over the course of the trial, suggesting that "at least 60 trades" were completed in the simulations. No real funds were exchanged as part of the test. Grant added that two of R3’s partners declined to participate due to what he called a "significant resource requirement".


Is DevOps good or bad for security?

Miller views that as one of the benefits of DevOps. “Because CD emphasizes having a code review process, small check-ins and rapid mitigation come with it. If you can deploy four or five times a day, you can mitigate something within hours.” The same applies to spotting breaches, says Sam Guckenheimer from Microsoft’s developer tools team. “With DevOps, you're worried about things like mean time to detect, mean time to remediate, how quickly can I find indicators of compromise. If something anomalous happens on a configuration, you have telemetry that helps you detect, and you keep improving your telemetry – so you get better detection, you get better at spotting indicators of compromise and you get better at remediation.” Continuous deployment makes life harder for attackers in two ways, Guckenheimer explains.


Context is king: Aruba founder talks about future of wireless

Speaking about upcoming wireless standards, Melkote said that 802.11ad would rise to prominence within the next two years. The 60GHz technology doesn’t propagate over great distances or through thick barriers, but offers the possibility of very high throughput. “Initially, it was envisioned as a high-speed replacement for cable,” he said. “If you’re trying for coverage, it’s not the right technology, but if you’re trying to provide capacity, it can be a good technology.” But he cautioned that it is still very early in the game where 802.11ad is concerned, and that chipsets aren’t yet available. “The big thing that I look for here is the economics – can you get to a price point that is palatable for the end user?” Melkote said.


The Data Science Puzzle, Explained

While one may not agree entirely (or even minimally) with my opinion on much of this terminology, there may still be something one can get out of this. Several concepts central to data science will be examined. Or, at least, central in my opinion. I will do my best to put forth how they relate to one another and how they fit together as individual pieces of a larger puzzle. As an example of somewhat divergent opinions, and prior to considering any of the concepts individually, KDnuggets' Gregory Piatetsky-Shapiro has put together the following Venn diagram which outlines the relationship between the very same data science terminology we will be considering herein. The reader is encouraged to compare this Venn diagram with Drew Conway's now famous data science Venn diagram, as well as my own discussion below and modified process/relationship diagram near the bottom of the post.


The Benefits of Hiring Freelance Big Data Experts

One of the major benefits gained from going the freelance route is flexibility. Instead of hiring a full-time data scientist to oversee all big data projects within an organization, the company instead hires on a per-project basis. This is especially important for smaller businesses, since the time between big data projects at that level can often be lengthy. Passing over the full-time option means a business wouldn’t have to worry about paying a big data expert when they have nothing for them to do. Hiring based on the project means a smarter use of limited resources. This added flexibility also leads to choosing data experts based on their individual talents. For example, if a big data project requires hiring a data scientist with expertise in sales, the small business can do so. Their fees aren’t based on a salary but rather on the milestones reached in the project.


Digital Hijackers – the rising threat of ransomware

Ransomware is a cyber version of kidnapping, with the same motive: money. It works like a virus that secretly encrypts files. Victims don’t get the key until they pay the ransom. It’s as if instead of a thief stealing your car, they took the car keys and put them in a safe left in your garage. You don’t get the combination to the safe, or the use of your car, unless you pay up. ... As the attacks have become more advanced, and correspondingly more expensive to develop, they have also become more costly for victims, with an average ransom of about $300 per infected host. What is an extortionate annoyance to someone trying to get their family photo library back can be a significant business expense, both in the ransom itself and in the indirect costs of operational disruption and cleanup, when faced with a data center full of affected systems.


Defining 'reachability' on the global Internet

Each geographic market has Internet Service Providers (ISPs) that connect customers to the Internet, and those local ISPs connect to larger ISPs that ultimately connect to geographies all over the world. Your website sits in data centers or in the cloud with its own Internet connectivity. This combined connection path between your website and these ISPs is how you get to different markets. These days, every business is Internet based, which means your customer can come from any market. Even a North American-focused company is still concerned about dozens of important markets. Global companies can be connecting to customers in up to 800 markets. Knowing how well your web assets can reach a market allows you to plan business expansion, plan cloud, CDN and hosting investments, and tune your application and performance metrics by market.


VMware Virtual SAN: The Technology And Its Future

The economics of storage are skewed in favor of all-flash for an increasing number of use cases. For me, our experience with the Virtual SAN cluster deployed as part of the Hands-On Lab (HOL) infrastructure at VMworld 2014 was an eye opener. The storage workload generated by hundreds of concurrent, constantly churning Labs is not very cache friendly (no surprise here). As such, the VMware IT team used a large number of spindles for the capacity tier of Virtual SAN to deal with the workload “escaping” the cache. In other words, the spindles were needed for performance, not capacity. We realized that an all-flash configuration would require fewer capacity devices and would cost less! And that was already the case back in 2014. The main challenge with the high-capacity, low-cost SSDs is their low endurance (typically below 1 device-write per day guaranteed for 5 years).
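A back-of-the-envelope calculation illustrates why performance-bound sizing favors flash; the prices and IOPS figures below are assumed for illustration only and are not VMware's numbers:

```python
import math

# Illustrative, assumed figures only.
hdd = {"iops": 150, "unit_cost": 80}       # a 10K RPM spindle
ssd = {"iops": 50_000, "unit_cost": 400}   # a high-capacity SATA SSD

required_iops = 30_000  # cache-miss workload hitting the capacity tier

# When sizing for performance, device count is driven by IOPS, not capacity.
hdd_count = math.ceil(required_iops / hdd["iops"])   # 200 spindles
ssd_count = math.ceil(required_iops / ssd["iops"])   # 1 device

hdd_cost = hdd_count * hdd["unit_cost"]
ssd_cost = ssd_count * ssd["unit_cost"]
```

Under these assumptions, flash wins on cost despite a higher per-device price, because the spindle count is dictated by IOPS rather than by the capacity actually needed.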


What is IT Service Brokering? Find out in this recent paper

In a very simple and easy-to-understand way, Moore explains the differences between cloud service brokering and service brokering, and why brokerage in IT is needed. He analyzes what makes up a service broker and what parts are IT’s responsibility, such as APIs, micro-services and application services. Moore discusses where to start to become a service broker as well as some initial challenges that IT needs to overcome. Service brokering is a new operating model for IT, and reaching it takes multiple steps, some substantial and time-consuming. Moore talks about navigating this transition throughout the automation, orchestration and transformation phases. Digital disruption is real, and for IT, among many other aspects, it brings a new type of integration delivery.



Quote for the day:


"There is only one thing that makes a dream impossible to achieve: the fear of failure." -- Paulo Coelho


March 09, 2016

How to avoid collaboration overload

How can your business take advantage of collaboration without pushing your most valuable employees straight into a burn-and-churn cycle? The answer is data, says Duggan. Being able to track projects, collaborative efforts and interpersonal dependencies is key to making sure no one is taking on too much, and that workloads are distributed evenly so that bottlenecks don't occur, he says. Duggan says that the number one barrier to operational efficiency is the lack of accurate tracking of interdepartmental dependencies. In the past, CIOs and managers would direct their teams to focus solely on their own projects, and the result was a very siloed organization; over the past decade collaboration has become the norm, and so the emphasis must change to understanding the rewards versus the risks of that new mindset, Duggan says.


5 Chrome extensions that reduce distraction while you work

Time online is more likely to kill productivity than enhance it. Think of all the work hours you’ve wasted scrolling through your Facebook feed or going down the Wikipedia rabbit hole. But with the right Chrome extensions, you can minimize these distractions and actually increase your productivity. ... If you don’t like the reports you’re getting from TimeStats, you need this extension. Rather than blocking websites outright, StayFocusd lets you allot the amount of time you can spend on your favorite distractions. But once you reach that limit, the site is blocked for the rest of the day. StayFocusd is very configurable. In addition to blocking entire sites, it lets you control access to specific subdomains, paths, pages, and content.
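The time-budget mechanism StayFocusd uses can be sketched in a few lines; this is a simplified model of the idea, not the extension's actual code:

```python
class TimeBudget:
    """Track per-site usage against a daily time allowance (in seconds)."""

    def __init__(self, limits):
        self.limits = dict(limits)  # site -> allowed seconds per day
        self.used = {}              # site -> seconds consumed so far today

    def record(self, site, seconds):
        # Accumulate time spent on a site.
        self.used[site] = self.used.get(site, 0) + seconds

    def is_blocked(self, site):
        # Sites without a configured limit are never blocked.
        limit = self.limits.get(site)
        return limit is not None and self.used.get(site, 0) >= limit

budget = TimeBudget({"facebook.com": 600})  # ten minutes a day
budget.record("facebook.com", 450)          # still under the limit
budget.record("facebook.com", 200)          # pushes usage past 600 seconds
```

Once the recorded time crosses the allowance, `is_blocked` stays true for the rest of the day; a real extension would also reset the counters at midnight.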


Cyber security tools tend to pile up. Here’s how to rationalize them

Companies often begin the security rationalization process after accumulating a portfolio of tools over the years (e.g. penetration testers, web application scanners and code scanners), or through mergers and acquisitions or shifting business strategies. If your organization has typically purchased every tool, rationalization is a great way to spot redundancies. For those who have postponed major purchases, the rationalization process will highlight gaps where too little attention has been paid and there may be vulnerabilities. Put simply, the best rationalization projects enhance new and more customer-centric ways of delivering services by seamlessly integrating IT into business processes - even as demand grows exponentially.


These technologies will blow the lid off data storage

"Very soon flash will be cheaper than rotating media," said Siva Sivaram, executive vice president of memory at SanDisk. Meanwhile, Seagate has demonstrated its heat-assisted magnetic recording (HAMR) for HDDs, which will enable data densities of more than 10 trillion bits (10Tbits) per square inch. That's 10 times higher than the areal density in today's highest density HDDs. Seagate expects to work with equipment makers in 2017 to demonstrate HAMR products for data center applications, and in 2018 the company expects to begin shipping HAMR drives to broader markets. These recent technology advances are just the latest chapter in the long story of ever-growing storage needs forcing innovations to meet the new demand.


15 Data Security Policies Ignored by Modern Workers

IT isn’t the only department stretched thin. In the past 20 years the economy has grown nearly 60 percent. Corporate profits have increased 20 percent. And wages have stagnated for most Americans. The workday goes from 9 to 7 and the U.S. is among a small club of nations that doesn’t require time off. See the trend? Despite data security policies, everybody is working fast and hard in a dangerous, connected world. At this breakneck speed, IT policies—designed to educate employees and manage risk—are white noise for the modern worker. Clearly, both parties in this relationship have to change—and clearly, that change won’t be easy. In the meantime, IT can buoy data policies with smart technology that does what employees won’t—like continuously back up laptops to ensure business continuity in the face of anything.


Is this the future of the Internet of Things?

Ambient intelligence could transform cities through dynamic routing and signage for both drivers and pedestrians. It could manage mass transit for optimal efficiency based on real-time conditions. It could monitor environmental conditions and mitigate potential hotspots proactively, predict the need for government services and make sure those services are delivered efficiently, spot opportunities to streamline the supply chain and put them into effect automatically. Nanotechnology in your clothing could send environmental data to your smart phone, or charge it from electricity generated as you walk. But why carry a phone when any glass surface, from your bathroom mirror to your kitchen window, could become an interactive interface for checking your calendar, answering email, watching videos, and anything else we do today on our phones and tablets?


The organisation that runs the internet address book is about to declare independence

Barring any last-minute hiccups, though, something remarkable will happen at the meeting. After two years of negotiations, ICANN is set to agree on a reform that would turn it into a new kind of international organisation. If this goes ahead, a crucial global resource, the internet’s address system, will soon be managed by a body that is largely independent of national governments. And some of ICANN’s champions reckon this is just a start. In future, similar outfits could be tasked with handling other internet issues that perplex governments, such as cyber-security and invasions of privacy. The beauty of the internet is its openness. As long as people stick to its technical standards, anybody can add a new branch or service.


Applying the Scientific Method in Data Center Management

Recently, scientists at the State University of New York at Binghamton created a calibrated model of a 41-rack data center to test how accurately one type of software (6SigmaDC) could predict temperatures in that facility and to create a test bed for future experiments. The scientists can configure the data center easily, without fear of disrupting mission critical operations, because the setup is solely for testing. They can also run different workloads to see how those might affect energy use or reliability in the facility. Most enterprise data centers don’t have such flexibility, but they can cordon off sections of their facility as a test bed, as long as they have sufficient scale. For most enterprises, such direct experimentation is impractical. What almost all of them can do is create a calibrated model of their facility and run the experiments in software.


IoT in education: Gonzaga taps ITSM to manage device growth

"The reality of IoT is creeping into organizations ... but it is showing up to college campuses in force," Coppins said. Other EasyVista higher education customers include Fordham University, Samford University, the University of Barcelona and Villanova University. Schools are indeed taking notice of IoT in education, judging from the IoT-focused conference tracks at recent higher education gatherings. In November 2015, the semiannual meeting of the Western Interstate Commission for Higher Education, for example, featured an IoT session. The session's introduction asked, "What do we do when our students arrive on our campuses in Internet-enabled vehicles, wearing Internet-enabled clothing, carrying eight to 10 Internet-enabled devices and with clear expectations that our systems can support them?"


How will blockchain technology transform financial services?

Evangelists say the possibilities are limitless. Applications range from storing client identities to handling cross-border payments, clearing and settling bond or equity trades to smart contracts that are self-executing, such as a credit derivative that pays out automatically if a company goes bust or a bond that regularly pays interest to the holder. Some go as far as to suggest that the technology even offers the potential to disrupt companies that have forged reputations as “disrupters”, such as Uber and Airbnb. At its core, blockchain is a network of computers, all of which must approve that a transaction has taken place before it is recorded, in a “chain” of computer code. As with bitcoin — the first application of the technology, applied to money — cryptography is used to keep transactions secure and costs are shared among those in the network.
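The "chain" the article describes can be illustrated with a minimal hash-chain sketch; this is a toy model of the data structure, not a production blockchain (no consensus, signatures or networking):

```python
import hashlib
import json

def block_hash(block):
    # Hash a canonical JSON serialization of the block.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    # Each new block records the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify_chain(chain):
    # The chain is valid only if every link matches its predecessor's hash.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_block(chain, [{"from": "A", "to": "B", "amount": 10}])
append_block(chain, [{"from": "B", "to": "C", "amount": 4}])
```

Because each block embeds its predecessor's hash, altering any recorded transaction invalidates every later link, which is what makes the recorded history tamper-evident.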



Quote for the day:


"Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom." -- Clifford Stoll


March 08, 2016

Use a BPM strategy to modernize legacy applications

As is nearly always the case, enterprise architecture may provide an easy path if an "EA model" is available. It would be fair to say that for a major enterprise to modernize legacy applications on a large scale, it should never proceed without first developing an EA model according to one of the established standards such as TOGAF. Where the scope of application modernization projects is more limited, it's possible to recover business process definitions from current applications. Where you have no EA framework for direct BPM mapping, take application workflows and "abstract" them by grouping application features into the business processes they support.


The Other Side of Agile: Ceremonial Development

As you can see, ironically, the Agile Manifesto is very simple. Good Agile practices are much more in the spirit of Kaizen and continuous improvement, as opposed to the sterile doctor's prescription of do’s and don’ts that most people associate with Agile. And when I come to think of it, the most successful teams that I’ve worked with have excelled exactly at this — responding and adapting to change. These teams were great at what they did because they had mechanisms in place for the team to continuously improve its own delivery. Truth be told, they weren’t always great because of their code reviews. Or pair-programming. Or stand-up meetings. Or user stories. These things were sometimes very important in the delivery, but once something becomes routine, it can be hard to take a step back and evaluate whether it is still delivering on its value proposition.


Seagate Reveals World's Fastest SSD

Seagate's new SSD is based on the non-volatile memory express (NVMe) interface, which was developed by a cooperative of more than 80 companies and released in March 2011. The NVMe specification defined an optimized register interface, command set and feature set for SSDs using the PCIe interface -- a high-speed serial computer expansion bus standard used in both enterprise and client systems. Intel's SSD 750 series drive also uses the NVMe/PCIe interface. Seagate's SSD sports read speeds of up to 2,500MB per second, or 2.5GB per second. "The unit could be used in an all-flash array or as an accelerated flash tier with hard-disk drives (HDDs) for a more cost-effective hybrid storage alternative," Seagate stated in a news release about the new SSD.


Interview: Laura Galante, FireEye

“How are we not able to solve this problem? Because we don’t have visibility into it? The suspicion is the data is probably sitting there in the private sector because everyone is feeling this too. The perfect marriage was Mandiant sitting there with all of this investigation data and thinking, what if there is something huge here and IP is going out the door? We didn’t know how to think about it, and Mandiant needed intelligence so they hired a few of us out of government to figure out what the data was, how to model and analyse it and that is just what we did.” Galante worked on the APT1 report that was released in February 2013, and this allowed her to see data on the host side, not just on the network, and to understand what malware is sitting there sending out these alerts.


Breaking the Glass Ceiling in Indian IT Firms

It is not uncommon for women to face unconscious bias at work, which may impact them negatively and make them feel out of place in a largely male-dominated industry like technology. For instance, unconscious bias can happen when male team members put in long working hours for a project while the female workers may leave the office at fixed times. This can be misconstrued as the male workers contributing more to the project, whereas in reality both male and female employees could be contributing the same, or the latter even more. Organizations are now actively working to mitigate gender bias and bring in more transparency so that women feel more included.


Bimodal IT strategy opens up opportunities for innovation

Today's application lifecycle is measured in weeks, not years, meaning neither customers nor employees have the patience for a lengthy software development process. Organizations that are too slow to capitalize on an emerging digital business opportunity lose out to competitors that move quickly. But such a quick process requires using Agile development practices, fostering close cooperation between developers and IT operations, heavily instrumenting applications to measure performance, feature usage and errors, and employing continuous delivery processes that facilitate a steady stream of bug fixes and feature enhancements.


Scrum is Just a Starting Point | The Clever PM

There is certainly value to be had in looking to prescriptive definitions like those found in the Scrum Guide — they provide us all with a common understanding of the component parts of what that particular publisher or consultancy has defined as “Scrum”. It enables us to have intelligent conversations using such jargon words as Product Manager, Scrum Master, Stand Up, Retrospective, and other terms that have only contextual meaning within the world of Scrum. It also provides those who need guidance and assistance in establishing the foundation for Agile practices with some clearly-defined, specifically-actionable, and proven steps to take and ceremonies to implement to achieve their goals.


DHL Asia-Pacific Innovation Centre incubates future logistics technology

“The innovation agenda is not a new one for DHL,” said Mei Pang, vice-president, innovation, solution delivery and service management at DHL customer solutions and innovation in Asia-Pacific. “From an operational point of view, DHL has always known to come out with new things. In 2007, our corporate office in Germany made a decision to invest in a central team to focus on innovation to look at the future of logistics and identify major trends,” Pang told Computer Weekly. “Part of the initiative was to open a conversation with partners, and the approach we take is a very collaborative one where we work with suppliers, customers and academics to focus on the use cases and try to make them practically applicable in our business,” she added. “That concept worked very well in Germany.”


Intel's Pentium Bug Fix Is Proposed as Solution for Dark Pools

The pitch comes as banks have been beset by fines. UBS was fined $14.4 million by the SEC for problems at its private stock-trading platform. Barclays Plc and Credit Suisse Group AG racked up more than $154 million to settle allegations that they misled investors about how their dark pools were managed. Investment Technology Group Inc. agreed to pay $20.3 million for its infractions. Aesthetic Integration was founded by Denis Ignatovich, formerly head of the central risk trading desk at Deutsche Bank AG in London, and Grant Passmore, a mathematician and expert on formal verification. Passmore said formal verification uses algorithms to analyze other algorithms. Rather than endlessly trying to test possible outcomes, machine reasoning acts like an automated mathematician, creating proofs and theorems to speed up the work.
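To make the idea concrete, here is a toy illustration of verifying a property of a matching rule. It uses bounded exhaustive checking over a small price domain rather than the symbolic proof techniques real verification tools employ, and the matching rule itself is hypothetical:

```python
from itertools import product

def match_price(buy_limit, sell_limit):
    # Hypothetical matching rule: trade at the midpoint when the limits cross.
    if buy_limit >= sell_limit:
        return (buy_limit + sell_limit) / 2
    return None

def property_holds(buy, sell):
    # Property: any execution price must lie between the two limits.
    px = match_price(buy, sell)
    return px is None or (sell <= px <= buy)

# Exhaustively check the property over a bounded domain of prices.
violations = [
    (b, s) for b, s in product(range(1, 101), repeat=2)
    if not property_holds(b, s)
]
```

An empty `violations` list proves the property only for the checked domain; the appeal of formal verification is producing a mathematical proof that covers all inputs, not just a bounded sample.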


Testers in TDD teams

The big QA of the nineties seems to be history. Many IT organizations have dissolved their QA departments and have spread their testers over Agile teams. However, in many of those teams, the testers are still doing the same manual testing they did in the nineties. Many organizations are therefore still stuck with the same dysfunctional testing they had twenty years ago. The dysfunctionality of old-school QA lies in its excessive use of functional testers. These are professionals specialized in manual testing, but having few technical skills. Their specialization makes functional testers good at 'testing' functionality. However, old-school QA has a tendency (and often a commercial interest) to also use these testers to 'check' functionality.



Quote for the day:


"Goals allow you to control the direction of change in your favor." -- Brian Tracy


March 07, 2016

Making Data Easy for Businesses with Cloud Data Services

By tapping into these new cloud data services, they can explore data sources, meld data of different types, select the most appropriate analytics tools, and produce actionable insights. And they can do it without necessarily having to engage the IT department and, in some cases, wait weeks for answers. It’s a drag-and-drop experience. Or, if they choose, they can enlist the IT department to design more sophisticated analyses. At the same time, these new services offer a host of more sophisticated capabilities designed for data scientists and developers—enabling data scientists to analyze complex situations using the most capable analytics tools and providing software programmers and product teams with a dynamic development platform.


Tech Giants Agree: The FBI’s Case Against Apple Is a Joke

While this seems like a natural cause for the technology industry to rally behind, many tech leaders were initially slow to express support for Apple in the matter. As the New York Times reports, several companies also hesitated to support Apple publicly. Some expressed concern over whether this was the right fight to pick, while others worried about public perception. Those concerns appear to have been allayed, at least on the part of the companies who filed Thursday. Their briefs in support of Apple are unequivocal, and use language as forceful as the company’s own.


The Internet of Things Will Make Big Data Look Small

This looming problem is something we’re sure to discuss at Structure Data, scheduled for March 9th and 10th in San Francisco. We’re featuring speakers such as William Ruh of GE, who will talk about the impact the industrial Internet will have on the manufacturing sector; Jerome Dubreuil of Samsung, who will illustrate just how much data connected home devices generate; and a panel of healthcare experts who will sort through the dual challenges of the retiring baby boom generation and an explosion in quantified-self health apps. This may sound like a buzzword salad to many of you. But those in charge of the massive players in this market are making moves to get themselves ready for the data deluge from a realized Internet of things.


Government consults on data sharing

“There is huge potential for improving citizens’ lives through data sharing in the UK,” he said. “This consultation will help make sure we get data right and bolster security while making people’s lives better.” The proposals focus on three aims: improving public services, tackling fraud and debt, and allowing the use of data for research purposes. The consultation also looks at access to identified data that is linked and de-identified using defined processes. It said linked datasets can help “gain new insights into the social and economic challenges that citizens and businesses face”.


Can Trust-Based Private Blockchains Be Trusted?

When collusion occurs amongst blockchain parties, they can rewrite their local records regardless of other parties' interests and protestations. Other parties may not even detect that colluders altered the historical record. Even worse, since there is no way to prove which party has the correct record (i.e. the objective state of the ledger), the system breaks with multiple objective states and multiple attendant claims to historical record authenticity, none of which are provable. Using dates to prove the correct objective state of the distributed ledger is both useless and immaterial – data can be backdated; after all, it's just ones and zeros that can be rewritten. So, what happens if the parties choose not to follow the rules and fork the historical record of the blockchain? What mechanisms exist for aggrieved parties to respond to collusion, if detected?
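Divergence between parties' copies can at least be detected by comparing ledger digests, though, as the article notes, a digest cannot say which copy is authoritative. A toy sketch with hypothetical party names:

```python
import hashlib
import json

def ledger_digest(ledger):
    # Digest of a canonical JSON serialization of a party's full ledger.
    return hashlib.sha256(json.dumps(ledger, sort_keys=True).encode()).hexdigest()

def detect_fork(party_ledgers):
    # Compare every party's digest; more than one distinct digest means a fork.
    digests = {name: ledger_digest(l) for name, l in party_ledgers.items()}
    forked = len(set(digests.values())) > 1
    return digests, forked

ledgers = {
    "bank_a": [{"tx": 1, "amount": 100}],
    "bank_b": [{"tx": 1, "amount": 100}],
    "bank_c": [{"tx": 1, "amount": 999}],  # a colluder rewrote its copy
}
digests, forked = detect_fork(ledgers)
```

The comparison reveals that bank_c's history diverges, but nothing in the data itself proves whether the majority or the outlier holds the "true" record – which is exactly the governance gap the article raises.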


Security ops orchestration for a brave new world

There is a massive shortfall in the number of trained security experts to staff a typical Security Operations Center (SOC) monitoring the health and safety of a corporation’s digital footprint. It takes almost a decade for security researchers to acquire the skills to defend against modern-day attacks. Frost & Sullivan has forecast a shortfall of 1.5 million trained security experts by 2020. SOC teams, overwhelmed in handling the deluge of low-impact incidents, fail to respond in time or miss altogether early incident alerts flagging serious attacks. There appears to be a solution to deal with this massive human shortfall and empower SOC teams. Serious efforts are afoot to record processes as code — or simply put, to use software to automate repetitive but time-consuming tasks while increasing the productivity of individual security experts.
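The "process as code" idea can be sketched as a playbook of condition/action steps applied to an incoming alert; this is a hypothetical miniature, not any vendor's product:

```python
def run_playbook(alert, steps):
    """Apply every step whose condition matches the alert; collect actions."""
    actions = []
    for condition, action in steps:
        if condition(alert):
            actions.append(action(alert))
    return actions

# A triage playbook expressed as ordered (condition, action) pairs.
steps = [
    (lambda a: a["severity"] == "low" and a["known_false_positive"],
     lambda a: f"auto-close {a['id']}"),
    (lambda a: a["severity"] == "high",
     lambda a: f"isolate host {a['host']} and page on-call"),
]

alert = {"id": "INC-42", "severity": "high", "host": "web-03",
         "known_false_positive": False}
result = run_playbook(alert, steps)
```

Encoding the triage steps this way makes the routine decisions repeatable and auditable, freeing analysts to spend time on the alerts that actually need human judgment.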


Decentralized Apps: Key Questions from a Bank Innovation Director

Once you start charging fees for use of your dapps, you have to be clear what you are charging for. Are you charging for the license to deploy their own instance of a smart contract and use of the dapp wallet – a bit like buying an app from an App Store? Or for a service provided by an already deployed smart contract? Arguably, since it is really the miners that provide the service of actually executing and validating the transactions, it’s hard to justify charging a service fee for smart contracts, unless there are many value-add off-chain services bundled together with the dapp. Based on that assessment, we may end up with a 'Dapp Store' model, where folks purchase a license to deploy an instance of a well-written, standards-compliant, tested and proven dapp onto a blockchain.


Regulation holds up P2P lenders - British online banking security

Journalists from the BBC have successfully broken into a team member's bank account using what is known as "SIM swap fraud". The scam works by fraudsters informing the victim's mobile network provider that they would like to swap SIM cards — this means the victim's number is transferred to another SIM. The fraudster with the new SIM now has the number registered to the victim's bank accounts, and can therefore receive any activation code sent by the bank via SMS. The genuine phone is blocked, and the criminal uses the codes to get into the customer's account without needing to know their PIN, passwords or banking customer number.


Mine that data to keep that customer

The ability of financial organisations to make the most of data, monitoring and tweaking performance as they go, will have a major impact on all areas of business, from the supply chain to marketing. However, the big retail banking institutions sit comfortably behind the fintech start-up challengers whose business models are founded in the cloud, and whose customers are willing to place their trust in this new approach. Real-time data has a big role to play in engaging with consumers, as it enables organisations to understand their customers’ behaviours and attitudes towards their services, and by extension positively influence customer loyalty. High street banks are starting to recognise the need to get better at segmenting consumers into more narrowly defined groups, and real-time data and the contextual relevance of engagement have vital roles to play here.


BYOD continues to add challenges for IT leaders

Karsten Scherer, global analyst at TEKsystems, has seen a trend in recent data surrounding BYOD, but notes that allowing employees to use personal devices presents unique risks to the enterprise. She suggests that businesses have a strong BYOD plan in place, encourage company-wide security awareness, and acknowledge that a significant portion of breaches is caused inadvertently by employee negligence rather than by criminal hackers. "Every company has a complex ecosystem of systems creating, storing, accessing and analyzing data," she says. "When you extend that ecosystem to include devices outside of your immediate control, that level of complexity increases. You've effectively increased the size of your security perimeter."



Quote for the day:


"Leadership is unlocking people's potential to become better." -- Bill Bradley


March 06, 2016

Why President Obama’s cyber security plan is one (big) piece of the puzzle

Cyber security is a complicated latticework of disparate yet interconnected elements: public and private entities, domestic and foreign agencies and overlapping legal frameworks. Take the Judicial Redress Act, which President Obama signed into law on February 24. In addition to providing limited access to US courts for citizens of certain countries – court access would be conditioned on covered countries permitting the transfer of personal data – the Judicial Redress Act has other international implications, specifically in the context of US-EU negotiations. The finalisation of the Judicial Redress Act was considered by the European Union as a prerequisite to an umbrella agreement, initialed by US and EU officials last September.


Can you take the Internet out of the Internet of Things?

Does it really make sense for every device to have a Wi-Fi chip in either itself or in a gateway, or should all devices route through some always-connected gateway? Based on the growing number of “standards,” and varying power, range and data rate requirements, it’s evident there is likely not going to be any sort of IoT topological convergence. This is because, in some cases, a device simply needs to report its proximity to a phone (think beacons), or because a device operating in a challenging RF environment struggles with the higher-frequency radios used by Thread or Zigbee, making them a poor technical selection. In many cases, a gateway and a variety of sensors make total sense.


Exploring Banking as a Platform (BaaP) Model

Network effects impact us all on a daily basis, via social networks and other marketplaces. These same social networks and marketplaces, after having gotten us used to interacting with one another in a different way, are now encroaching on financial services, with payments and lending initially being their target. Smartphones, broadband internet, the 24/7 availability of commerce and data, and social networks have made us organize ourselves very differently than in the past. The Millennial generation, weaned on this new paradigm, now have completely different expectations of communication and commerce than their parents or grandparents. There are other reasons why financial services industry incumbents need to shift to a platform strategy.


Meet tech's new odd couple: the CIO and CMO

While both sides recognize the need for alignment and a joint strategic plan, there remains a disconnect in how each party views its contribution, according to a November 2014 Forrester report on CMO-CIO collaboration. For example, the research, spearheaded by Pattek, found that while about 70 percent of the executives in both groups believe their strategic planning process emphasizes enhancing customer acquisition, retention and loyalty, only 61 percent of marketers think the CIO is actively engaged in that process. In contrast, 76 percent of the IT leaders said the CIO plays an active role. In addition, 70 percent of marketers and 66 percent of tech management executives said they agree that marketing technology plans will gain more support and funding if they're developed jointly by the CMO and CIO.


No, your Raspberry Pi 3 won't overheat in everyday use, says its creator

While a typical workload for the Pi might see the demand on the CPU spike momentarily, in the vast majority of use cases these periods of high CPU utilisation will not be sustained for long periods, he said. "In most use cases you see a very spiky performance profile. So what you're looking at is 'Can I run very fast for a second?' or 'Can I run very fast in bunches of 50ms?'." And while putting a case on the board will increase the temperature, again for the typical user it will not drive the board to become hot enough to throttle its speed, he said. Upton explains the throttling behavior as being a consequence of making the Pi's hardware more powerful.


The Amazing Ways Big Data Is Used In China

The Chinese financial industry is quickly adapting to a Big Data-driven model, too. In 2013, a number of legislative changes regarding the use of customer data quickly led to an explosion in the use of Big Data analytics by banks, investment funds and insurance companies. In 2012 it was estimated that the entirety of the heavily regulated Chinese banking industry held around 100 terabytes (100,000 gigabytes) of customer data. By March 2014, just one of China’s “Big Four” banks, the state-owned Industrial and Commercial Bank of China, was said to have amassed 4.9 petabytes (4,900 terabytes, or 4,900,000 gigabytes) of mostly unstructured data. Just as in the West, this data is mostly used for marketing of retail banking products.


Artificial intelligence brings its brains and money to London

Both DeepMind and its successors involve “deep reinforcement learning” – giving computers the tools to draw conclusions based on large amounts of data, in the way that humans make assumptions based on experience. The potential applications are vast, from helping doctors diagnose patients to spotting faults in infrastructure such as transport networks – and other uses that even its inventors are yet to conceive of. But measuring progress in AI is not easy. The layperson usually cites the Turing test, developed by Bletchley Park codebreaker Alan Turing in 1950. It focuses on whether a computer can convince a human in a blind test that they are talking to another human. But that test, says Shanahan, is more about “tricking” people through mimicry than developing AI genuinely capable of learning.


Getting the greatest value from your cyber security budget at the end of the financial year

As the financial year creeps inexorably towards its close, you’re probably thinking about the best way of wringing every last drop of value from your budget. If you’re concerned about information security and how it affects your business, why not make the most of your available resources by implementing a best-practice information security management system (ISMS), based on the international standard ISO 27001? ... The good news is that it’s very likely you already have many of ISO 27001’s controls in place, so bringing your current practices into line with the Standard could well be within your grasp. The best way to determine how much work you need to carry out is to conduct a gap analysis.


What Keyboards Do Programmers Prefer?

As developers, we all have preferences in the tools we use for work: a powerful machine, one (or two) large screens, the freedom to choose our OS, our IDE, etc. Yet in most companies, we rarely pay the same level of attention to keyboards. The one that comes with your computer (PC or Mac, desktop or laptop) is often the default choice, and we almost never challenge its quality and usability, even though a keyboard is one of the most basic tools of our job, allowing us to perform most of our everyday tasks. So why neglect the quality (and the look!) of a tool that we use eight hours a day? This article is an overview of the different choices made by the developer team behind the insurance comparison site LesFurets.com. And you'll see how each of them has an approach of their own.


Strategy, Not Technology, Drives Digital Transformation

The ability to digitally reimagine the business is determined in large part by a clear digital strategy supported by leaders who foster a culture able to change and invent the new. While these insights are consistent with prior technology evolutions, what is unique to digital transformation is that risk taking is becoming a cultural norm as more digitally advanced companies seek new levels of competitive advantage. Equally important, employees across all age groups want to work for businesses that are deeply committed to digital progress. Company leaders need to bear this in mind in order to attract and retain the best talent.



Quote for the day:


"Every time you have to speak, you are auditioning for leadership." -- James Humes


March 05, 2016

IoT will crash and burn if security doesn't come first

It's important to understand the damage lax security can do -- to your company and the industry -- and address IoT security early. Hibbard said he has seen firsthand how a lot of players in the space do not consider security as a competitive advantage. "If you're thinking about buying or making IoT, offshoring it to an APAC region, make no assumptions that they're going to know anything about security. You won't be able to retrofit it, so if you want it, order upfront," he said. ... "Show your work," he added. "You need ... to make sure you're properly documenting processes that you went through; you want to make sure you get credit later. You don't want to say to the FTC that you don't have the records."


Global fintech survey results: 51 experts reveal 2016 trends

Payments tech continues to be top of mind for the influencers – followed by security and lending. In 2014, the respondents predicted that security technology would be the hottest sector in fintech; however, the sector continues to have a large gap between what is available and what is needed in the market, with huge interest predicted to continue into 2016. ... 43% of the respondents thought Blockchain adoption by banks will be the single largest trend of 2016. Larger deal sizes, an increased geographical spread and capturing the unbanked market followed with almost equal interest as the key highlights for the coming year.


Cashless societies: The pros and cons

Thanks to its aggressive adoption of IoT, Sweden is on its way to becoming the world’s first cashless society, according to a study from Stockholm's KTH Royal Institute of Technology. Currently, 80% of payments in the country are made by cards. By the end of 2014, four out of every five transactions in Sweden were cashless. Swedes mainly use debit cards (pin required) and the mobile payment app Swish, which is largely responsible for the nation’s decreasing circulation of cash. Eric B. Delisle, founder of the cyber security company ICLOAK, says the more citizens use cashless systems, which require a computer or device, the more people who prefer living in an analog world will be pushed into the 21st century. This means new security measures will be needed.


Popular WordPress Plugin Comes with a Backdoor, Steals Site Admin Credentials

The hacker's alterations made sure that he was able to control user login, creation and edit commands, intercepting user data before it was encrypted, and sending users' cleartext passwords to wooranker's server. Furthermore, wp-options.php also created an admin account on the infected website, with the credentials support / support@wordpresscore.com, which he could use if all else failed. All of this meant that wooranker would always have an admin account on all infected websites, and he would always be notified of what passwords users were using when accessing infected sites.


Bridging the operational technology and Internet of Things divide

By its very nature, a connected world has zero tolerance for downtime, yet IoT does not only change the requirement for systems availability; it significantly expands the threat landscape, creating greater security risks and challenges. Indeed, while IT may be willing to accept the fact that a very high proportion of organisations (80%) have experienced outages over the last three years, this fact will not play well within OT, which has actively embraced predictive monitoring in order to achieve 100% uptime. Moreover, organisations are also missing out on essential business information. By failing to consolidate OT into the core network, organisations cannot enable CxOs to take advantage of a depth of real-time analytics that should be informing changes to every part of the building, estate and production systems.


The Trends Disrupting The World of Financial Technology

The battle already underway will create surprising winners and stunned losers among some of the most powerful names in the financial world: the most contentious conflicts (and partnerships) will be between startups that are completely reengineering decades-old practices and traditional power players who are furiously trying to adapt with their own innovations, amid the total disruption of established technology and processes ... The blockchain is a wild card that could completely overhaul financial services. Both major banks and startups around the world are exploring the technology behind the blockchain, which stores and records Bitcoin transactions. This technology could lower the cost of many financial activities to near-zero and could wipe away many traditional banking activities completely.

How hackers attacked Ukraine's power grid: Implications for Industrial IoT security

Some aspects of the Ukraine cyber-attack remain opaque -- specifically, whether a modular component called KillDisk (a hard disk wiper) actually caused the power outage, or whether it simply made it impossible to restore the compromised systems using SCADA protocols. As if further evidence of a political motive was required, researchers at security company Trend Micro recently reported that the same combination of BlackEnergy and KillDisk "may have been used against a large Ukrainian mining company and a large Ukrainian rail company" around the same time as the attacks on the power utilities. Whether the perpetrators' ultimate goal was to destabilise Ukraine via coordinated cyberattacks on its critical infrastructure...


Software - Looking into the Future

... dominates. Software is changing practically all industries and is the major driver of innovation across them. While we used to distinguish components, systems, and services, we today see flexible boundaries driven entirely by business cases to determine what we should package, at which level, and in which component, whether it’s software or silicon. ... Software is getting more complex, more connected, and more life-critical. This complexity’s sources are hidden in the nature of software, which often consists of many components from different vendors and runs on hardware manufactured by different vendors. Also, software teams frequently are multifunctional, and team members are responsible for many activities such as planning, developing, and executing plans, roadmaps, and strategies—without adequate training.


Scrum with Trello

Trello recently passed the 10M user mark and is fast becoming a popular tool for Agile teams of all flavours. Its simplicity and the great web and mobile experience seem to win some teams over versus other more complex solutions out there. It is also pretty un-opinionated on how you use it, which can lead to some confusion as to how best to implement a Scrum process in Trello. I've been talking to a lot of people over the last year about how they're using Trello for their Scrum and Kanban processes, as well as reading everything I could on the internet relating to running Agile processes in Trello. So, today I present to you the fruits of that labour:


An AI way to make call centre interaction less hideous?

What makes this interesting, though, is that it is very different to the usual visions of AI in customer service. These tend to focus on Virtual Assistants – by the likes of Nuance and IPSoft – which want to replace real agents with digital ones wherever possible. In this scenario, AI is used to help machines learn from human interactions, and these solutions have become part of the “robots stealing our jobs” debate. It is not as cut and dried as many make out, of course. And individuals involved in this type of tech argue that employing Virtual Assistants simply frees up human employees for more sophisticated forms of customer interaction. Yet Farmer is adamant: “We’re the first people to use AI to improve quality [in customer service].”



Quote for the day:


"Authentic leaders will sometimes push and sometimes pull but either way, they will always keep things moving." -- @LeadToday


March 04, 2016

SSD Prices Plummet Again, Close In on HDDs

The market for SSDs with PCIe interfaces, which are used by laptop makers such as Apple to attach flash directly to a motherboard, is expected to grow at the highest annual growth rate ever over the next six years. "This growth can be attributed to the advantages of PCIe, which include high speed, enhanced performance scaling, and detailed error detection and reporting," the report said. "Thus, the demand for SSDs with PCIe interface is expected to increase from the client as well as enterprise end users." Samsung, according to TrendForce, will continue to dominate the SSD market this year because of a price advantage it has with TLC-based SSDs using 3D-NAND flash, which Samsung markets as V-NAND. V-NAND stacks silicon cells up to 48-layers high to increase density, thereby reducing cost.


What Happens to Stolen, Sensitive Data?

Bitglass, a data protection company, ran the experiment and released findings in its report "Where's Your Data?" Bitglass researchers created a digital identity for an employee of a fictitious retail bank, a Web portal for the fake bank and a Google Drive account complete with real credit card data. They pretended that the fake employee's Google Drive credentials were stolen via a larger phishing campaign. They leaked those "phished" Google Apps credentials to the Dark Web and tracked activity in the fake employee's online accounts. Hackers did not know that Google Drive activities were being monitored for a month and that files were embedded with Bitglass watermarks. Here's what happened next.


CEOs force CIOs, CMOs into digital transformation bunker

"CEOs are telling CIOs and CMOs to put in place a new foundation for digital business," Cochrane says. "The CEO tells the CMO ‘I want a strategy for customer experience’ and he tells the CIO to make it happen." IT has to enable marketing with tools and extend those tools to every customer touch point. That requires CIOs to account for every customer interaction with the corporate brand across the call center, physical stores, online and mobile devices. ... Cochrane says that while the CMO has traditionally owned the customer experience with little influence from the CIO, that needs to change because of the vast amount of information streaming into businesses from social media, as well as from various Internet-connected devices. With the data surface broadening so much, CMOs need help from the CIO.


Graphene sheets for capturing and storing energy

In terms of optics, the problem is when you think of a material, as it gets thinner, it absorbs less light. So, when you go below 50 nanometers, you have a transparent layer. You might have a layer that's 50 nanometers thin, but to the outside world, it looks transparent, because it's too thin to absorb light. But you're trying to marry light with electronic circuits. And as soon as your devices get smaller and smaller, it gets invisible. So now, you have to boost the thickness of the optical layer if you want to operate in the wavelength we're comfortable with. In this program, we're making this leap. We're creating surface structures that absorb light.


Redesigning Wi-Fi may let devices communicate more easily

Most conceptions of the internet of things assume the chips in sofas, wallets, fridges and so on will use technologies such as Wi-Fi and Bluetooth to communicate with each other—either directly, over short ranges, or via a base-station connected to the outside world, over longer ones. For a conventional chip to broadcast a Wi-Fi signal requires two things. First, it must generate a narrow-band carrier wave. Then, it must impress upon this wave a digital signal that a receiver can interpret. Following Moore’s law, the components responsible for doing the impressing have become ever more efficient over the past couple of decades. Those generating the carrier wave, however, have not.
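The two steps described above (generate a narrow-band carrier, then impress a digital signal on it) can be illustrated with the simplest modulation scheme, on-off keying; every parameter here is a toy value chosen for readability, not anything a real Wi-Fi radio uses:

```python
import math

def ook_modulate(bits, carrier_hz=10, samples_per_bit=50, sample_rate=500):
    """On-off keying: switch a sine-wave carrier on for 1-bits and off
    for 0-bits, 'impressing' the digital signal onto the carrier."""
    signal = []
    for i, bit in enumerate(bits):
        for s in range(samples_per_bit):
            t = (i * samples_per_bit + s) / sample_rate
            signal.append(bit * math.sin(2 * math.pi * carrier_hz * t))
    return signal

# 0-bits produce silence; 1-bits produce a burst of carrier.
waveform = ook_modulate([0, 1, 0])
```

Generating the carrier (the `sin` term) is the power-hungry analogue part the article says has not kept pace with Moore's law; impressing the data is just the multiplication by the bit, which is what has become cheap.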


Data Backup and Business Continuity

As information across all industries and businesses becomes increasingly digitized, the importance of ensuring that this information is continuously accessible has never been greater. And as storage technology has evolved from floppy disks to CD-ROMs, DVDs, portable hard drives and offsite cloud backup, the expectation of 24x7 uptime and constant availability certainly hasn't slowed down. Then there's the matter of compliance and regulatory restrictions, which have become increasingly strict as both IT and business best practices have progressed. HIPAA, HITECH, PCI compliance, and myriad related requirements around data capture, storage, transfer and processing have forced backup vendors and technologies to shapeshift both point solutions and integrated software and services.


Bimodal IT is only harmful when oversimplified

Although the bimodal concept can be polarizing, I believe much of the blowback originates from assumptions made due to an unfortunate choice of name, reflexive distaste for analyst buzzwords and particularly the term’s originator, the analyst firm so many love to hate. A common construction takes bimodal to mean bipolar, with IT segregated into two separate, but unequal entities: Mode 1 where all the stuffy IT old-timers live out their days caring for decaying databases and molding mainframes, versus Mode 2 where all the cool kids play with the latest toys and work unshackled from IT bureaucracy and processes. If that’s your view, bimodal is a recipe for disaster: a warring, dysfunctional IT organization.


Using Blockchain Technology in Crowdfunding

Blockchain technology isn’t perfect yet; some might say it’s not even ready for prime time. Today, the primary drawback is how long it takes to authenticate transactions. A transaction today in Bitcoin takes about 10 minutes to clear, and Bitcoin is a microscopic market compared to, say, credit card transactions. Indeed, the Bitcoin community is engaged in a civil war as to how, or even whether, to change the technology to speed up transactions. But you can understand why blockchain technology is attracting so much interest from government and private industry. For example, the music industry is plagued by uncertainty over ownership of rights. The title industry exists because of uncertainty as to the ownership of real estate. Credit card issuers spend tens (hundreds?) of millions of dollars processing and authenticating transactions.


How doctors are turning smartphones into surgeries with video appointments

“We have a good insight into patient needs and demands. We understand there are limits and you can’t treat everybody and that’s why we have a filter system.” But the healthcare technology sector is not just dealing with problems around primary care. The future of this sector could see technology that monitors how patients use medication — containers designed with a mechanism which sends a signal to both doctor and patient confirming that tablets have been taken. Remote sensing devices, which a patient wears on a troublesome joint, could analyse the problem and lead to a quicker, more accurate diagnosis. And sensors that monitor blood sugar or chemical levels could automatically drive responses to balance those levels.


Are We Winning the Cyber War? A Look at the State of Cybersecurity

As you might expect, the experience of attacks on a daily, weekly or monthly basis was reported less frequently. An alarming trend is that 54 percent of study participants did not know how frequently they experience cyber-incidents. While 73 percent believed they were able to detect and respond to incidents, 42 percent felt they could only do so for simple attacks. In an era of increasingly sophisticated and persistent attacks, being able to identify and respond to attacks is imperative. Board and executive concern and support for cyber activities are increasing. Eighty-two percent of participating security executives and practitioners reported that boards are concerned or very concerned about cybersecurity. This is not surprising given the higher level of awareness about cyber in general and the number of high-profile attacks that we have recently seen.



Quote for the day:


“Adding manpower to a late software project, makes it later.” -- Frederick P. Brooks Jr.


March 02, 2016

Lifting of Iran sanctions brings hope to regional IT industry

“Once sanctions have been terminated, Iran can move forward in using ICT to transform industries across the country,” said Lalchandani. “The extent of these projects will depend largely on whether global oil prices rebound in the coming years. If they do, the increase in petrodollar revenues will help drive considerable transformation initiatives in the public sector, as well as significant modernisation efforts across the energy, manufacturing, telecommunications, finance, transportation and retail verticals.” Meera Kaul, CEO at regional value-added IT distributor Optimus Technology and Telecoms, agreed, saying this is a big opportunity for the regional IT supply channel. With the sanctions lifted, the $420bn Iranian economy could open up for regional businesses, she said.


Internet of Things generates ROI for many, but roadblocks remain

Less than a quarter of respondents to the latest Tech Pro Research survey said their company is currently using IoT-connected devices to collect data, but more respondents said their business plans to get into the IoT game within the next year. Respondents in those two groups reported a wide variety of uses for data insights, including predicting trends, improving products, capacity planning, R&D and security. Among respondents whose companies have implemented IoT data collection, 71% said that less than 20% of their IT budget goes towards those efforts, with the majority spending most of their IoT funds on software.


Firms expect greater government cybersecurity oversight

According to the SEC's Office of Compliance Inspections and Examinations, other areas of focus include governance and risk assessment, access controls, data loss prevention, training, and incident response. "We expect continued scrutiny of the areas covered in past years, with new emerging risk areas being evaluated," said Glenn Siriano, financial services leader for KPMG Cyber at KPMG. Those new areas include emerging technologies, new external threat vectors, deeper assessments of third-party vendors, usage of social media, and managing insider threats, he said. And the SEC has been moving beyond conducting inspections and issuing guidance, said Dave Mahon, CSO at CenturyLink.


Virtual insanity: Is 2016 the year users go big on VR?

“No doubt VR will help to create buzz among media, gamers and the niche audience demanding immersive experiences,” said Husson. “But will it offer consumer benefits for the masses? The short answer is: no. In 2016, reach for VR platforms will remain limited. “While the primary use cases will be for immersive gaming and entertainment environments, innovative marketers at retail, automotive, travel or hospitality companies will start piloting VR prototypes to connect in new ways with consumers in the discovery and explore phases of the consumer lifecycle. The vast majority of marketers should not even care about it and have many other things to fix.”


Are site reliability engineers the next data scientists?

It’s no secret that “data scientist” is one of the hottest job titles going. DJ Patil famously proclaimed data scientist “The Sexiest Job of the 21st Century” before moving on to join the White House as the first chief data scientist of the U.S. Once a rarefied in-house role at a few leading Internet companies such as LinkedIn and PayPal, data science has since grown into a global phenomenon, impacting organizations of all sizes across many industries. More recently, a buzzy new job title has emerged from the same group of companies: that of site reliability engineer, or SRE. Will SREs follow the same path of rapid growth that data scientists did before them? Before we dive into that question, let’s consider the context that has led to the creation of site reliability engineering.


How America's Biggest Cities Make Sense Of Their Data

"The question is, how do we use data to allow cities to tackle big and small problems?" says Saf Rabah, VP of product at Socrata. Untouched and unanalyzed government data—what Rabah calls "dark data"—usually sits on enterprise file systems and databases. The city of Seattle, for example, has 1,200 different enterprise systems, says Rabah. Socrata’s job is to make that data usable—but not just for the city. Aside from other government departments, there are three groups that could benefit from data made public: citizens, developers, and advocacy groups. "Citizens have information needs too, like, ‘I need to know how safe my neighborhood is,’ or ‘I’m about to move to a new city.’ Everyone has information needs that are very unique to them at that point in time," says Rabah.


Online Backup: Reliable and Affordable Solution for Data Protection

It keeps important data safe from disruptions and disasters, and provides a way to keep applications and data off-site in a highly secure environment. There are great advantages to using backup technology, such as automation functionality and encrypted data. Some business experts state that the cloud is not a secure place for important data; however, online backups have encryption capabilities to keep data safe. Conversely, external hard drive storage is not secure, and could be stolen or misplaced. Online backup is also reasonably priced, giving companies an affordable way to keep important files and documents safe from disruption and disaster.


Agile Productivity: Willpower and the Neuroscience Approach

You have your impulse self (reptilian brain and limbic system) and a rational self that protects you from that impulse self (see “The Science of Willpower”). Your prefrontal cortex protects you from your impulsive animal mind. But because the deeper layers of the brain are older, more energy efficient, and more powerful, the impulse self has more energy than the rational self. You cannot switch off your internal crocodile or monkey. You can only use the neocortex to override them and prioritize rational decisions. But if you are drunk, tired, sleep-deprived or distracted, your prefrontal cortex does not work properly. You start making decisions based on immediate gratification (like drinking coffee with sugar to gain energy), not thinking about what will happen next.


How Much Security Can You Turn Over to AI?

Just detecting anomalies can still leave you with a lot of data to look at. A large organization could see thousands of anomalies a day, so Splunk uses further analysis to keep that manageable. Maier expects the tool to surface five or ten threats a day, in enough detail to make it clear what’s happening (avoiding the problem where noisy or overly complex alerting systems are ignored when they find a real breach). “We have the full picture on the ‘kill chain’ [of the attack]. We provide a security organization with the information, from the compromise point – when did the attacker come in, what was the initial attack vector, when did they expand in this environment, what other files or servers or user accounts did they connect to?”
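The triage idea described above can be sketched in a few lines: collapse a large stream of per-event anomalies into a handful of ranked, correlated threats. This is an illustrative sketch only, not Splunk's actual scoring; all names (`Anomaly`, `triage`, the severity fields) are invented for the example.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Anomaly:
    entity: str      # user account, host, etc.
    kind: str        # e.g. "unusual_login", "lateral_movement"
    score: float     # 0..1 severity from the detection layer

def triage(anomalies, top_n=10):
    """Group anomalies by entity and rank entities by combined evidence."""
    by_entity = defaultdict(list)
    for a in anomalies:
        by_entity[a.entity].append(a)
    # More distinct anomaly kinds on one entity suggest a kill chain in
    # progress, so rank by variety first, then by total severity.
    ranked = sorted(
        by_entity.items(),
        key=lambda kv: (len({a.kind for a in kv[1]}),
                        sum(a.score for a in kv[1])),
        reverse=True,
    )
    return ranked[:top_n]
```

With thousands of raw anomalies in, only the `top_n` entities come out, each carrying its full list of supporting events – roughly the "five or ten threats a day, with the full picture" behavior described.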


Algorithm Design Techniques: The Assignment Problem

The assignment problem is designed for exactly this purpose. We start with m agents and n tasks, with the rule that every agent must be assigned to a task. For each agent-task pair, we determine the cost of having that agent perform that task, and then find the assignment of agents to tasks that minimizes the total cost. Of course, it may be true that m != n, but that's OK. If there are too many tasks, we can make up a "dummy" agent that is more expensive than any of the others. This ensures that the least desirable task is left to the dummy agent, and we can remove it from the solution. Or, if there are too many agents, we can make up a "dummy" task that is free for any agent. This ensures that the agent with the highest true cost gets the dummy task and sits idle.
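The padding trick above can be made concrete with a small, self-contained sketch: pad a non-square cost matrix with dummy agents (expensive for every task) or dummy tasks (free for every agent), then search for the minimum-cost assignment. Brute force over permutations is only for tiny illustrative inputs; real code would use the Hungarian algorithm (e.g. SciPy's `linear_sum_assignment`).

```python
from itertools import permutations

def min_cost_assignment(costs):
    """costs[i][j] = cost of agent i doing task j.
    Returns (total_cost, perm) where perm[i] is the task given to agent i."""
    m, n = len(costs), len(costs[0])
    big = max(max(row) for row in costs) + 1
    grid = [row[:] for row in costs]
    if m < n:               # too many tasks: add dummy agents, pricier than any real one
        for _ in range(n - m):
            grid.append([big] * n)
    elif n < m:             # too many agents: add dummy tasks that are free
        for row in grid:
            row.extend([0] * (m - n))
    size = len(grid)
    best_total, best_perm = None, None
    for perm in permutations(range(size)):
        total = sum(grid[i][perm[i]] for i in range(size))
        if best_total is None or total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm
```

Any agent index >= m in the result is a dummy agent, and any task index >= n is a dummy task; both are simply dropped from the final answer, as the text describes.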



Quote for the day:


"It's not always necessary to be strong, but to feel strong." -- Jon Krakauer,


March 01, 2016

Create maps in R in 10 (fairly) easy steps

There are many options for mapping. If you do this kind of thing often or want to create a map with lots of slick bells and whistles, it could make more sense to learn GIS software like Esri's ArcGIS or open-source QGIS. If you care only about well-used geographic areas such as cities, counties or zip codes, software like Tableau and Microsoft Power BI may have easier interfaces. ... But there are also advantages to using R -- a language designed for data analysis and visualization. It's open source, which means you don't have to worry about ever losing access to (or paying for) your tools. All your data stays local if you want it to. And it's fully scripted end-to-end, making for an easily repeatable process in a single platform, from data input and reformatting through final visualization.


Infrastructure As Code

Using code to define the server configuration means that there is greater consistency between servers. With manual provisioning, different interpretations of imprecise instructions (let alone errors) lead to snowflakes with subtly different configurations, which often leads to tricky faults that are hard to debug. Such difficulties are often made worse by inconsistent monitoring, and again, using code ensures that monitoring is consistent too. Most importantly, using configuration code makes changes safer, allowing upgrades of applications and system software with less risk. Faults can be found and fixed more quickly and, at worst, changes can be reverted to the last working configuration. Having your infrastructure defined as version-controlled code also aids with compliance and audit: every change to your configuration can be logged and isn't susceptible to faulty record keeping.
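The core property the paragraph describes is declarative, idempotent configuration: desired state is data, applying it converges every server to the same state, and reverting means re-applying the previous declaration. Here is a toy sketch of that idea; all names are invented for the example, and real tools (Puppet, Chef, Ansible, Terraform) implement the same pattern far more thoroughly.

```python
# Version-controlled declaration of desired package state (illustrative).
DESIRED = {"nginx": "1.18", "openssl": "3.0"}

def apply_config(current, desired):
    """Converge `current` state to `desired`; return (new_state, actions taken)."""
    actions = []
    new_state = dict(current)
    for pkg, version in desired.items():
        if new_state.get(pkg) != version:
            actions.append(f"install {pkg}={version}")
            new_state[pkg] = version
    for pkg in list(new_state):
        if pkg not in desired:          # anything not declared is removed
            actions.append(f"remove {pkg}")
            del new_state[pkg]
    return new_state, actions
```

Applying the same declaration twice produces no actions the second time – that idempotency is what makes fleets of servers stay consistent and makes a revert as simple as re-applying the last known-good declaration.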


The Hybrid Cloud: Your Cloud, Your Way

No matter where the journey begins, one of the first realizations is that there is no single solution or single answer for how best to utilize cloud solutions. The journey typically evolves over time and requires multiple clouds – a combination of public, private and possibly managed clouds – resulting in a hybrid cloud end state. Before deciding on a cloud approach, it is important to understand all of the possibilities that cloud technologies provide, and to agree on the business initiatives, priorities, and desired results required to support your business needs and intended outcomes. The decision should not focus entirely on which type of cloud to deploy – private, public, managed or hybrid – but rather on delivering the right cloud or clouds, at the right cost, with the right characteristics (i.e. agility, cost, compliance, security) to achieve your business objectives.


Skyhigh Networks Unveils Industry’s First Cloud Security Reference Architecture

The Skyhigh Cloud Security Reference Architecture recognizes the complexity of today’s modern enterprises, where users are mobile and work from a variety of locations, both on premises and remote, using a variety of devices, both managed and unmanaged, to access thousands of cloud services, both IT sanctioned and unsanctioned. It also advises on which use cases and environments are best suited to the most common CASB deployment modes. “As the first CASB player in the market, with the greatest number, scale, breadth, and maturity of CASB deployments, Skyhigh continues its quest to help organizations securely adopt cloud services,” said Rajiv Gupta. “We hope the reference architecture helps organizations cut through the noise so they can leverage the power of cloud services using the most advanced security technologies on the market, both existing and new.”


International regulators take an interest in crypto-currencies & the blockchain

“… distributed ledger technology has the potential to revolutionise financial services … However, … there are a lot of regulatory and consumer issues … to be discussed as the technology evolves. For example, how individuals gain access to a distributed network and who controls this process, [and] what data security exists for users … Innovation can be an iterative process … During … development, it’s crucial that innovators are allowed the space to develop their solutions. The FCA continues to monitor … this technology but is yet to take a stance … In the meantime, we continue to work with firms … to ensure consumer protections are being factored in during the development phase … We are particularly interested in exploring whether block chain technology can help firms meet know your customer or anti-money laundering requirements more efficiently and effectively.”


Most software already has a “golden key” backdoor: the system update

From an attacker perspective, each capability has some advantages. The former allows for passively-collected encrypted communications and other surreptitiously obtained encrypted data to be decrypted. The latter can only be used when the necessary conditions exist for an active attack to be executed, but when those conditions exist it allows for much more than mere access to already-obtained-but-encrypted data. Any data on the device can be exfiltrated, including encryption keys and new data which can be collected from attached microphones, cameras, or other peripherals. Many software projects have only begun attempting to verify the authenticity of their updates in recent years. But even among projects that have been trying to do it for decades, most still have single points of devastating failure.
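A minimal sketch of the update-authenticity checking discussed above: before installing, verify the downloaded artifact against a digest pinned from a trusted out-of-band channel. Note that a single pinned digest (or a single signing key) is precisely the kind of single point of failure the article warns about; mature projects spread trust across multiple keys and transparency logs. All names and values here are illustrative assumptions, not any project's real API.

```python
import hashlib
import hmac

# In practice this digest would come from a trusted channel, not the same
# server that serves the update (illustrative value below).
PINNED_SHA256 = hashlib.sha256(b"release-1.2.3 payload").hexdigest()

def verify_update(payload: bytes, pinned_hex: str) -> bool:
    """Refuse to install unless the payload matches the pinned digest."""
    digest = hashlib.sha256(payload).hexdigest()
    # Constant-time comparison avoids leaking how many characters matched.
    return hmac.compare_digest(digest, pinned_hex)
```

A hash pin only authenticates a specific known release; real update systems use signatures over release metadata so new versions can ship without redistributing trust, which is exactly where a compromised "golden" signing key becomes a backdoor.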


ATMZombie: banking trojan in Israeli waters

The Trojan is dropped onto the victim's machine and starts the unpacking process. Once unpacked, it stores certificates in common browsers (Opera, Firefox) and modifies their configurations to mount a man-in-the-middle attack. It eliminates all proxies other than the malware's own and changes cache permissions to read-only. It then continues by changing registry entries with Base64-encoded strings that contain a path to the auto-configuration content (i.e. traffic capture conditions using CAP file syntax) and installs its own signed certificate into the root folder. Later, it waits for the victim to log in to their bank account, steals their credentials, logs in using their name and exploits the SMS feature to send money to the ATMZombie.


Hybrid Cloud Versus Hybrid IT: What’s the Hype?

The difference between hybrid cloud and hybrid IT is more than just semantics. The hybrid cloud model is embraced by those entities and startups that don’t need to worry about past capital investments. These newer companies have more flexibility in exploring newer operational options. Mature businesses, on the other hand, need to manage the transition to cloud without throwing away their valuable current infrastructure. They also deal more with organizational change management issues and possible employee skill set challenges. The new, bimodal IT model is also a concern for these enterprises, Forbes reported. This is a tricky dilemma because both hybrid cloud and hybrid IT have been known to deliver some pretty significant advantages. Some of the biggest benefits of moving to an updated cloud or IT environment include:


Millions of OpenSSL secured websites at risk of new DROWN attack

According to the researchers who found the flaw, that could amount to as many as 11.5 million servers. How bad is DROWN really? Some of Alexa's top-ranked websites are vulnerable to DROWN-based man-in-the-middle attacks, including Yahoo, Sina, and Alibaba. Thanks to its popularity, the open-source OpenSSL is the most obvious target for DROWNing, but it's not the only one. Obsolete Microsoft Internet Information Services (IIS) versions 7 and earlier are vulnerable, and versions of Network Security Services (NSS), a common cryptographic library built into many server products, prior to the 3.13 release from 2012 are also open to attack. You can find out if your site is vulnerable using the DROWN attack test site.


Ten server deployment checklist considerations

A comprehensive server deployment checklist involves a lot more than buying adequate computing resources at an attractive price. It takes talented IT administrators and other personnel to source, acquire, prepare, install, configure, manage and support a fleet of servers -- whether in the tens, hundreds or thousands -- in a data center. The emphasis on reducing data center hardware footprints and lights-out operations can sometimes cause IT staff to overlook important issues. These top 10 logistical considerations should factor into every rack-and-stack server deployment checklist.



Quote for the day:


"And the little screaming fact that sounds through all history: repression works only to strengthen and knit the repressed." -- John Steinbeck