June 24, 2015

Oracle's biggest database foe: Could it be Postgres?
Gartner, for example, forecasts that more than 70% of new in-house applications will be developed on an open-source database by 2018, and that 50% of existing commercial RDBMS instances will have been converted to open-source databases or will be in the process of conversion. In other words, open-source databases are almost certainly cutting off Oracle's oxygen when it comes to new applications, and they may also be cutting into its hegemony within existing workloads. If true, that's new. Though it comes from a biased source, an EnterpriseDB survey of Postgres users certainly suggests that they are running the venerable open-source database for increasingly mission-critical workloads, including those that used to pay the Oracle tax.


Infographic: Must Read Books in Analytics / Data Science
There are two attributes all the members of our team at Analytics Vidhya share: we are all voracious readers, and we all love to share our knowledge in a simplified manner, so that everyone gets access to it. These two attributes lead us to naturally gravitate towards sharing some of the best reads we come across. You can think of this infographic as an ideal list of books for the bookshelf of every data scientist or analyst. These books cover a wide range of topics and perspectives (not only technical knowledge), which should help you become a well-rounded data scientist.


Snowflake Launches Virtual Data Warehouses On AWS
Snowflake isn't a data warehouse of big data dimensions or routine enterprise data dimensions. Rather, it's a virtual data warehouse that will be sized to match the job sent to it. When the analytical tasks are finished, the warehouse shuts itself off to save overhead. "In other cloud data warehouses, you would have to unload the data to turn it off and then reload it [to use it again]," he said. Snowflake avoids that data movement task. Although Snowflake runs on AWS at its US West facility in Oregon, customers may use Snowflake without an AWS account. They also don't need to understand the ins and outs of Amazon virtual machine selection.


Report Template for Threat Intelligence and Incident Response
When handling a large-scale intrusion, incident responders often struggle with obtaining and organizing the intelligence related to the actions taken by the intruder and the targeted organization. Examining all aspects of the event and communicating with internal and external constituents is quite a challenge in such strenuous circumstances. The following template for a Threat Intelligence and Incident Response Report aims to ease this burden. It provides a framework for capturing the key details and documenting them in a comprehensive, well-structured manner. This template leverages several models in the cyber threat intelligence (CTI) domain, such as the Intrusion Kill Chain, Campaign Correlation, the Courses of Action Matrix and the Diamond Model.


Startup’s Lightbulbs Also Stream Music
The speaker bulb, which twists into a standard-size light socket, contains white and yellow LEDs, the brightness or dimness of which coördinates wirelessly with other Twist bulbs that contain just LEDs. Astro, which plans to ship the gadgets early next year, says a starter pack with two LED bulbs, a speaker bulb, and a handheld dimmer switch will cost $399, reduced to $249 for two months to encourage people to sign up. While companies like Philips Hue focus on automating and customizing the lights themselves, Twist is among a handful of companies thinking of the lightbulb as a conduit for wireless audio, too. The company says it plans to add additional functions in the future as well.


Why It's Worth Divorcing Information Security From IT
Too often, when Security reports to IT, we find the IT mentality interferes with security processes and priorities. These days, there is little to no common ground between keeping IT systems up and running for authorized users and monitoring them for signs of compromise by smart, stealthy criminals. Identifying and securing an already compromised system requires the capability to differentiate malicious activity from normal behavior, and hackers are very good at making their activity look normal. The only way to find them is through a combination of new technologies and human judgment. Being a subdivision of the IT department makes security blind to important business processes and to decision making at the corporate and department level.


Aligning Private Cloud and Storage: 4 Considerations
Firstly, the private cloud offers a greater degree of control than the public cloud, especially over data. When you build a private cloud, you’re able to keep your data at your fingertips, establish the performance levels your organization demands to best serve end users and customers, and set security policies that align with your customer responsibilities or industry regulations. Secondly, the private cloud gives you more control over applications. Most public clouds require apps to fit their cloud mould, but many businesses have unique, custom-made applications, and recoding these applications to fit the public cloud is not a good solution.


Finance Hit by 300 Times More Attacks Than Other Industries
As can be expected, cyber-criminals are working hard to ensure their attacks are as successful as possible, firing a large volume of low level threats at their targets in order to distract IT security professionals while the main targeted attack is launched, Websense said. Obfuscation, malicious redirection and black hat SEO have become popular of late, although patterns apparently shift on a month-by-month basis – again to improve success rates. Targeted typosquatting is also making a comeback in the sector, usually in combination with social engineering as part of spear phishing attacks designed to compromise a host or trick a user into instigating a payment or transfer of money, the report claimed.


Mobile app testing for fun and profit
"If you're doing testing for a mobile website you can more or less use the same tools as you would when just testing out a normal website with your browser," Prusak said. "Ultimately, it should still work with your browser, and there are plug-ins and extensions for today's browsers with which you can modify HTTP headers or even the resolution and make the backend still think you're connecting on a mobile device." For native apps, the picture is murkier: "I'm aware of a plethora of different solutions, and all of them require either jail-breaking the device or installing software on your computer and then pointing your phone or device to your computer and using that as a proxy," Prusak said. He sees these solutions as rife with issues, inefficient, and too complex.
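The header-spoofing approach Prusak describes can be sketched without any browser extension at all: a backend typically decides "mobile vs. desktop" largely from the User-Agent header, so a desktop HTTP client can impersonate a phone. This is a minimal illustration using only the standard library; the iPhone-style UA string is an example value, not tied to any specific tool mentioned above.

```python
# Make a backend believe a phone is connecting, by sending a
# mobile User-Agent header from an ordinary desktop HTTP client.
import urllib.request

# Example mobile UA string (illustrative, not from the article):
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) "
             "AppleWebKit/600.1.4 (KHTML, like Gecko) Mobile/12F70")

def mobile_request(url: str) -> urllib.request.Request:
    """Build a request that presents itself as coming from a mobile device."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

req = mobile_request("https://example.com/")
print(req.get_header("User-agent"))
```

The same idea extends to spoofing `Accept` headers or viewport-reporting client hints, which is essentially what the browser plug-ins Prusak refers to automate.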


Why You Should Definitely Migrate Existing Apps to the Cloud
100% security is an illusion. If you have to make a decision based on the available choices, cloud services are in no way less secure than any of the existing systems in place. Cloud service providers are known for their innovation. It is apparent that at any point in time they will implement better physical and logical security practices than a standalone on-premise data center operation. Many cloud providers are now certified under ISO, PCI DSS, the EU Model Clauses and other global security standards. Moreover, not all applications require bank-grade security. Do they? If you have highly sensitive data, or your app is subject to specific security and privacy regulations (such as HIPAA and HITECH), you can opt for a hybrid cloud service.



Quote for the day:

"The time is always right to do what is right." -- Martin Luther King Jr.

June 23, 2015

Lack of trust in tech is damaging smarthome industry
Although younger people are twice as interested in smarthome technology as the older generation, they do not necessarily have the chance to adopt these technologies because many are living at home longer or are renting from a landlord and therefore have less control than a homeowner over energy consumption. “Younger people seem to be missing out,” Wetherall said. “Younger people are more excited by the novelty, by the ability to play with these technologies.” Smart metering should fuel engagement with young people by helping them to understand energy consumption, and give them the means to engage with their landlord about making their property better and more efficient.


Bitnation Pangea Releases Alpha of Governance System Based on the Blockchain
Bitnation Pangea wants to be the world’s first blockchain powered Virtual Nation, able to provide all services that traditional governments provide and replace the nation state system with a voluntary form of governance. ... “The alternative the world is currently pivoting towards is U.N.-style global organizations, which would be an even worse ‘one-fit-all’ type governance model than what we currently have,” says Tarkowski Tempelhof. “Bitnation aims to prevent that, through setting a precedent for voluntary competing service providers, powered by the Bitcoin blockchain technology, effectively creating an open source cryptonation protocol.”


Cyber Security in Aviation
The increase in technology does not match the increase in technology security. Duggal said, “Technology moves so fast, security sometimes gets left behind because you’re trying to get to the consumer, you’re trying to give them what they want, and sometimes when you try to address security after the fact you add complexity to the mix.” The threat level rises when systems are not secured prior to installation: in the rush to satisfy consumers with rapid implementation and deployment, security is often overlooked, and shipping the latest and greatest technology without first securing it merely increases the threat.


Simple is beautiful: Useful questions to cut through process complexity
As we elicit more information about the process, they may tell us about every logical branch and every exception, and we’ll gain a really rich understanding of the existing situation. This is very useful and will aid our analysis – after all, we’ll need to ensure that our processes can cater for the real environment and are useful in practice. However, it’s also important that we understand (and in some cases challenge) the need for each layer of complexity. In some cases, we may find that particular branches and steps are no longer relevant, and we may be able to simplify the overall process by eliminating them. Doing so may well make our customers’ lives easier, and ensure that the process is as quick, slick and as cost effective as possible. It can be a real win/win.


Vert.x 3, the Original Reactive, Microservice Toolkit for the JVM
Vert.x 3 also has built-in support for RxJava - we provide Rx-ified versions of all our APIs, so if you don't like a callback-based approach, which can sometimes be hard to reason about, especially if you're trying to co-ordinate multiple streams of data, then you can use the Rx API, which allows you to combine and transform the streams using functional-style operations. We're also looking into an experimental new feature for Vert.x which allows you to write your application in a classic synchronous style, but where it doesn't actually block any OS thread. The idea is that you get the scalability advantages of not blocking OS threads without the callback hell of programming against asynchronous APIs, i.e., you have your cake and eat it. We think this could be a killer feature, if we get it right.


Spark at the Center of a Technology Revolution
All of our connected devices are fueling a growth in data that is completely new to everyone. Starting 3 years ago, we generated more data than we created in the 199,997 years of human history leading up to that point. What this starburst of data means is that how we think about data and technology needs to change at the most fundamental level. It’s not just a question of scale—the types of data and the potential for the way they impact human life and the globe are different at the core. Traditional approaches are either not going to function with the new, massive amounts of data, or they are not going to produce results that are relevant in a world where real-time feedback from devices wired into everything from human heartbeats to interstellar data is flowing constantly and at an increasing rate.


Top 4 Strategies for NFV Success
Just think of the benefits: replacing dedicated hardware appliances across the network with standard servers, general-purpose storage, and standardized software applications – not to mention virtualization to deliver any network function end-to-end. There’s no doubt that NFV can deliver tremendous rewards for network operators in terms of flexibility, scalability, and cost-efficiency. What service providers can’t afford to overlook, however, is how NFV may affect their connectivity infrastructures and network topologies. There are ramifications for transport networks that must be considered, in order to maximize the benefits and revenue opportunities that NFV promises. One such area is virtual customer premises equipment (vCPE). NFV has the potential to radically disrupt this space.


Christina Page explains how Yahoo keeps its datacentres green and clean
“It’s a low-tech design and is cheaper to build because you’re not installing these big chiller systems inside. It’s also more reliable as there are fewer moving parts to fail,” she says. Facilities built according to YCC specifications have a long and narrow “chicken coop-style” design, says Page, to encourage outside air to circulate inside and ensure just 1% of a building's total energy consumption is being drawn on to cool it. “What we’ve done is site these facilities in places where there are few enough hot and humid days of the year that this design really works. At the time we were being conservative, and what we’ve concluded is that there are other locations with more hot and humid days that work just as well with this technology,” she says.


The False Dichotomy Between Planned and Improvisational Projects
Another way to look at the difference is the costs and benefits of individual innovation in the two environments. If everyone building a WalMart was constantly trying out radical new ideas, the result would be chaos. There is certainly innovation in commercial construction, but it has to be managed centrally to avoid interference. An electrical contractor doing things differently might make a small improvement but risk large downstream costs. By contrast, what was the cost and value to Facebook of someone going off and implementing photo tagging? The cost was small, and the consequences for the rest of engineering were also small.


Q&A on Fifty Quick Ideas to Improve Your Tests
Like beauty, quality is in the eye of the beholder. This innate subjectivity can lead to wide-ranging opinions on what good quality is and what attributes display it. To ground understanding, it is essential to quantify and visualise quality. This works on several levels. At a story or feature level we quantify a quality target in the form of acceptance criteria; we can also set a holistic picture of quality at a product level. Many teams use acceptance criteria for stories these days, but criteria are often still ambiguous, like ‘must be fast’ or ‘must be reliable’, which leaves vast potential for error in the suitability of the solution. We’ve found it useful to quantify quality at both feature and product level. Then there is a clear target for discussing feature acceptance, and also a higher-level vision of quality that the feature falls within and that directs testing.



Quote for the day:

"To lead the people, walk behind them." -- Lao-Tzu

June 22, 2015

The one which offers 10 answers to the question: ‘What purpose does this biochip serve?’
The State Road Safety Inspection has long abandoned any hope that beacons, “flags”, licenses and badges cannot be faked (no hologram would protect against that, really). That’s why, once a vehicle is pulled over by the police, an officer checks the driver’s license and the vehicle certificate against their database to find out whether the piece of plastic is legitimate (and whether the bearer is a good guy). How would the entire procedure look with a biochip in play? The officer presents the reader through the windshield, I touch it with my hand – and that’s it.


Big data log analysis thrives on machine learning
Clearly, automation is key to finding insights within log data, especially as it all scales into big data territory. Automation can ensure that data collection, analytical processing, and rule- and event-driven responses to what the data reveals are executed as rapidly as the data flows. Key enablers for scalable log-analysis automation include machine-data integration middleware, business rules management systems, semantic analysis, stream computing platforms, and machine-learning algorithms. Among these, machine learning is the key for automating and scaling distillation of insights from log data. But machine learning is not a one-size-fits-all approach to log-data analysis.


How a grocery delivery service became a red hot robotics company
"The ultimate aim is for humans to end up relying on collaborative robots because they have become an active participant in their daily tasks," says Dr Graham Deacon, Robotics Research Team Leader at Ocado Technology. "In essence, the SecondHands robot will know what to do, when to do it and how to do it in a manner that a human can depend on." To get a sense of what these collaborative robot helpers will be doing, imagine an Ocado warehouse. Conveyor belts zip colorful baskets to and fro along diverging paths, placing them in front of an army of human workers who pack them full of groceries. The warehouse is full of machinery, and all of it requires careful and constant maintenance.


Decision Boundaries for Deep Learning and other Machine Learning classifiers
Using {h2o} in R, we can in principle implement a “Deep Belief Net”, the original version of Deep Learning. I know it’s no longer the state-of-the-art style of Deep Learning, but it must be helpful for understanding how Deep Learning works on actual datasets. Please remember a previous post of this blog that argues how decision boundaries tell us how each classifier works in terms of overfitting or generalization, if you already read this blog. It’s quite simple to tell which classifier overfits and which generalizes well on the given dataset, generated from 4 sets of fixed 2D normal distributions. My points are: 1) if decision boundaries look well smoothed, they’re well generalized; 2) if they look too complicated, they’re overfitting, because the underlying true distributions can be cleanly divided into 4 quadrants by 2 perpendicular axes.
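The "4 quadrants, 2 perpendicular axes" point can be made concrete with a toy classifier. This is an illustrative plain-Python sketch (not the post's {h2o}/R setup): four cluster centroids, one per quadrant, and a nearest-centroid rule whose decision boundaries are exactly the two perpendicular axes, i.e. the "well generalized" shape for this kind of dataset.

```python
# Nearest-centroid classification of four quadrant-centered clusters.
# With symmetric centroids, the decision boundaries are the x and y axes.
import math

CENTROIDS = [(2, 2), (-2, 2), (-2, -2), (2, -2)]  # one centroid per quadrant

def classify(point):
    """Return the index (0..3) of the centroid nearest to `point`."""
    return min(range(len(CENTROIDS)),
               key=lambda i: math.dist(point, CENTROIDS[i]))

# Points deep inside each quadrant land in the matching class:
print([classify(p) for p in [(3, 3), (-3, 3), (-3, -3), (3, -3)]])  # → [0, 1, 2, 3]
```

A flexible model (such as a deep net) trained on noisy samples from these clusters can carve far wigglier boundaries than these two straight lines; the wiggles are the visual signature of overfitting that the post describes.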


The Advantages Of An Agile Company Culture
The real change comes from the company culture. Is the company still a command-and-control type of environment? Agile is about quickly adapting to change, and not being afraid to fail. As a leader, you need to create the type of environment where failure is not only accepted, but actively encouraged. Agile is more about how your team approaches problems, not the tools used to solve them. In an agile environment, employees are expected to communicate frequently, because internal feedback is important to improving the team. The constant learning and iterative nature of agile means that you need to embrace failure and allow that learning to occur.


Can We Design Trust Between Humans and Artificial Intelligence?
What is it that makes getting on a plane or a bus driven by a complete stranger something people don’t even think twice about, while the idea of getting into a driverless vehicle causes anxiety? Part of this is that we generally perceive other people to be reasonably competent drivers—something that machines can probably manage—but there is more to it than that. We understand why people behave the way they do on an intuitive level, and feel like we can predict how they will behave. We don’t have this empathy for current smart systems.


Who Will Own the Robots?
It is notoriously hard to determine the factors that go into job creation and earnings, and it is particularly difficult to isolate the specific impact of technology from that of, say, globalization, economic growth, access to education, and tax policies. But advances in technology offer one plausible, albeit partial, explanation for the decline of the middle class. A prevailing view among economists is that many people simply don’t have the training and education required for the increasing number of well-paying jobs requiring sophisticated technology skills. At the same time, software and digital technologies have displaced many types of jobs involving routine tasks such as those in accounting, payroll, and clerical work, forcing many of those workers to take more poorly paid positions or simply abandon the workforce.


A Manifesto for Creating Extraordinary Teams
Well, there's a name for that state of mind, it's called "flow" and a good friend of mine, Dr. Judy Glick-Smith, has been studying it for years. She recently wrote an article about it that captures perfectly what flow is all about and how to create teams that sustain a flow-state. I'm borrowing heavily from it here because it is a manifesto that I believe every leader should know by heart. Yes, I'm looking at you! ... Creativity and innovation are the inevitable results of unfettered team-flow. If all of these components are in place, each individual in the organization becomes a leader. Change is integrated into the fabric of the culture. Your people will embrace change, because they are creating it on a moment-by-moment basis.


Do the mobile developers you hire thoroughly understand the Internet of Things?
When you hire mobile app developers to create modern apps that work across multiple devices, they should have a clear concept of IoT and its user experiences, as well as the intricacies involved. For mobile app designers, human-computer interaction (HCI) takes place in a variety of contexts because of the mobility of mobile devices. Designers have to deal with different resolutions and scale designs accordingly: from the tiny screens of wearable smartwatches at one end, through smartphones, tablets and desktops, to TV user interfaces.


DockerCon 2015: Game On
The bug that is being put in our ear is that enterprises are worried about security. Well, yeah, enterprises are always worried about security, but that’s not the point. While on the one hand, Docker does not present a conventional “attack surface” for the typical malicious user, it also does not present a conventional platform for the typical security vendor or security service. All security now, whether containerized or virtualized or on Facebook’s bare metal servers, is no longer a matter of hardening endpoints, but rather of maintaining the desired state of connections in the network. At this moment, even after a few years of rapid development, we don’t really know what a containerized network will look like, once the architectural debates get settled.



Quote for the day:

“No great manager or leader ever fell from heaven; it's learned, not inherited.” -- Tom Northup

June 21, 2015

Nest keeps smart home portfolio neat and tidy with latest upgrades
The Nest team on the show floor demonstrated how each of these products (and more made by others) can work together on the connected fabric. For example, the Dropcam can communicate with the Nest Thermostat to automatically turn on motion alerts when the thermostat is set to "Away." Dropcam can also record clips when Nest Protect detects smoke. But rather than trotting out even more connected appliances throughout the home, Nest is working harder with what it already has, which not only offers the possibility of reducing clutter but also reduces the cost of buying multiple gadgets in the long run. With Nest, Fadell elaborated, consumers don't have to choose a bundle of products or a platform -- they only have to start with one, which can be accessed, monitored and managed from anywhere in the world through a mobile device.


The Startup Illusion
Contemporary entrepreneurs no longer adhere to the traditional model of professional success; work hard for many years and, one day, you’ll “make it.” Instead, today’s collegiate youth have bought into the Zuckerberg model. Young, aspiring entrepreneurs believe that with a brilliant idea and a little bit of luck they, too, can be billionaires, potentially overnight. Entrepreneurs are choosing to embrace the lie of probable success whilst ignoring the daunting statistics that contradict such thinking, such as the fact that 80 percent of startups fail within 18 months. I, too, turned a blind-eye. It’s difficult not to buy your own hyperbole. However, after an entire year’s worth of work crumbled in a matter of hours, I opened my eyes and saw through the startup illusion.


What the FCC's new robocall rules mean for your company's marketing efforts
Telemarketing efforts are widely used in both business-to-consumer and business-to-business marketing efforts, and the stakes are high. Earlier this year, Twitter called on the FCC to rule that those who call or text a wireless phone number for which consent was previously given should not be held accountable if that’s no longer the case when it is reassigned. Twitter did not respond to a request to comment for this story. “My hope would be that robocalling wouldn’t be part of any corporation’s communication strategy in 2015,” said author and Internet marketing consultant Brian Carter. The marketing world has moved toward opt-in communication, Carter noted.


Beyond Automation
Intelligent machines, Nicita thinks—and this is the core belief of an augmentation strategy—do not usher people out the door, much less relegate them to doing the bidding of robot overlords. In some cases these machines will allow us to take on tasks that are superior—more sophisticated, more fulfilling, better suited to our strengths—to anything we have given up. In other cases the tasks will simply be different from anything computers can do well. In almost all situations, however, they will be less codified and structured; otherwise computers would already have taken them over. We propose a change in mindset, on the part of both workers and providers of work, that will lead to different outcomes—a change from pursuing automation to promoting augmentation.


The Benefits of a Cloud Integrated Hyper-converged Architecture
The key benefit of an HCA though is its inherent simplicity. This is especially true if the architecture is delivered in a turnkey fashion that includes hardware and software, allowing the architecture to be scaled out as easily as adding additional bricks to a stack of Lego blocks. The result is a quicker time to value, since implementation is far simpler, and thanks to the integration there are fewer components to manage. The end result is a reduced total cost of ownership that allows the business to more rapidly extract value from their IT investments. HCA allows an organization to deliver IT services in the same way that large public cloud providers do, essentially creating a private cloud.


Elon Musk To Build A Hyperloop Test Track, Puts Out Call For Pod Designs
SpaceX says it is not getting into the loop business, merely that “it is interested in helping to accelerate development of a functional hyperloop prototype,” says a spokesman. There are still scores of engineering and mechanical issues to resolve around safety mechanisms, costs, propulsion and suspension systems and manufacturing techniques. Teams are welcome to submit entire pod designs, individual subsystems or safety features. SpaceX says it will also likely build its own pod, which will not be eligible to win the competition. Criteria for winning the competition come out in August.


What role does artificial intelligence play in Big Data?
Analysing large data sets requires developing and applying complex algorithms. Until now, humans have had to come up with hypotheses, identify relevant variables and then write algorithms to test these theories against the information collected in big data sets. However, as data sets become larger, it becomes harder for humans to make sense of it all, which limits the insights that can be gained from all this information. AI allows organisations to add a level of intelligence to their Big Data analytics and understand complex issues more quickly than humans can. It can also fill the gap left by not having enough human data analysts available.


DevOps Deep Dive: Infrastructure as code for developer environments
By taking the Infrastructure as Code approach to this problem, you gain flexibility and extensibility for your solution and overcome the limitations of image-based configuration. Specifically, because we defined our desired end state in code, we can modify the configuration attributes, change the versions of the software being installed, change the location of the software repository, change the plugins required, or modify any other aspect of the desired configuration, all by changing the code. Contrast that flexibility with the static nature of an image, and the volume of work required to make a small change to an image-based configuration.
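The "desired end state in code" idea can be sketched in a few lines. This toy example is not any specific configuration tool; the version string, repository URL and plugin names are invented for illustration. The point is that changing a version or repository is a one-line edit to data, where an image-based approach would mean rebuilding the image.

```python
# Desired state as data: a convergence step computes only the changes
# needed to bring a machine's current state in line with it.
desired_state = {
    "jdk_version": "8u45",                               # illustrative version
    "repo_url": "https://repo.example.com/artifacts",    # hypothetical repo
    "plugins": ["git", "gradle"],
}

def converge(current, desired):
    """Return the attribute changes needed to move `current` to `desired`."""
    return {k: v for k, v in desired.items() if current.get(k) != v}

machine = {"jdk_version": "7u80", "plugins": ["git", "gradle"]}
print(converge(machine, desired_state))
# → {'jdk_version': '8u45', 'repo_url': 'https://repo.example.com/artifacts'}
```

Real tools (Chef, Puppet, Ansible and the like) add idempotent resource handlers on top of this convergence loop, but the flexibility described above comes from exactly this property: the end state is ordinary, editable data.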


Probabilistic Project Planning Using Little’s Law
Little's Law helps us take an "outside view" on the project we are forecasting, based on knowledge about actual system performance on a reference class of comparable projects. The method can help any team that uses user stories for planning and tracking project execution, no matter the development process used. ... Little's Law deals with averages. It can help us calculate the average waiting time of an item in the system or the average lead time for a work item. In product development, we break the project delivery into a batch (or batches) of work items. Using Anderson’s formula, we can forecast how much time it will take for the batch to be processed by the development system.
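The arithmetic behind this kind of forecast is small enough to show directly. This is a minimal sketch with invented numbers (the article's reference-class data is not given): from average work-in-progress and average throughput observed on comparable past projects, Little's Law gives the average lead time per item, and the batch duration follows from throughput alone.

```python
# Little's Law: L = λW, i.e. avg WIP = throughput × avg lead time.
# Rearranged: lead time W = L / λ.  Illustrative reference-class numbers:
avg_wip = 12.0        # stories in progress, on average
throughput = 3.0      # stories finished per week, on average

lead_time_weeks = avg_wip / throughput     # average weeks per story
batch_size = 60                            # stories in the upcoming batch
duration_weeks = batch_size / throughput   # time to process the batch

print(lead_time_weeks)   # → 4.0
print(duration_weeks)    # → 20.0
```

Because these are averages, a probabilistic plan would pair this point estimate with the spread observed in the reference class rather than promising 20 weeks exactly.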


SAS automates data modeling for fast analysis
SAS Factory Miner can use any source of data, as long as the data itself can be formatted into a table. The software, run from a server and accessed with a browser, offers a graphical point-and-click interface. It comes with a set of customizable templates for creating baseline models. Analysts can fine tune or revise any of the computer-generated models. To help pick the best models, the software uses a number of machine learning algorithms that, through repeated testing of the models, can recognize patterns to anticipate future performance. One unnamed customer used an early version of the software to build 35,000 different models in order to find the best approach for a marketing campaign.



Quote for the day:

"Success is the result of good judgement, which is the result of experience, and experience is often the result of bad judgement." -- Tony Robbins

June 20, 2015

The APIs.json Discovery Format: Potential Engine in the API Economy
The goal of APIs.json is to provide a simple, common format that can be used to index APIs and the supporting elements of API operations. APIs.json works much like the Sitemap XML format. But instead of indexing websites, APIs.json is designed to index APIs and offer that index at a well-known location where API providers can publish an index of their API resources. APIs.json is designed to give API providers an easy way to update their own index but also allow other search engines, directories, and API service providers access to that local index, making all API resources within the domain discoverable.
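A minimal index of the kind described might look like the following. The property names (`name`, `url`, `apis`, `humanURL`, `baseURL`) follow the published APIs.json format, but the domain and API here are invented for illustration; built as a plain dict and serialized, since the format is just JSON at a well-known location.

```python
# A hypothetical APIs.json index for an imaginary provider.
import json

index = {
    "name": "Example Inc.",
    "url": "https://example.com/apis.json",  # the well-known location
    "apis": [
        {
            "name": "Example Search API",                      # invented API
            "humanURL": "https://example.com/docs/search",     # docs for humans
            "baseURL": "https://api.example.com/v1/search",    # machine endpoint
        }
    ],
}

print(json.dumps(index, indent=2))
```

A crawler in the role of a Sitemap consumer would fetch `/apis.json` from a domain, parse it exactly like this, and follow the `baseURL` and `humanURL` entries to make the APIs discoverable.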


Can You Really Define Culture? 4 Lessons From a Growing Startup
Culture is a common theme these days; every startup CEO talks about their amazing culture and how it drives them and inspires their team. Research shows that companies with a high-performance culture have a distinct competitive advantage, in part because competitors cannot duplicate a culture the way they can copy technology. Investors are known to invest in the team and, often, its underlying culture. One of the key components of a winning team from a venture capital perspective is the clear articulation and proof of that amazing culture. So I find myself now wondering what it really is for our company. How do I define it? And more importantly: How on earth am I going to institutionalize it as we grow?


How to stop the Internet of Things overwhelming your network
The internet can be unreliable, disconnecting and reconnecting with very little warning. Connection speeds can also vary between different clients and devices. The problem is that the IoT assumes the internet is reliable and able to transmit information in real time. However, this isn’t the case. As human beings, we are notoriously impatient, and this is true of our apps: we want the information we require straight away, yet internet connections are easily dropped and can often take a while to reconnect. The IoT doesn’t account for this. This is particularly important when it comes to banking apps on a smartphone.
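One standard way for a device or app to cope with connections that drop and reconnect is to retry with exponential backoff. The sketch below is illustrative only; `fetch_with_retry` and its parameters are assumptions for the example, not anything the article prescribes.

```python
import random
import time

def fetch_with_retry(fetch, max_attempts=5, base_delay=0.5):
    """Retry a flaky network call with exponential backoff plus jitter,
    instead of assuming the connection is always up (the gap the
    article describes in many IoT designs)."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            # Back off 0.5s, 1s, 2s, ... with jitter so many devices
            # reconnecting at once don't retry in lockstep.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

The jitter matters at IoT scale: thousands of devices recovering from the same outage should not hammer the server at the same instant.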


How to structure an outsourced IT project for less risk, more leverage
“This comes into play when implementing a software-as-a-service platform,” says Alpert. “In these implementations there is typically a much smaller software development and testing lifecycle and more focus on agile configuration and testing.” An IT organization may also like the clarity that can accompany working with a sole provider. Unfortunately, “the perceived accountability benefits of ‘one throat to choke’ are typically unrealized due to poor commercial structure and provider unwillingness to accept real risk,” explains Alpert. “With a single provider, future phases of work are often overpriced due to lack of competitive leverage, and the project scope is not yet well defined to determine the discrete schedule, deliverables, requirements, and timeline to hold the provider accountable.”


Three of the worst responses to cyber security threats
A large part of cyber security is monitoring; without monitoring your network, it’s damn near impossible to know which threats you’re facing and what they’re targeting. So, if you get a red flag about a possible intrusion, or several members of staff raise concerns, you listen, gather all the evidence you can, and come to a conclusion about whether to act. Or you can do what the following three organizations did ... “Backing up data is one thing, but it is meaningless without a recovery plan – and not just any recovery plan, but one that is well-practiced and proven to work time and time again,” Code Spaces said. “Code Spaces has a full recovery plan that has been proven to work and is, in fact, practiced.”


IT staff should be embedded in business
“It is an exceptionally lean approach to IT, but it is also extremely flexible in growth and changing situations,” says Alppi. The core IT team also gets some outside help. While not part of Alppi’s five specialists, Rovio has 20 to 30 employees (excluding games developers) with IT-related job descriptions. Instead of having IT as a separate bastion, they work for different units in the company. “Most of our business IT people work inside business units and are our major internal stakeholders. It allows them to be very hands-on with what is happening there.” “Typically, anyone with even the slightest association to IT is put into the IT department and then you assign an IT manager to every business unit, but in our model those in charge of business IT also work in business,” says Alppi.


Q&A on Test Driven Development and Code Smells with James Grenning
TDD leads to code that does what the programmer thinks the code is supposed to do. Modules are developed with an executable specification of the module, the test cases. The test cases document very precisely what the code is supposed to do. If the code starts to violate the specification, a test fails. One of the big problems with code is that unwanted side effects are very difficult to anticipate. I make a change to one part of the code, and a seemingly unrelated other part of the code breaks. ... Simply, if you cannot identify with some precision a problem in the code’s structure, how can you fix it? I recall that code reviews in my career were usually just a matter of opinion: “I don’t like that, I would have done this” – totally unsupported. Any programmer can announce “this code stinks”, but that is not good enough.


Information Security - Reducing Complexity
The threat landscape has changed drastically since the 1980s and 1990s. Between 1980 and 2000, a good anti-virus and firewall solution was considered enough for an organization. But those are no longer enough, and hackers are using sophisticated tools, technology and skills to attack organizations. The motive behind hacking has also evolved; on that front, hacking, though illegal, has become a commercially viable profession or business. ... The driver behind adopting these evolutions is business need. As businesses want to stay ahead of the competition, they leverage evolving technologies to surge ahead. With shorter times to market, all departments, including the security organization, should be capable of accepting and implementing such changes at a faster pace.


IT Professionals lack confidence in board’s cyber security literacy
“There’s a big difference between cyber security awareness and cyber security literacy,” said Dwayne Melancon, chief technology officer for Tripwire. “If the vast majority of executives and boards were really literate about cyber security risks, then spear phishing wouldn’t work. I think these results are indicative of the growing awareness that the risks connected with cyber security are business critical, but it would appear the executives either don’t understand how much they have to learn about cyber security, or they don’t want to admit that they don’t fully understand the business impact of these risks.”


EBay's security chief says collaboration key to keeping data safe from cyberattacks
On a high level there are primarily three reasons that drive hacker activity. The first one is kind of the category that Sony fell into, and that is state-sanctioned or government-authorized hacks. In that scenario they're usually trying to send a message, but it's something that allegedly is authorized by a state or a government. The second category is hackers that are looking to monetize their hacks. They're out there hoping to get something they can sell and make money. The third one is really your activist hacker. Those are the ones that want to deface a website to put their message up. They don't do anything really to extract money. They're just trying to send a message, which also falls into your Sony example.



Quote for the day:

“No one can make you feel inferior without your consent.” -- Eleanor Roosevelt