November 16, 2014

How to Become a Data Scientist in 8 Easy Steps
Our friends over at DataCamp just came out with a cool new infographic entitled “Become a Data Scientist in 8 Easy Steps.” This hits home for many people trying to enter this new industry, where a large number of job openings remain unfilled. The question is how best to make that transition. The useful infographic below helps answer it by outlining the process of becoming a data scientist ... These are all excellent tips, so examine the infographic carefully for more detail. You too can become part of the “sexiest job of the 21st Century!”


Search for Growth in Social, Mobile Fuels Tech M&A Boom
“Now it’s disruptive technology that’s in the crosshairs,” Liu said. “Consolidation involves corporations needing to catch up in a way that they are not able to do fast enough organically.” The aggregate global value of all deals with publicly disclosed values set a new post-dotcom-era quarterly high of US$73.7 billion, up 41 percent sequentially and 4 percent year over year. At 923 deals in total, overall volume also set a record for any quarter since 2000, rising 6 percent sequentially and 31 percent year over year. Corporate buyers, as opposed to private equity firms, continue to drive the growth, increasing aggregate value 40 percent sequentially and 9 percent year over year to $65.3 billion.


IoT Won’t Work Without Artificial Intelligence
The big problem will be finding ways to analyze the deluge of performance data and information that all these devices create. If you’ve ever tried to find insight in terabytes of machine data, you know how hard this can be. It’s simply impossible for humans to review and understand all of this data – and doing so with traditional methods, even if you cut down the sample size, simply takes too much time. We need to improve the speed and accuracy of big data analysis in order for IoT to live up to its promise.
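The kind of automated triage the article calls for can be sketched in miniature: instead of a human reviewing every data point, a simple rolling statistic flags only the readings that deviate sharply from recent behavior. This is an illustrative toy, not any specific product's method; the window and threshold values are arbitrary assumptions.

```python
from collections import deque

def rolling_zscore_alerts(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling mean.

    A crude stand-in for automated machine-data analysis: humans see
    only the flagged points, never the full stream.
    """
    buf = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(readings):
        if len(buf) >= 2:
            mean = sum(buf) / len(buf)
            var = sum((v - mean) ** 2 for v in buf) / (len(buf) - 1)
            std = var ** 0.5
            if std > 0 and abs(x - mean) / std > threshold:
                alerts.append((i, x))
        buf.append(x)
    return alerts

# Mostly-steady sensor stream with one obvious spike at index 30.
stream = [20.0 + 0.1 * (i % 5) for i in range(60)]
stream[30] = 95.0
print(rolling_zscore_alerts(stream))  # → [(30, 95.0)]
```

Real deployments would use streaming platforms and learned models rather than a fixed z-score, but the division of labor is the same: machines scan everything, people see exceptions.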


What Every Business Owner Needs to Know About Data Sovereignty
Unfortunately, the laws and regulations protecting digital information can be extremely complex. They are dependent on different governments and jurisdictions, and data stored in certain countries may or may not be subject to subpoena by another country’s government. As an IT professional, you’re likely responsible for ensuring that your company’s data is fully protected. However, you need to provide your business’s owner with the basics to enable him or her to make the best decisions for the company — and the valuable data it possesses. For those who don’t work with technology all day, however, the variables can be overwhelming.


Collective intelligence, big data and IEML
There are two big problems with this landscape. The first is methodological: today we use mainly statistical and logical methods. It is very difficult to perform a semantic analysis of the data, because we do not have a semantic code – and let’s remember that everything we analyze is coded before we analyze it. ... So you need a semantic code to have a semantic analysis. We do not have it yet, but I think that IEML will be that code. The second problem is that this analysis of data is currently in the hands of very powerful or rich players – big governments, big companies. It is expensive and it is not easy to do – you need to learn how to code, you need to learn how to read statistics; it is not easy.


MSSP: Integrate, NOT Outsource!
This means that for the MSSP to work well for you, process integration must be carefully planned. Here we talked about the alert response integration (and here about the SLAs), but the same applies to device management (integrate with your change management and reporting), incident response (integrate with your IR) and many other processes. This focus on integration also allows you to vary the degree of security ‘outsourcing’ or externalization. If your plan – monitor – triage – respond – refine chain is well planned, you can almost painlessly engage external resources (MSSP, consultants, etc.) at whatever stage: need more help with cleaning up the mess? Call that IR consultant. Want to shift some perimeter monitoring duties outside? Go get that MSSP.
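The idea of a well-planned chain with swappable stages can be illustrated with a toy model: each stage of plan – monitor – triage – respond – refine is a named, pluggable handler, and "outsourcing" a stage just means binding it to an external provider. All names and handlers here are hypothetical, purely to show the pattern.

```python
# Illustrative only: stage names follow the article's chain; the
# handler functions and owner labels are invented for this sketch.
CHAIN = ["plan", "monitor", "triage", "respond", "refine"]

def run_chain(handlers, event):
    """Pass an event through every stage, recording who handled it."""
    trail = []
    for stage in CHAIN:
        owner, handler = handlers[stage]
        event = handler(event)
        trail.append((stage, owner))
    return event, trail

def internal(event):
    return event  # placeholder for in-house processing

def external(event):
    return event  # placeholder for MSSP/consultant processing

# Start fully in-house, then shift perimeter monitoring to an MSSP.
handlers = {stage: ("in-house", internal) for stage in CHAIN}
handlers["monitor"] = ("MSSP", external)

_, trail = run_chain(handlers, {"alert": "suspicious login"})
print(trail)
```

The point of the sketch is that the chain's structure never changes; only the binding of a stage to an internal team or an external provider does, which is what makes the externalization "almost painless" when the processes are integrated.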


Requirements Discovery and Constraints Analysis
The process of requirements discovery broadly involves elicitation of functional and non-functional requirements from business needs. A business or enterprise architect’s role in requirements discovery is wider and broader in terms of scope, responsibility, and the nature and stage of engagement. ... The nature of business concerns will not be limited to problems addressable by a technology solution, but will also include considerations such as investments, ROI (Return on Investment), business case, timelines, priorities, risks and solution strategies, potentially involving an ecosystem of internal and external stakeholders (e.g. technology providers).


Simulation-Based Embedded Agile Development
While simulations containing embedded software need not be developed in an agile manner, Scrum’s agile framework helps realize greater benefits from a SiS approach. One Scrum event is the sprint review, in which the development team demonstrates what was accomplished during the sprint. It can be challenging to have something visual to demonstrate with embedded software development as there is often little to “see.” We might get only a blinking light or a wiggling fin. ... When such feedback is used in the sprint review as well as daily collaboration, these collective learning opportunities allow more nimble responses to necessary changes in requirements and design.


BlazeMeter, New Relic Team Up To Deliver Richer App Performance Testing Analytics
“Data analysis is most valuable when you can understand and act upon it instantly. Testing makes it easy to trigger a symptom, but you need monitoring to identify the root problem in the first place,” Girmonsky told IDN. “Together, BlazeMeter and New Relic provide their customers a full 360-degree view of their systems. Customers can dynamically define the KPIs they want to analyze, query the application and instantly understand the specific quirks of their system,” he added. The growing BlazeMeter/New Relic partnership is also a sign of how IT is increasing its use of machine data and big data to improve the software lifecycle -- design, development, testing and operations.


Optimizing Enterprise Risk for Value Creation
With IT risk being a subset of enterprise risk, and given the pervasiveness of technology within the business, optimizing IT risk has a direct and positive effect on the overall risk of the organization. So important is risk optimization of the enterprise’s IT that within COBIT 5 there is not one but two dedicated processes – ‘Ensure Risk Optimization’ and ‘Manage Risk’. The Ensure Risk Optimization process sits within the Governance area of the COBIT 5 framework and is supported by 3 governance practices and 16 activities. The process ensures that the enterprise’s risk appetite and tolerance are understood and not exceeded by enterprise IT, that the impact of IT risk on enterprise value is identified and managed, and that the potential for compliance failures is minimized.



Quote for the day:

"Take the first step in faith. You don't have to see the whole staircase, just take the first step." -- Martin Luther King Jr.
