September 04, 2015

A degree in data science is in demand

The work of a data scientist is twofold. First, the data scientist must pull together all this data, which is often just a collection of garbled text or numbers, and clean it up to the point where it can be analyzed. Then, the data scientist has to know how to extract meaningful information from the cleaned-up data. “Big data represents one of the fastest growing areas of business, estimated to become a 17-trillion-dollar industry by 2020,” wrote Becker College when it introduced its new data science program earlier this year. Locally, Worcester Polytechnic Institute and Becker offer data science programs; both are convinced that data science is already a desired career path for their students.
WPI's data science program is entering its second year; it currently offers a two-year, graduate-level degree in data science and, this fall, is adding a doctorate-level degree.
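
As a minimal illustration of that twofold workflow (clean first, then analyze), here is a hypothetical Python/pandas sketch; the dataset and column names are invented for the example:

```python
import pandas as pd

# Hypothetical raw export: inconsistent casing, stray whitespace, bad numbers.
raw = pd.DataFrame({
    "city":    [" Worcester", "BOSTON ", "worcester", None],
    "revenue": ["1,200", "950", "n/a", "2,400"],
})

# Step 1: clean the garbled text and numbers into analyzable columns.
clean = (raw.dropna(subset=["city"])
            .assign(city=lambda d: d["city"].str.strip().str.title(),
                    revenue=lambda d: pd.to_numeric(
                        d["revenue"].str.replace(",", ""), errors="coerce"))
            .dropna(subset=["revenue"]))

# Step 2: extract meaningful information from the cleaned-up data.
print(clean.groupby("city")["revenue"].sum())
```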


US Army’s Cyber War Strategy is Not Just for Military Use

Taking threat sensor data, removing the noise, and analyzing what remains will give decision makers the ability to forecast, obtain up-to-date battle damage assessments (BDA), and supply geolocation information on both the enemy and the electronic signatures our own forces generate. Convergence will be achieved by consolidating cyber forces that operate across multiple departments into single cross-operational units, removing impediments to information sharing. By fiscal year 2017, U.S. Army Cyber Command (ARCYBER) will have 41 Cyber Mission Force teams operationally capable. The units will combine cybersecurity, electronic warfare and signal doctrine, and will use past lessons learned to develop new doctrine in cybersecurity.
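
The "remove noise, then analyze" step is generic signal conditioning. As a rough sketch (the signal shape, noise level, window size and threshold are all invented for illustration), a simple moving-average filter over noisy sensor readings might look like:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical threat-sensor readings: a step signal buried in noise.
t = np.linspace(0, 10, 500)
signal = np.where(t > 6, 1.0, 0.0)           # an "event" begins at t = 6
readings = signal + rng.normal(0, 0.4, t.size)

# Remove noise with a moving average, then flag the event.
window = 25
smoothed = np.convolve(readings, np.ones(window) / window, mode="same")
event_start = t[np.argmax(smoothed > 0.5)]   # first crossing of the threshold

print(f"event detected near t = {event_start:.2f}")
```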


How Edge Data Center Providers are Changing the Internet’s Geography

Ultimately, location is the main way for companies like EdgeConneX to differentiate themselves from the big colo players like Equinix or Interxion. Edge data center providers are essentially building in tier-two markets what Equinix and its rivals have built in the big core markets: hubs where all the players in the long chain of delivering content or services to customers interconnect and exchange traffic. These hubs are where most of the internet has lived and grown for the bulk of its existence, and edge data center companies are building smaller hubs in places that don’t already have them but are becoming increasingly bandwidth-hungry.


What Do Marketers Really Want in Data and Technology?

You may have heard of Data-as-a-Service (DaaS). Companies are touting DaaS as the next big thing and as a solution that gives marketers an “unfair competitive advantage.” By linking data with technology, DaaS is changing the game through a new model of fast-moving, real-time data acquisition. As the name implies, Data-as-a-Service begins with the data: specifically, a company’s internal data, third-party data, real-time fast data, and unique, hard-to-find data (HTFD) sourced from the Big Data ecosystem. With technology, this data is structured to create insight into a marketer’s best customers and ideal prospects. Real-time knowledge is also used to learn who is actively in the market for products and services, who is searching for competitors, and who is posting to social media for product recommendations.
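
As an illustration of the data-linking idea (every name and field here is hypothetical, not a real DaaS API), combining internal records with third-party enrichment and real-time intent signals might look like:

```python
import pandas as pd

# Hypothetical internal customer data.
internal = pd.DataFrame({
    "email": ["a@x.com", "b@y.com", "c@z.com"],
    "spend": [1200, 300, 0],
})

# Hypothetical third-party enrichment data.
third_party = pd.DataFrame({
    "email":    ["a@x.com", "c@z.com"],
    "industry": ["retail", "finance"],
})

# Hypothetical real-time intent signals (searches, social posts).
intent = pd.DataFrame({
    "email":     ["b@y.com", "c@z.com"],
    "in_market": [True, True],
})

# Link the three sources and surface prospects who are actively in market.
linked = (internal
          .merge(third_party, on="email", how="left")
          .merge(intent, on="email", how="left")
          .fillna({"in_market": False}))
print(linked[linked["in_market"]])
```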


Leveraging COBIT to Implement Information Security (Part 3)

In the context discussed here, it is envisaged that controls within the system are selected by management on a risk-assessed basis to address the perceived threats to the security of the organisation’s core business processes. Once the controls are selected, the ISMS becomes the basis for collecting evidence of operation and for reviewing the efficacy of the implementation on an ongoing basis through the security forum. The forum is created by senior management, typically the chief executive officer (CEO), as a collaborative round table where managers from IT security, IT, human resources (HR) and the major business functions come together to make decisions on the basis of regular reporting from the system.
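
One common way to operationalise "selected on a risk-assessed basis" is to score each threat as likelihood times impact and treat those above an agreed threshold; the register entries, scales and threshold below are invented for illustration:

```python
# Hypothetical risk register: likelihood and impact on 1-5 scales.
threats = {
    "phishing against HR":    {"likelihood": 4, "impact": 4},
    "data-centre power loss": {"likelihood": 2, "impact": 5},
    "laptop theft":           {"likelihood": 3, "impact": 2},
}

THRESHOLD = 10  # management-agreed risk appetite (assumed value)

# A control is selected for every threat whose score meets the threshold.
for name, t in threats.items():
    score = t["likelihood"] * t["impact"]
    decision = "select control" if score >= THRESHOLD else "accept risk"
    print(f"{name}: score {score} -> {decision}")
```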


Disruptive tech and its impact on wireless protocols and networks

The Internet of Things is not a new concept. It's been around for a long time; we used to call it telemetry or sensor-based computing. But the idea that we can do it today at a very low cost, and that we can automate so many applications -- medical applications, security, energy management, all kinds of things like that -- means that there's going to be more and more happening on the network over time. And many of those applications will be mobile. (Not everything in IoT is mobile, but a lot of it will be.) So planning for that in terms of capacity, [security and cost] is made more complex. So, even though mobility opens up a lot of opportunities, it does come with a set of costs that we didn't have before.


Indoor positioning – Are we nearly there yet?

If the object you are locating and tracking happens to have a device with some unique identifier attached to it, like a tag or smartphone, things become significantly easier. Now you can have many fixed transmitters sending out pulses that are received by the device, which can then send out a “reply” (rather than a reflected pulse) that also contains its unique identifier. The transmitters can be simple and omnidirectional, but then you need a few of them (remember, each one defines a circle; in the plane, i.e., in 2D, at least 3 transmitters are needed to determine a unique position). The determination of a location from measured distances to a few fixed points is known as trilateration (check out multilateration while you’re at it).
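
Here is a minimal sketch of 2D trilateration (anchor positions and distances are invented for the example). Subtracting one circle equation from the others cancels the quadratic terms and turns the problem into a linear least-squares solve:

```python
import numpy as np

def trilaterate_2d(anchors, distances):
    """Estimate a 2D position from distances to >= 3 fixed anchors.

    Each anchor defines a circle; subtracting the first circle equation
    from the others yields a linear system, solved here by least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three transmitters at known positions; distances measured to a tag at (2, 3).
anchors = [(0, 0), (10, 0), (0, 10)]
dists = [np.hypot(2, 3), np.hypot(8, 3), np.hypot(2, 7)]
print(trilaterate_2d(anchors, dists))   # -> approximately [2. 3.]
```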


Don’t Let Cyberattacks Take A ‘Byte’ Out Of Your Bottom Line

Should a data breach occur, having an incident response plan in place can help ease the pressure in the heat of the moment. Affected systems should immediately be isolated from the remainder of the company’s infrastructure so that the root cause can be pinpointed without the damage spreading. When a data breach does occur, use it as a learning experience, extracting as much information as possible about how and why the incident occurred. That information can then be used to strengthen the IT infrastructure by plugging holes and establishing improved monitoring programs to detect threats. Response plans should be tested and updated regularly to ensure that any future threat responses are as effective and efficient as possible.



Why Optimization and WANOP for Your Cloud Is Now Easier than Ever

We’re now pushing down rich content, a variety of applications, and a lot of new use cases. The reality here is that cloud will continue to grow as more users and verticals adopt this very versatile platform. In fact, global spending on IaaS is expected to reach almost $16.5 billion in 2015, an increase of 32.8 percent from 2014, with a compound annual growth rate (CAGR) from 2014 to 2019 forecast at 29.1 percent, according to Gartner. The report goes on to state that, over time, as businesses become more comfortable with IaaS, organizations, especially in the midmarket, will eventually migrate away from running their own data centers in favor of relying primarily on infrastructure in the cloud.
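
As a quick back-of-the-envelope check (our arithmetic, not Gartner's), those two growth figures together imply roughly $12.4 billion of IaaS spend in 2014 and about $44.6 billion by 2019:

```python
# Sanity-check the Gartner figures quoted above.
spend_2015 = 16.5e9     # forecast IaaS spend for 2015 (USD)
growth_2015 = 0.328     # 32.8% increase over 2014
cagr = 0.291            # 29.1% CAGR, 2014-2019

spend_2014 = spend_2015 / (1 + growth_2015)
spend_2019 = spend_2014 * (1 + cagr) ** 5   # five years of compound growth

print(f"implied 2014 spend: ${spend_2014 / 1e9:.1f}B")   # ~ $12.4B
print(f"implied 2019 spend: ${spend_2019 / 1e9:.1f}B")   # ~ $44.6B
```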


Resiliency Testing Best Practices - Report

Every organization must put a plan in place for recoverability after an outage, but testing your enterprise resilience without full business and IT validation is ineffective. Read the white paper to learn how to put a plan in place for full functional validation, and get details on the importance of validating resiliency in a live environment; learn why small-scale recovery “simulations” are inadequate and misleading; understand why validating resilience demands involvement from both IT and the business; and get details on the checks and balances you need to maintain and validate business resilience.



Quote for the day:

"Let a man lose everything else in the world but his enthusiasm and he will come through again to success." -- H. W. Arnold
