The enterprise CIO is moving to a consumption-first paradigm
Enter the consumption-first paradigm. Whereas IT organizations of the past had to take a build-first approach out of necessity, today there is a better option: organizations can move to a consume-first paradigm. Under this paradigm, applications and services are evaluated through a consume-first methodology. If the application or service is not a good fit, it moves to a configure-first methodology; if all else fails, it falls to build-first. The goal is to consume as much as possible without having to build or configure. The evaluation process is as important as changing the paradigm itself. It is critical to clearly understand what is strategic and differentiating for the company; that understanding then becomes the hallmark for guiding which components present the greatest opportunity for focus and leverage.
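The consume-, configure-, build-first cascade described above can be sketched as a simple decision function. This is a minimal illustration with hypothetical names, not an actual evaluation framework:

```python
# Hedged sketch of the consume -> configure -> build evaluation cascade.
# The function and flag names are hypothetical, for illustration only.

def evaluate_service(fits_as_is, fits_with_config):
    """Return the sourcing decision for a candidate application or service.

    fits_as_is: the off-the-shelf service meets the need unchanged.
    fits_with_config: it meets the need after configuration.
    """
    if fits_as_is:
        return "consume"      # preferred: adopt the service as delivered
    if fits_with_config:
        return "configure"    # second choice: tailor an existing service
    return "build"            # last resort: develop in-house

# Example: a commodity capability is consumed, a near-fit system is
# configured, and only a differentiating core system is built.
print(evaluate_service(True, True))    # consume
print(evaluate_service(False, True))   # configure
print(evaluate_service(False, False))  # build
```

The ordering encodes the paradigm's preference: building is reached only after both cheaper options are exhausted.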
Is Social Media Recruiting Dead?
"One key to leveraging the value and reach of social media is to evolve as the technology does," says Kasper. At its core, social recruiting is about going where your audience is, and especially with the millennial generation of job seekers, that can mean many different social media platforms. "Even if some networks might become less popular with the younger workforce, new channels like Instagram or Snapchat are popping up to provide new avenues of engagement for recruiters. As social media evolves, so will social recruiting," Kasper says. That also means spreading out your recruiting efforts using all the technology available, like mobile and even email.
How Intel will embrace the 'delightful chaos' of Internet of Things
Curie represents what is possible for computing in the future, with chips that small being used to power wearable devices and all sorts of other objects, Smith said in a visit to CNET's New York office Wednesday. Intel, which dominates the business of supplying chips to personal computers, is eager to find the next big market that can return it to its days of heady growth. But despite all the hype around wearables and IoT, the story of this nascent market is as yet unwritten. "What we see in that segment of the market, the Internet of Things, there's lots of innovation going on," Smith said. "If anybody tells you they know who's going to be the winner three years from now, they're making it up, because nobody knows."
Personal Data Stolen Sells for 10X Price of Stolen Credit Card Numbers
The price differential is due to the ability to use identity information – birth dates, Social Security numbers, addresses, employment information, income, etc. – to open new credit accounts on an ongoing basis rather than exploiting just one account until it’s canceled. But that’s not all. “The information attackers were able to access from Anthem are key pieces of data that can be used to access someone’s financial records,” says Eric Chiu, president & co-founder of Hytrust, making it possible to find and drain individuals’ personal cash reserves.
Integration of Big Data Involves Challenges
New types of big data technologies are being introduced to meet expanding demand for storage and use of information across the enterprise. One of those fast-growing technologies is the open source Apache Hadoop and commercial enterprise versions of it that provide a distributed file system to manage large volumes of data. The research finds that currently 28 percent of organizations use Hadoop and about as many more (25%) plan to use it in the next two years. Nearly half (47%) have Hadoop-specific skills to support big data integration. For those that have limited resources, open source Hadoop can be affordable, and to automate and interface with it, adopters can use SQL in addition to its native interfaces; about three in five organizations now use each of these options.
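The point about SQL interfaces alongside Hadoop's native ones can be illustrated with a small sketch. The table name, host, and the use of the PyHive client are assumptions for illustration; query construction runs locally, while the connection itself would require a live cluster:

```python
# Hedged sketch: accessing Hadoop-resident data through a SQL-on-Hadoop
# engine (Hive, Impala, etc.) rather than native file-level interfaces.
# Table and host names are hypothetical.

def monthly_event_counts_sql(table="web_events"):
    """Build the HiveQL a SQL-on-Hadoop engine would execute."""
    return (
        f"SELECT year, month, COUNT(*) AS events "
        f"FROM {table} GROUP BY year, month"
    )

def run_on_hive(host="hive.example.com", port=10000):
    # Assumption: the PyHive client library is installed and a HiveServer2
    # endpoint is reachable. Imported lazily so the sketch runs without it.
    from pyhive import hive
    with hive.connect(host=host, port=port) as conn:
        cur = conn.cursor()
        cur.execute(monthly_event_counts_sql())
        return cur.fetchall()

print(monthly_event_counts_sql())
```

For teams with SQL skills but no Hadoop specialists, this is the appeal the research alludes to: the familiar query stays the same while the engine handles distribution.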
Data Mining Interview: Dr. A. Fazel Famili
In the early days of data mining work, especially applied data mining projects, our challenge was to educate the owners of data about what data mining was and what the advantages would be for companies in opening the door and providing us with access to their real-world data. Nowadays, one of the biggest challenges is to make sure that we have access to, understand, and properly choose the attributes that influence the problem we are investigating. For example, if our goal is to predict the potential failure of a component in a complex system for which we have access to hundreds of parameters, then unless the particular parameter(s) associated with the failure are included in our data, no analysis will be able to uncover the pattern.
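That dependence on the right attributes can be shown with a tiny synthetic example (all parameter names and values here are made up): failures separate cleanly on the driving parameter, but become indistinguishable when it is dropped.

```python
# Synthetic illustration: failure is driven by temperature. If temperature
# is excluded from the data, no model can recover the pattern.
records = [
    # (vibration, pressure, temperature, failed)
    (0.2, 101, 40, False),
    (0.2, 101, 95, True),
    (0.7, 99, 42, False),
    (0.7, 99, 97, True),
]

# With temperature available, a simple threshold separates the classes:
separable = all((temp > 90) == failed for _vib, _pres, temp, failed in records)

# Without temperature, failed and healthy records share identical
# (vibration, pressure) profiles, so no model can tell them apart:
failed_profiles = {(v, p) for v, p, _t, f in records if f}
indistinguishable = any((v, p) in failed_profiles
                        for v, p, _t, f in records if not f)

print(separable, indistinguishable)  # True True
```

The data volume is irrelevant here; hundreds of parameters do not help if the one that matters is missing.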
Guidance Regarding Methods for De-identification
This page provides guidance about methods and approaches for achieving de-identification in accordance with the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule. The guidance explains and answers questions regarding the two methods that can be used to satisfy the Privacy Rule's de-identification standard: Expert Determination and Safe Harbor. It is intended to help covered entities understand what de-identification is, the general process by which de-identified information is created, and the options available for performing de-identification.
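At its simplest, the Safe Harbor method amounts to removing an enumerated list of direct identifiers. The sketch below is a rough illustration of that idea only: the field names are hypothetical, the list is abbreviated (the actual rule enumerates 18 identifier types and also requires generalizing dates and small geographic units), and this alone does not constitute HIPAA compliance.

```python
# Hedged sketch of the Safe Harbor idea: strip a fixed set of direct
# identifiers from a record. Field names are hypothetical; the real rule's
# list of 18 identifier types is longer and has additional conditions.

SAFE_HARBOR_FIELDS = {
    "name", "street_address", "phone", "email",
    "ssn", "medical_record_number", "birth_date",
}

def deidentify(record):
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "birth_date": "1980-04-02",
    "diagnosis": "J45.909",
    "state": "OH",
}
print(deidentify(patient))  # {'diagnosis': 'J45.909', 'state': 'OH'}
```

Expert Determination, by contrast, is a statistical judgment about re-identification risk rather than a mechanical field-removal rule.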
Hybrid Cloud Turns IT from a Cost Center into a Revenue Center
There has already been great progress in approaches to federation and interoperability, which are helping to improve hybrid cloud's cohesion. One recent step toward more interoperable cloud environments is the addition of Security Assertion Markup Language (SAML) support to OpenStack, the leading open source cloud platform. SAML enables single sign-on for Web properties, which makes B2B and B2C transactions much easier. Integrating it into OpenStack-powered clouds allows them to federate more easily with other cloud environments, ultimately giving cloud buyers more hybrid options.
“Cyberspace” must die. Here’s why
The need to abandon the false digital dualism embodied in the term “cyberspace” (hat tip to Nathan Jurgenson and PJ Rey) becomes more urgent as everyday items become connected to the internet. To appreciate how anachronistic the word has become, consider whether your fitness tracker or smart thermostat exists in cyberspace or the real world. When leaked NSA documents talked about strong decryption capabilities as the “price of admission for the U.S. to maintain unrestricted access to and use of cyberspace,” that wasn’t about mastering Neuland. It was about being able to access and exploit the entire connected world, smart homes and all.
The Role of Containers in Modern Applications
With fewer moving parts having to be deployed alongside the application, the application becomes easier to support in a production setting. Of course, this is predicated on the service provider and the service being highly reliable. For example, many applications deployed with Docker contain their own MySQL instance rather than leveraging Amazon's RDS MySQL service. The latter provides redundancy, availability, backup, and so on as part of the service; if the application is instead deployed as a traditional n-tier stack using Docker, all of those non-functional requirements continue to burden the operations groups.
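The trade-off can be made concrete with a docker-compose sketch. Image names and the RDS endpoint below are hypothetical placeholders, not a recommended configuration:

```yaml
# Hedged sketch: the same app packaged two ways. Variant A bundles MySQL
# in the deployment; variant B points at a managed database endpoint,
# shifting backup/HA/upgrades to the provider.

# Variant A: self-managed database container
services:
  app:
    image: example/app:latest
    environment:
      DB_HOST: db
  db:
    image: mysql:8
    volumes:
      - dbdata:/var/lib/mysql   # ops team owns backup, HA, upgrades
volumes:
  dbdata: {}

# Variant B: managed service (no db container to operate)
# services:
#   app:
#     image: example/app:latest
#     environment:
#       DB_HOST: mydb.example.us-east-1.rds.amazonaws.com
```

Variant A keeps everything portable inside the container boundary; variant B trades some portability for offloading the non-functional requirements the passage describes.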
Quote for the day:
"The more you criticize, the more you will recognize others' limitations." --@ShawnUpchurch