Q and A on The Scrum Culture
Bluntly speaking, command and control is not compatible with Scrum. As soon as you allow Scrum to spread throughout a command-and-control enterprise, there is a clash of cultures, and only one will survive. On the one hand, command and control is more effective in a production-line environment, and it is usually also the dominant approach in the organization. It therefore has the home-field advantage and is the primary source of "gravity" drawing people back to the old way of doing things. The Scrum Culture, on the other hand, is more effective in development and research environments and is what more and more people demand from their employers.
Can OpenStack free up enterprise IT to support software-driven business?
Although it is often considered a way to build a private cloud, OpenStack can also be used to provision datacentre hardware directly. Subbu Allamaraju, chief engineer for cloud at eBay, said he would like to use OpenStack as the API for accessing all datacentre resources at the auction site, but the technology is not yet mature enough. Walmart's Junejan added: "We aim to move more markets onto OpenStack and eventually offer datacentre as a service." OpenStack can also be used to manage physical, bare-metal server hardware. James Penick, cloud architect at Yahoo, said the internet portal and search engine had been using bare-metal OpenStack alongside virtualisation.
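To make the "API for the datacentre" idea concrete, here is a minimal sketch using the openstacksdk Python library, one of several OpenStack clients. The cloud name, image, flavor and network names are placeholder assumptions, not details from the article; the point is that the same Compute API call can, with a bare-metal driver such as Ironic behind it, land on physical machines rather than virtual ones.

```python
import openstack

# Connect using a clouds.yaml entry or environment variables.
# "mycloud" is a placeholder assumption for this sketch.
conn = openstack.connect(cloud="mycloud")

# Look up the pieces a server needs; all names are placeholders.
image = conn.compute.find_image("ubuntu-14.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# With Nova's bare-metal (Ironic) driver behind this API, the same
# request can be scheduled onto physical hardware instead of a VM.
server = conn.compute.create_server(
    name="demo-node",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```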
Certification, regulation needed to secure IoT devices
Xie explained in an interview with ZDNet that in traditional networks, where components such as switches and routers were wired, there were well-established architecture frameworks that outlined where and how firewalls should be connected to switches, be it redundantly or as a single connection. These guidelines would no longer be effective with SDNs, where these "wires" were now defined by software and where switches could be "relocated" at the stroke of a key, he said. Firewalls, for instance, would need to continue to enforce the necessary policies to secure a database within an SDN even when that database is virtually relocated to a different city. "So all that becomes more intangible, and the big challenge is for security to be able to adapt to that kind of architecture," he noted.
Net Neutrality Rules Forcing Companies To Play Fair, ... Giant ISPs Absolutely Hate It
While the FCC's rules on interconnection are a bit vague, the agency has made it clear it will be looking at complaints on a "case by case basis" to ensure deals are "just and reasonable." Since this is new territory, the FCC thought this would be wiser than penning draconian rules that either overreach or contain too many loopholes. This ambiguity obviously has ISPs erring on the side of caution when it comes to bad behavior, which is likely precisely what the FCC intended. ... And by "well-functioning private negotiation process," the ISPs clearly mean one in which they were able to hold their massive customer bases hostage in order to strong-arm companies like Netflix into paying direct interconnection fees. One in which regulators were seen but not heard, while giant monopolies and duopolies abused the lack of last-mile competition.
Leaderless Bitcoin Struggles to Make Its Most Crucial Decision
The technical problem, which most agree is solvable, is that Bitcoin’s network now has a fixed capacity for transactions. Before he or she disappeared, Bitcoin’s mysterious creator, Satoshi Nakamoto, limited the size of a “block,” or group of transactions, to one megabyte. The technology underlying Bitcoin works because a network of thousands of computers contribute the computational power needed to confirm every transaction and record them all in a permanent, publicly accessible ledger called the blockchain (see “What Bitcoin Is and Why It Matters”). Every 10 minutes, an operator of one of those computers wins the chance to add a new block to the chain and receives freshly minted bitcoins as a reward. That process is called mining.
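A quick back-of-the-envelope calculation shows why the one-megabyte cap translates into a hard ceiling on throughput. The average transaction size used below is an assumption for illustration, not a figure from the article.

```python
# Rough estimate of Bitcoin's transaction capacity under the 1 MB
# block-size limit described above.
BLOCK_SIZE_BYTES = 1_000_000    # 1 MB limit set by Satoshi Nakamoto
BLOCK_INTERVAL_SECONDS = 600    # a new block roughly every 10 minutes
AVG_TX_SIZE_BYTES = 250         # ASSUMPTION: typical transaction size

tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS

print(f"~{tx_per_block} transactions per block")        # ~4000
print(f"~{tx_per_second:.1f} transactions per second")  # ~6.7
```

On these assumptions the whole network clears only a handful of transactions per second, which is the "fixed capacity" at the heart of the block-size debate.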
Machine learning as a fluid intelligence harvesting service
Developers are only human. They have limited capabilities, attention spans and so on. But data and the knowledge that can be gained from it are seemingly unlimited. Even the world’s data scientists and domain experts have to prioritize their efforts to extract insights from some relevant portion of the vast ocean of information that surges around them. With only so many hours in the day, data scientists and analysts need to leverage every big data acceleration, automation and productivity tool in their arsenals to sift, sort, search, infer, predict and otherwise make sense of the data that’s out there. As a result, many of these professionals have embraced machine learning.
Software development skills for data scientists
You should learn a principle called DRY, which stands for Don't Repeat Yourself. The basic idea is that many tasks can be abstracted into a function or piece of code that can be reused regardless of the specific task. This is more efficient not only from a "lines of code" perspective, but also in terms of your time. It can be taken to an illogical extreme, where code becomes very difficult to follow, but there is a happy medium to strive for. A good rule of thumb: if you find yourself writing the same line of code with only minor changes each time, think about how you can turn that code into a function that takes the changes as parameters, as in the sketch below. Avoid hard-coding values into your code. It is also good practice to revisit code you've written in the past to see if it can be made cleaner, more efficient, or more modular and reusable. This is called refactoring.
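As a concrete illustration of that rule of thumb, here is a minimal Python sketch. The file-naming scheme and the cleaning step are assumptions made up for the example; the point is that the only thing varying between the repeated lines (the year) becomes a parameter.

```python
import pandas as pd

# Repetitive version: the same line copy-pasted, only the year changes.
#   sales_2013 = pd.read_csv("sales_2013.csv").dropna()
#   sales_2014 = pd.read_csv("sales_2014.csv").dropna()
#   sales_2015 = pd.read_csv("sales_2015.csv").dropna()

def load_sales(year: int) -> pd.DataFrame:
    """Load and clean one year's sales file.

    The file-naming scheme and the dropna() cleaning step are
    illustrative assumptions, not a prescribed recipe.
    """
    return pd.read_csv(f"sales_{year}.csv").dropna()

# One comprehension replaces the copy-pasted lines and scales to any
# range of years (uncomment once the files exist):
# sales = {year: load_sales(year) for year in range(2013, 2016)}
```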
Marketing vs. IT: Data Governance Bridges the Gap
The key is first to understand how to govern information in the modern data era, not to go back to the stone age when marketers (and, for that matter, all business users) had to follow naming conventions, put everything into schemas and build their work into models. Today, IT teams can empower the data-driven marketing organization with better tools and automation across the entire analytic process, including a new class of self-service data preparation solutions that simplify, automate and reduce its manual steps. This new self-service data preparation "workbench" gives marketing, sales, finance and business operations analysts a shared environment that captures how they work with data, where they get it from and, ultimately, which BI tool they use to analyze it.
Full Stack Web Development Using Neo4j
Neo4j is a graph database, which means, simply, that rather than being stored in tables or collections, data is stored as nodes and relationships between nodes. In Neo4j, both nodes and relationships can carry properties with values. ... While Neo4j can handle "big data", it isn't Hadoop, HBase or Cassandra, and you won't typically be crunching massive (petabyte-scale) analytics directly in your Neo4j database. But when you are interested in serving up information about an entity and its data neighborhood (as you would when generating a web page or an API result), it is a great choice, from simple CRUD access to a complicated, deeply nested view of a resource.
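A minimal sketch with the official Neo4j Python driver shows the "entity and its neighborhood" pattern described above. The connection details, the User label, and the FRIEND relationship are assumptions made up for illustration.

```python
from neo4j import GraphDatabase

# Connection details are placeholder assumptions for this sketch.
driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

with driver.session() as session:
    # Create a tiny neighborhood: two nodes and a relationship,
    # each able to carry properties with values.
    session.run(
        "MERGE (a:User {name: $a})-[:FRIEND {since: 2015}]->(b:User {name: $b})",
        a="alice", b="bob",
    )
    # Serve up an entity and its immediate data neighborhood in one
    # query, as you might when rendering a web page or an API result.
    result = session.run(
        "MATCH (u:User {name: $name})-[r]-(neighbor) "
        "RETURN u.name AS user, type(r) AS rel, neighbor.name AS neighbor",
        name="alice",
    )
    for record in result:
        print(record["user"], record["rel"], record["neighbor"])

driver.close()
```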
Executive's guide to the hybrid cloud (free ebook)
Hybrid strategies have begun making inroads in several industries, including the financial sector, healthcare, and retail sales. In a widely cited report, Gartner predicted that nearly 50 percent of enterprises will have hybrid cloud deployments by 2017. Hybrid clouds can help ensure business continuity, allow provisioning to accommodate peak loads, and provide a safe platform for application testing. At the same time, they give companies direct access to their private infrastructure and let them maintain on-premises control over mission-critical data. Is hybrid an ideal strategy for all companies, or a panacea for all cloud concerns? ... This ebook will help you understand what hybrid clouds offer, and where their potential strengths and liabilities lie.
Quote for the day:
“It’s what you do in your free time that will set you free—or enslave you.” -- Jarod Kintz