February 26, 2015

Employing Enterprise Architecture for Applications Assurance
An enterprise architect will consolidate individual systems into a service-oriented architecture and eliminate one-off personnel datasets, asset inventories, and expense tracking and reimbursement processes. He or she can promote the adoption of a single technical architecture (such as Microsoft’s .NET or Java Platform, Enterprise Edition) to avoid the headaches of interconnecting incompatible technologies. Portability and interoperability are the key architectural objectives. Each of these actions will reduce the attack surface of the entire enterprise and enable security solutions that scale beyond what would be economical for individual systems.
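
To make the consolidation idea concrete, here is a minimal, hypothetical sketch (in Python rather than the .NET or Java EE stacks the article names, with all identifiers invented): a single personnel service that every downstream application queries instead of keeping its own one-off dataset.

```python
# Minimal sketch of the consolidation idea: one shared personnel service that
# every application queries, instead of each keeping its own one-off dataset.
# Names and fields are hypothetical; the article's examples are .NET / Java EE,
# but the pattern is the same in any stack.
from dataclasses import dataclass

@dataclass
class Employee:
    employee_id: str
    name: str
    department: str

class PersonnelService:
    """Single authoritative source consumed by HR, asset, and expense apps."""
    def __init__(self):
        self._records = {}

    def add(self, emp: Employee) -> None:
        self._records[emp.employee_id] = emp

    def lookup(self, employee_id: str) -> Employee:
        return self._records[employee_id]

# Asset-inventory and expense apps call the same service rather than
# maintaining separate copies, so there are fewer datasets to secure.
directory = PersonnelService()
directory.add(Employee("e-1001", "A. Example", "Finance"))
print(directory.lookup("e-1001").department)
```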


Will Big Data Make Data Scientists Redundant?
The aim of the Automatic Statistician, a project backed by Google, is to create “an artificial intelligence for data science”. Specifically, it is creating software algorithms that can spot patterns and report them in simple, easy-to-understand text, for example: “The data shows that Saturdays were consistently warmer than Sundays throughout the year, and this correlates with higher turnout at outdoor events”. As well as humans, machines can also interpret these results and use them as the basis for further analysis, automatically selecting appropriate models and predictions to test them against. The program was developed by a team of scientists at Cambridge University collaborating with others at MIT.
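
The project's internals aren't described here, but the basic loop, trying candidate models, scoring them, and translating the winner into plain language, can be sketched briefly. Everything below (the polynomial model family, BIC scoring, the wording of the report) is an illustrative assumption, not the Automatic Statistician's actual method.

```python
# Illustrative sketch only; not the Automatic Statistician's actual code.
# It mimics the idea: try several candidate models, score each, keep the best,
# and report the result in plain language.
import numpy as np

def bic(n, rss, k):
    """Bayesian information criterion for a least-squares fit."""
    return n * np.log(rss / n) + k * np.log(n)

def auto_describe(x, y, max_degree=3):
    """Fit polynomial models of increasing degree and describe the best one."""
    n = len(x)
    best = None
    for degree in range(max_degree + 1):
        coeffs = np.polyfit(x, y, degree)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        score = bic(n, rss, degree + 1)
        if best is None or score < best[0]:
            best = (score, degree, coeffs)
    _, degree, coeffs = best
    shapes = {0: "roughly constant", 1: "a steady linear trend",
              2: "a curved (quadratic) trend", 3: "a cubic trend"}
    return f"The data shows {shapes[degree]} (coefficients: {np.round(coeffs, 3)})."

# Example: weekly temperatures with a mild upward trend plus noise.
rng = np.random.default_rng(0)
weeks = np.arange(52, dtype=float)
temps = 5 + 0.3 * weeks + rng.normal(0, 2, size=52)
print(auto_describe(weeks, temps))
```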


Facebook’s Startling Report On The State Of Global Internet Connectivity
Although internet usage exploded over the last decade, growth has slowed as developed nations effectively max out on active users. Facebook stated, “The rate of growth declined for the fourth year in a row to just 6.6% in 2014 (down from 14.7% in 2010). At present rates of decelerating growth, it won’t reach 4 billion people until 2019.” The message is a stark one – 40% of the world’s population hasn’t connected to the internet even once. Internet.org points out, “Without the cooperation of industry, governments and NGOs working together to improve the global state of connectivity by addressing the underlying reasons people are not connected to the internet, connectivity may remain permanently out of reach for billions of people.”
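
The “4 billion by 2019” figure is essentially compounding arithmetic. The sketch below shows that kind of back-of-envelope projection; the starting user count and the pace at which growth keeps decelerating are illustrative assumptions, not numbers from the report (only the 6.6% growth rate comes from the excerpt above).

```python
# Back-of-envelope projection of internet users under decelerating growth.
# Starting count and deceleration factor are assumptions for illustration only.
users = 3.0e9          # assumed ~3 billion connected users at end of 2014 (illustrative)
growth = 0.066         # 6.6% growth in 2014 (from the excerpt)
decay = 0.97           # assume each year's growth rate is ~97% of the prior year's
year = 2014
while users < 4e9:
    year += 1
    growth *= decay
    users *= 1 + growth
    print(f"{year}: {users / 1e9:.2f} billion users (growth {growth:.1%})")
# With these assumed inputs the 4 billion mark is crossed in 2019.
```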


Cross-Border Data Restrictions Threaten Global Economic Growth
The motivations vary for restricting cross-border data flows. Some policy makers have embraced data-residency requirements as a tool to protect local tech companies from international competition. Others have pursued restrictions on data flows as a vehicle for safeguarding how consumers' personal information is used and transmitted, concerns that were only exacerbated by the disclosures of former NSA contractor Edward Snowden. Those privacy concerns have been prominent in the European Union, which has historically taken a more protective approach toward users' personal information than U.S. regulators.


Responsible Disclosure: Cyber Security Ethics
The debate over responsible disclosure of vulnerabilities has been going on for years, but it has recently been reignited by Microsoft’s decision to end its public Advance Notification Service, as well as Google’s decision to publish details of a vulnerability found in Windows the day before Microsoft was set to make the patch available. This raises the question: once vulnerabilities are discovered, should one disclose them? If so, what’s the appropriate amount of time? Do we, as a security community, need to re-examine the process by which we disclose vulnerabilities?


Careful: Don't Drown in Your Data Lake!
In a data lake, everything is just poured in, in an unstructured way. A molecule of water in the lake is equal to any other molecule and can be moved to any part of the lake where it will feel equally at home. This means that data in a lake has a great deal of agility – another word which is becoming more frequently used these days – in that it can be configured or reconfigured as necessary, depending on the job you want to do with it. A data lake contains data in its rawest form – fresh from capture, and unadulterated by processing or analysis. It uses what is known as object-based storage, because each individual piece of data is treated as an object, made up of the information itself packaged together with its associated metadata, and a unique identifier.
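
As a rough illustration of that object-based storage idea (not any particular product's API), each stored item can be modeled as the raw payload plus its metadata and a generated unique identifier:

```python
# Minimal sketch of the object-storage idea behind a data lake: each item is
# stored as-is, bundled with its metadata and a unique identifier. This is an
# illustration of the concept, not any particular product's API.
import uuid
from dataclasses import dataclass, field

@dataclass
class DataObject:
    payload: bytes                      # raw, unprocessed data "fresh from capture"
    metadata: dict                      # e.g. source, capture time, format
    object_id: str = field(default_factory=lambda: str(uuid.uuid4()))

class DataLake:
    def __init__(self):
        self._store = {}                # flat namespace: no folders, no schema

    def put(self, payload: bytes, **metadata) -> str:
        obj = DataObject(payload, metadata)
        self._store[obj.object_id] = obj
        return obj.object_id

    def get(self, object_id: str) -> DataObject:
        return self._store[object_id]

lake = DataLake()
oid = lake.put(b'{"temp": 21.3}', source="sensor-17", captured="2015-02-26T08:00Z")
print(lake.get(oid).metadata)
```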


Global Open Trusted Technology Provider™ Standard
This is not just a US issue; every country is concerned about securing its critical infrastructure and underlying supply chains. Unfortunately, we are beginning to see attempts to address these global concerns through local solutions (i.e., country-specific and disparate requirements that raise the toll on suppliers and could set up barriers to trade). The point is that an international technical solution (e.g., a standard and accreditation program for all constituents in global supply chains), which all countries can adopt, helps address the geopolitical issues by establishing a common standard and common conformance requirements, lifting all boats toward becoming trusted suppliers.


Big data trend now being applied to managing human resources
"Statistics have their uses, but you don't want a number to sum up the whole employee- employer relationship," says MacDonald, who works atRyerson University's Ted Rogers School of Management. "It may well be a self-defeating mechanism if you're trying to manage people by being incredibly precise by measuring their performance. If that measurement process becomes demoralizing for staff, you've shot yourself in the foot." So called "big data" techniques have been embraced quickly by sales, finance and marketing departments eager to exploit trends that will help generate more revenue for their organizations. The move towards harnessing computer power has been slower in areas that don't generate profit, such as the personnel department. But that's beginning to change.


Changing Business Value Models
That’s certainly the way that it’s supposed to work. But it’s no good having a great business value model unless it is adequately supported by the right enterprise architecture. Enterprise architecture is the formal description of the components that make up an enterprise, the relationship between those components, and how collectively those components either enable or constrain the management of the organization and the operation of its businesses. The realization of an enterprise architecture is a working, performing system. Problems arise if there is a mismatch of any sort between the ways in which a business is expected to produce value and the ability of the system described by an enterprise architecture to deliver on those expectations.
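
A toy sketch can make that definition concrete: represent components and their relationships as data, then check whether a given business capability is actually enabled end to end. All component and capability names below are invented for illustration.

```python
# Toy illustration of an enterprise-architecture description as data:
# components, the relationships between them, and a check of whether a
# business capability is actually supported end to end. Names are invented.
from collections import defaultdict

components = {"web_storefront", "order_service", "payment_gateway", "erp"}
# (from, to) pairs meaning "depends on" / "is enabled by"
relationships = [
    ("web_storefront", "order_service"),
    ("order_service", "payment_gateway"),
    ("order_service", "erp"),
]
capabilities = {"online_sales": ["web_storefront", "order_service", "payment_gateway"]}

def supports(capability: str) -> bool:
    """A capability is enabled only if every component it needs exists and
    every dependency of those components is also present."""
    needed = set(capabilities[capability])
    deps = defaultdict(set)
    for src, dst in relationships:
        deps[src].add(dst)
    frontier = list(needed)
    while frontier:
        c = frontier.pop()
        for d in deps[c]:
            if d not in needed:
                needed.add(d)
                frontier.append(d)
    return needed <= components

print("online_sales enabled:", supports("online_sales"))   # True
# Remove the ERP system and the same model now shows the constraint.
components.discard("erp")
print("online_sales enabled:", supports("online_sales"))   # False
```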


Building an emergency internet in the white spaces
Comparing white spaces technologies with other radio technologies shows several key advantages. Firstly, it's multipoint, with no need for line-of-sight connections or to locate end points accurately. That means it's able to operate in high winds and during earthquake aftershocks that might cause misalignment of point-to-point systems and might break cables or fibre. Secondly, you don't need to worry about exact alignment, and can even operate through obstructions and over water. The equipment used in the Philippines was weatherproof and could be installed anywhere, with a normal antenna. Base stations can be used as repeaters, though the current maximum range is an impressive 12km (with future systems promising up to 40km).
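
The range and repeater claims translate into simple arithmetic. As a rough, hypothetical illustration (the 50 km link distance is invented; the per-hop ranges come from the article):

```python
# Rough estimate of how many repeater hops a white-space backhaul link needs
# to span a given distance. The per-hop ranges come from the article; the
# 50 km example distance is an arbitrary illustration.
import math

def hops_needed(distance_km: float, per_hop_km: float) -> int:
    return math.ceil(distance_km / per_hop_km)

distance = 50  # e.g. linking an inland town to a coastal uplink (hypothetical)
print(f"Current 12 km hops: {hops_needed(distance, 12)} links")   # 5
print(f"Promised 40 km hops: {hops_needed(distance, 40)} links")  # 2
```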



Quote for the day:

"Even the demons are encouraged when their chief is "not lost in loss itself." -- John Milton
