One big result: 65 percent of respondents said they believe they can stay safe online. But that confidence is coupled with apparent apathy and ignorance. Fifty-eight percent of respondents said they were not taught how to stay safe online (or they weren't sure if they'd been taught Internet security) in school. Sixty-seven percent of respondents said they hadn't heard about any cyberattacks in the past year. The survey was conducted from July 29 to Aug. 10, in the wake of the Office of Personnel Management breach, the Target breach and many other high-profile cyber incidents. It wasn't all bravado. Plenty of respondents said they weren't interested in pursuing a cybersecurity career because they didn't think they had the right skills.
The future of computing is pretty clear to those who know where to look. With everything from our homes to our factories connected and generating more information than can be stored in a single data center, let alone processed by a human being, the race is on to build computers that can help people make sense of the digital information that threatens to overwhelm them. Intel’s role in this digital overload is threefold. First, it wants to put as many chips as it can into what are called edge devices—the laptops, watches, gateways, and other devices that we interact with or that gather information from the world and feed it back to the network. In many ways, because Intel missed out on mobile, it lost out on much of this opportunity.
The difference with Fixie is its conventions-based approach: there is no need to use attributes to mark classes and methods as tests. With most other testing frameworks, you have to decorate classes and/or methods with attributes that tell the test runner (e.g. Visual Studio Test Explorer) that they are tests to be executed. With Fixie, this "test discovery" is not driven by attributes, but by following a set of default conventions. Out of the box, once installed via NuGet, Fixie comes with a set of default conventions that describe test discovery. The first convention has to do with how test classes are named: name a test class whatever you wish, but suffix the name with "Tests".
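Fixie itself is a C#/.NET framework, but the convention-based discovery idea it uses is easy to sketch in any language. The following is a minimal, hypothetical Python illustration (not Fixie's actual code): a discovery routine that treats any class whose name ends with "Tests" as a test class and every public method on it as an individual test, with no attributes or decorators involved.

```python
import inspect
import sys


def discover_tests(module):
    """Find tests by naming convention rather than attributes/decorators:
    any class whose name ends with "Tests" is a test class, and each of
    its public methods is treated as an individual test."""
    cases = []
    for class_name, cls in inspect.getmembers(module, inspect.isclass):
        if not class_name.endswith("Tests"):
            continue  # convention: only *Tests classes hold tests
        for method_name, _ in inspect.getmembers(cls, inspect.isfunction):
            if not method_name.startswith("_"):  # public methods only
                cases.append((class_name, method_name))
    return cases


# A stand-in test class that follows the naming convention.
class CalculatorTests:
    def addition_works(self):
        assert 1 + 1 == 2

    def _helper(self):  # leading underscore: not discovered as a test
        pass


print(discover_tests(sys.modules[__name__]))
# → [('CalculatorTests', 'addition_works')]
```

The payoff of this style, in Fixie as in the sketch, is that test code stays free of framework annotations; the convention itself is the contract between your code and the runner.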
The word disruption has many connotations in the English language. I just didn’t realize how that would create such a wide misapplication of the word “disruption” into things that I never meant it to be applied to. In 1998, the Academy of Management meetings were held in Silicon Valley and the keynote speaker was Andy Grove, then Chairman of Intel. The very first slide in his presentation was about the theory of disruption. During his presentation, Grove said: “We’re not calling it ‘Disruptive Innovation,’ we’re calling it the ‘Christensen Effect.’” They were doing this to be more precise in their language, because they found the term “disruption” to be too broad and easily misapplied.
It's no secret that humans have spent the last few decades using technology to automate as many manual tasks as possible. However, the data center that powers much of this automation is still very much a manual operation in a shockingly large number of organizations. According to a new research report sponsored by Intel DCM, 43% of data centers use manual methods for tasks like capacity planning and forecasting. This State of the Data Center report surveyed 200 data center managers operating in the US and UK. Jeff Klaus, the general manager of data center solutions at Intel, said that he was surprised by how high that number was. ... One potential explanation is that the operators simply do not know what automation capabilities are available to them.
"We've done a lot of work getting cameras and computer vision optimized in the phone space," says Raj Talluri, who oversees mobile computing for Qualcomm. "Typically it's harder in the phone space - a phone has a pinhole camera and is always moving - but now we're bringing that technology into this space where the application is a little different, but the technology we built applies perfectly." Talluri also envisions the reduced delay enabling new applications for home monitoring. "What you have is a much smarter camera," he says. "What I'd call a conscious camera of what's happening in the scene." ... Talluri suggests Qualcomm's tech could go further than that, like knowing to ignore a car that passes by outside a window, all without uploading any footage.
Chase Pay is also promising superior security, a critical selling point after retailers including Target Corp and Home Depot Inc suffered from hacking attacks, Smith said. Longer term, Chase also hopes merchants will offer more discounts through Chase Pay, encouraging consumers to use the technology more. Chase Pay will initially work for consumers who already have Chase credit, debit, and prepaid cards, Smith told Reuters in an interview. There are about 94 million of those cards outstanding now in the United States, and the bank has more spending on them than any other issuer. The app will work on Apple and Android-based phones. JPMorgan Chase's consumer bank has already factored the system's near-term launch costs into its expense estimates, and expects the benefits to come over the medium to long term.
Late last month, the Office of Personnel Management admitted that 5.6 million fingerprints had been stolen from its servers -- not just 1.1 million as had been reported over the summer. Some of these fingerprints belonged to federal employees with secret clearances. Meanwhile, if a password is stolen, it is relatively simple to reset it with a different one. It is currently not practical, however, to provide users with new fingerprints, voices, or eyeballs. That puts biometrics in the same category of data as other permanent personal identifiers, such as Social Security numbers. Since they can retain their value for years -- and will only become more valuable as the use of biometrics expands -- they are likely to become prime targets for hackers. According to Munshani, a better use of biometrics is to reserve them for second-level controls.
The role of EA has traditionally been a strategic one: helping the organization anticipate large-scale change that can impact its revenue and profit margin, and plan for and implement new (or modified) business and technology capabilities that address that change effectively. The struggle is in reconciling the strategic nature of EA with the agile demands the organization imposes on EA as a practice. Strategy, by definition, is meant to define a business vision followed by a set of goals, objectives, and roadmaps that chart the organization's path toward realizing that vision over a medium- to long-term horizon. Strategies necessarily take time to develop and need to be vetted thoroughly with the organization's principal stakeholders, internal and external, before being accepted across the organization (and its partners).
This new modelling technique was developed because visualizing concurrency in, for example, UML does not offer a satisfying solution. Many software engineers confirmed this, and for this reason the UED has been quickly adopted by many design teams within Philips Healthcare. For example, a sequence diagram is often used to depict a specific use case, where occasionally limited thread interaction is included. Other shortcomings of UML with respect to a UED will become clear in the rest of the document. The UED depicts the interaction between all threads in a single diagram, and it can depict all information that is relevant for an execution architecture.
Quote for the day:
"A good manager is a man who isn't worried about his own career but rather the careers of those who work for him." -- H.S.M. Burns