September 25, 2016

By starting with a well-understood contract or test case to incrementally test a feature requirement, you ensure that each small iterative unit of work, as it completes, meets that contract and remains releasable, deployable software. Exploratory testing tools for new feature development come into play, as do coverage tools that send data showing anomalies between releases back to the quality process. Coveralls.io is a great tool that’s easy to configure and has wonderful visualizations for the most popular languages, while Jenkins has a highly customizable dashboard. ... Technology can’t solve all problems, however, so developers and testers will need to change some of their workflows to master CT. These concepts are closely linked to the Agile and DevOps practices you are probably already using, so adapting testing in this way should not be a huge shift.
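The "start with a contract" idea can be made concrete with an ordinary test that pins down the expected behavior before the feature is iterated on. A minimal sketch in Python; the `slugify` function and its rules are invented purely for illustration:

```python
# A "contract" pinned down as a test before the feature is built out.
# The slugify function and its rules are hypothetical examples.

import re

def slugify(title: str) -> str:
    """Turn a title into a URL slug (lowercase, hyphen-separated)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

def test_slugify_contract():
    # The contract: stable, releasable behavior every iteration must keep.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  Spaces  ") == "spaces"
    assert slugify("already-a-slug") == "already-a-slug"

test_slugify_contract()
print("contract holds")
```

As long as this test passes after each iteration, the unit of work stays releasable; a coverage tool layered on top then reports which contract paths each release exercises.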


Google Allo: Don't use it, says Edward Snowden

Allo does support end-to-end encryption, which should make it difficult for anyone but the recipient and sender to view the contents of messages; however, Google was criticized by Snowden and other privacy advocates for leaving it off by default. Allo relies on the encryption protocol used by Signal, which Snowden has vouched for as a private messaging app, but in Allo it is only active when users are in Incognito Mode. "We've given users transparency and control over their data in Google Allo. And our approach is simple -- your chat history is saved for you until you choose to delete it. You can delete single messages or entire conversations in Allo," Google said in a statement to TechCrunch.


Investing in AI offers more rewards than risks

While some may argue it’s impossible to predict whether the risks of AI applications to business are greater than the rewards (or vice versa), analysts predict that by 2020, 5 percent of all economic transactions will be handled by autonomous software agents. The future of AI depends on companies willing to take the plunge and invest, no matter the challenge, to research the technology and fund its continued development. Some are even doing it by accident, like the company that paid a programmer more than half a million dollars over six years, only to learn he automated his own job. Many of the AI advancements are coming from the military. The U.S. government alone has requested $4.6 billion in drone funding for next year, as automated drones are set to replace the current manned drones used in the field.


Crowdsourcing Data Governance

This variation of context is why the right operating model setup is so important for any data governance initiative, especially those just getting started. A successful data governance initiative will bring change, and so time becomes yet another dimension of the context. I’ve seen it happen many times: organizations launch with a best-in-class operating model to drive their stewardship. They gain adoption, and the resulting change makes the original operating model obsolete, or rather stretches it to its limit. This is why I am absolutely convinced that a data governance platform that aims to be successful needs a capability for operating model configuration: your roles, responsibilities, workflows, dashboards, views, use cases, and more.


Bossie Awards 2016: The best open source application development tools

For years and years, we’ve been building applications that collect data from the users and serve it back to them. We’re finally starting to do something with that data. Along with the best open source tools for building web apps, native apps, native mobile apps, and robotics and IoT apps, this year’s Bossie winners in application development include top projects for data analysis, statistical computing, machine learning, and deep learning. After all, if our applications can be reactive, responsive, and even “ambitious,” they can also be intelligent.


Is this the age of Big OLAP?

What has dogged OLAP, though, is its scalability. Most OLAP servers run on single, albeit beefy, servers, which limits the parallelism that can be achieved and therefore imposes de facto limits on data volumes. Customers who hit these scalability ceilings may contemplate using Big Data technologies, like Hadoop and Spark, but those tend not to employ the dimensional paradigm to which OLAP users are accustomed. What to do? Well, a few vendors have decided to take Hadoop and Spark, and leverage them as platforms on which big OLAP cubes can be built and run. ... Their approach has been to let people in those enterprises work in the OLAP environments they are comfortable with and, at the same time, make use of their Hadoop clusters.


Deep Learning in a Nutshell: Reinforcement Learning

Reinforcement learning is about positive and negative rewards (punishment or pain) and learning to choose the actions which yield the best cumulative reward. To find these actions, it’s useful to first think about the most valuable states in our current environment. For example, on a racetrack the finish line is the most valuable state; that is, it is the state which is most rewarding, and the states which are on the racetrack are more valuable than states that are off-track. Once we have determined which states are valuable, we can assign “rewards” to various states. For example, negative rewards for all states where the car’s position is off-track; a positive reward for completing a lap; a positive reward when the car beats its current best lap time; and so on.
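The reward-shaping scheme above can be sketched in a few lines of Python; the states and the numeric values chosen here are invented for illustration:

```python
# Toy illustration of assigning rewards to racetrack states and then
# ranking states by value. State names and numbers are invented.

ON_TRACK, OFF_TRACK, FINISH = "on_track", "off_track", "finish"

def reward(state: str, beat_best_lap: bool = False) -> float:
    """Reward shaping as described in the text."""
    if state == OFF_TRACK:
        return -1.0                                    # penalty for leaving the track
    if state == FINISH:
        return 10.0 + (5.0 if beat_best_lap else 0.0)  # lap bonus, extra for a best lap
    return 0.1                                         # small reward for staying on track

# The finish line is the most valuable state; off-track states the least.
states = [ON_TRACK, OFF_TRACK, FINISH]
ranked = sorted(states, key=reward, reverse=True)
print(ranked)  # ['finish', 'on_track', 'off_track']
```

A learning algorithm would then prefer actions that steer the car toward the high-reward states and away from the negatively rewarded ones.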


NYDFS Proposed Cybersecurity Regulation for Financial Services Companies

The goal of the Proposed Regulation is to secure “Nonpublic Information” from misuse, disruption and unauthorized access, and as noted above, such information is defined broadly. It includes not only competitively sensitive information and intellectual property, but also numerous categories of information that a Covered Entity receives from or about consumers, including information considered nonpublic personal information under the GLBA Privacy Rule. ... When something goes wrong, the Covered Entity must report it to the Superintendent. Specifically, any attempt or attack “that has a reasonable likelihood of materially affecting the normal operation of the Covered Entity or that affects Nonpublic Information” must be reported to the Superintendent within 72 hours after the Covered Entity becomes aware of the event.


Big Data Processing with Apache Spark - Part 5: Spark ML Data Pipelines

Machine learning pipelines are used for the creation, tuning, and inspection of machine learning workflow programs. ML pipelines help us focus more on the big data requirements and machine learning tasks in our projects instead of spending time and effort on infrastructure and distributed computing. They also help us in the exploratory stages of machine learning problems, where we need to iterate over combinations of features and models. Machine Learning (ML) workflows often involve a sequence of processing and learning stages. A machine learning data pipeline is specified as a sequence of stages, where each stage is either a Transformer or an Estimator component. These stages are executed in order, and the input data is transformed as it passes through each stage in the pipeline.
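The Transformer/Estimator staging can be sketched independently of Spark itself: each stage either transforms data directly, or is first fit to the data to produce a transformer. This toy Python version only mirrors the shape of the pattern, not the actual Spark ML API:

```python
# Toy sketch of the Transformer/Estimator pipeline pattern.
# It mirrors the shape of Spark ML's Pipeline, not its real API.

class Lowercase:                      # a Transformer: data in, data out
    def transform(self, rows):
        return [r.lower() for r in rows]

class VocabEstimator:                 # an Estimator: fit() yields a Transformer
    def fit(self, rows):
        vocab = sorted({w for r in rows for w in r.split()})
        index = {w: i for i, w in enumerate(vocab)}
        class Indexer:
            def transform(self, rows):
                return [[index[w] for w in r.split()] for r in rows]
        return Indexer()

def run_pipeline(stages, rows):
    """Execute stages in order; fit Estimators first, then transform."""
    for stage in stages:
        if hasattr(stage, "fit"):     # Estimator: fit it to the current data
            stage = stage.fit(rows)
        rows = stage.transform(rows)  # every stage transforms the data
    return rows

out = run_pipeline([Lowercase(), VocabEstimator()], ["Spark ML", "ML rocks"])
print(out)  # [[2, 0], [0, 1]]
```

In real Spark ML, `Pipeline.fit()` performs exactly this fit-then-transform traversal over DataFrames, returning a fitted `PipelineModel`.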


How fintech startups can disrupt the financial services industry

Successful fintech startups will embrace “co-opetition” and find ways to engage with the existing ecosystem of established players. For example, PayPal partners with Wells Fargo for merchant acquisition. Some business lending platforms enable banks to participate as credit providers on their platforms. Conversely, some banks partner with P2P lending platforms to provide credit to those borrowers who would ... Fintech startups have been flying under the regulatory radar so far. However, that may change in the near future. Regulatory tolerance for lapses on issues such as know-your-customer compliance and credit-related disparate impact will be low. The experience of the microfinance industry in many developing countries in the past is a good indicator of the high impact of regulation on a previously unregulated industry.



Quote for the day:


"It is better to be defeated standing for a high principle than to run by committing subterfuge." -- Grover Cleveland


September 24, 2016

Implementing DevOps starts with rethinking deeply-rooted processes

DevOps expands enterprise agile from product management and development all the way to IT operations. We’ve done enterprise agile. We now create apps in a way that focuses on the client and maximizes throughput and quality. But that hits a barrier when you get to traditional IT operations. The changes that will be taking place over the next five years create the ability to take enterprise agile into IT operations. It means improved platforms, improved automation, improved collaboration across development and ops. It means a constant flow of value into production. ... Rather than throwing large releases over the wall to operations, how can we bring teams together to identify tooling, processes, and practices that can be deployed to automate provisioning and production deployment fully? This significantly improves time to market by increasing throughput and quality.


I got 99 data stores and integrating them ain't fun

The concept has been around for a while and is used by solutions like Oracle Big Data. Its biggest issues revolve around having to develop and/or rely on custom solutions for communication and data modeling, making it hard to scale beyond point-to-point integration. Could these issues be addressed? Data integration relies on mappings between a mediated schema and the schemata of the original sources, and on transforming queries to match each original source's schema. Mediated schemata don't have to be developed from scratch -- they can be readily reused from a pool of curated Linked Data vocabularies. Vocabularies can be mixed/matched and extended/modified to suit custom needs, balancing reuse and adaptability.
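Schema mediation can be illustrated with a tiny mapping layer: a query against the shared, mediated field name is rewritten into each source's own field before being executed. All of the source and field names here are invented:

```python
# Toy illustration of querying through a mediated schema: one shared
# field name is mapped onto each source's local schema. Names invented.

SOURCES = {
    "crm":     {"data": [{"cust_name": "Ada"}], "mapping": {"name": "cust_name"}},
    "billing": {"data": [{"client": "Bob"}],    "mapping": {"name": "client"}},
}

def query_mediated(field):
    """Rewrite a mediated-schema field into each source's schema and gather results."""
    results = []
    for source in SOURCES.values():
        local_field = source["mapping"][field]     # translate the field name
        results += [row[local_field] for row in source["data"]]
    return results

print(query_mediated("name"))  # ['Ada', 'Bob']
```

Reusing a curated Linked Data vocabulary amounts to not inventing the mediated field names (`name` here) yourself, but borrowing them from a shared, well-defined vocabulary.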


The next target for phishing and fraud: ChatOps

Like many cloud platforms, chat tools allow external organizations to leverage internal APIs to extend functionality, ranging from scheduling assistants to travel booking tools to various engineering and product management systems. Overall, this extensibility represents a core strength of these systems. From a security perspective, however, it can also represent data exfiltration opportunities that must be addressed. First, not every third-party company is a good steward of the data it has access to; corporate policies for vendor review and acceptable use should apply to chat programs in the same way that they do for any system. As with the GSA example, relying on users to understand the technological limitations and risks of connecting technologies is not a strong strategy.


Why Fintech has made finance courses obsolete

Today, it’s a lot more complicated because we don’t know what will be the jobs of the next 10 or 20 years. So it’s a lot harder to be passive, and I think you have to be a lot more active. As a piece of advice, if I were 20, there are two things I would do. The first thing, from an education standpoint: you have to learn something, but you have to learn something you like. Just because you want to learn fintech doesn’t mean you have to code. So you really have to learn what you like on the education standpoint. And the second thing, I think, is about the mindset. It means avoiding being passive and being very active; for me, the best quality to have today is being able to think like an entrepreneur. You don’t necessarily need to be an entrepreneur, you may work for a big company, but you have to think like an entrepreneur.


Computers could develop consciousness and may need 'human' rights, says Oxford professor

Advances in artificial intelligence could lead to computers and smartphones developing consciousness and they may need to be given ‘human’ rights, an expert has claimed. Marcus du Sautoy, who took over from Richard Dawkins as Professor for the Public Understanding of Science at Oxford University in 2008, said it was now possible to measure consciousness and, in the future, technology could be deemed to be ‘alive.’ Most scientists believe that computers are close to getting to a point where they begin to develop their own intelligence and no longer need to be programmed, an event dubbed the ‘technological singularity.’


Why CMOs need to care about security like CIOs

While some marketers will view consumer grade file and sync platforms such as Dropbox or WeTransfer as a swift business panacea, the risk that these platforms open up for data breaches with uncontrolled sharing are high. Today, the ‘data perimeter’ – the boundary that safeguards an organization’s sensitive data – has shifted considerably. This is a result of a more mobilized workforce and greater collaboration with external partners. In the past, when most workers only accessed company information from within the four walls of the business and data was saved on shared drives from PCs located in the enterprise, the perimeter was the firewall. Since the advent of cloud computing, this has changed. In today’s connected world, the data perimeter needs to reside within individual documents, in addition to within the IT infrastructure.


Three Industrial Internet of Things (IIoT) myths that need busting

Unlike consumer markets where standardisation - formal or by market dominance - is key to success, IIoT standardisation won’t be a concern for decades. Sure, there are multiple emerging standardisation initiatives in IIoT and yes, it’s not yet possible to know which will grow or be marginalised. But it doesn’t matter. Unlike consumer markets where new standards for, say, NFC chips in smartphones can roll out and reach near-full market presence in the few years it takes for people to replace their phones, industries are run on equipment that is anything from years to several decades old. This equipment has been provided by tens, or hundreds, of different suppliers.


It’s time for drivers to learn new skills

Experts predict that 6% of all jobs in the US will be gone by 2021 due to automation. The former CEO of McDonald’s sees replacing the whole company’s restaurant workers as a simple question of economics. From software and legal help to sports reports and parcel delivery, there are few jobs that won’t see some sort of reduction in the workforce. But the world’s drivers could be most at risk. Despite continued fear from the public, the money men at taxi, logistics, and delivery companies will have no such fear about deploying autonomous vehicles on the road. No wages, greater fuel efficiency, no worries about shift work or rest stops. It might be cold, but in terms of business sense it’s hard logic to argue with.


How Li-Fi Will Disrupt Data Centres’ ‘Very Ugly Radio Environment’

“Most of that is using fiber optic cables and not free space transmission (which Li-Fi seems to be). The total capacity in a data centre far exceeds anything that could be done by a shared system (the same is true of radio versus use of copper cables).” He continued to say that there is also a need for shared management communications within the data centre (things like DCIM, where someone wants “out of band” communications with the hardware, for example). Giving an example of recently carried out research around the use of Li-Fi in the data centre, Christy mentioned Microsoft’s innovative work, where the tech giant complemented the “wired” network with a broadcast network (in the data centre) that could be implemented either with radio or with light transmission bounced off the ceiling.


Industrial IoT is inching toward a consensus on security

Immature security is the biggest thing delaying adoption of industrial IoT, said Jesus Molina, co-chair of IIC’s security working group, in an interview. Components commonly used in enterprise IT security, like identity and root of trust, don't really exist yet in IoT, he said. There are several components to making anything in IoT trustworthy, the framework says: safety, reliability, resilience, security and privacy. These issues come up because industrial IoT connects so many components, including things like sensors and actuators at the edge of an enterprise, that didn’t exist or weren’t connected to the internet up until now. Those edge connections can open up dangerous vulnerabilities, because they’re often designed to carry some of the most sensitive information in an organization.



Quote for the day:


"He who rejects change is the architect of decay." -- Harold Wilson


September 23, 2016

Infrastructure as code: What does it mean and why does it matter?

Code forms the backbone of this approach, giving rise to the term infrastructure as code (IaC), which, in simple terms, means code that helps in provisioning systems out onto an IT platform.  Today, IaC has grown to be full-function and highly flexible, and there are several variants to consider, including declarative, imperative and intelligent IaC. The declarative approach creates a required state and adapts the target infrastructure to meet those conditions, while the imperative version creates a target environment based on hard definitions set out within the script. The intelligent state, meanwhile, takes into account other pre-existing workloads within the target environment, and reports back to a system administrator about any problems it encounters.
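The declarative-versus-imperative distinction can be sketched as follows: a declarative tool diffs a declared desired state against what exists and derives only the changes needed, while an imperative script simply executes its hard-coded steps. The resource names in this minimal reconciler sketch are invented:

```python
# Sketch of the declarative IaC approach: declare a desired state and
# reconcile the live environment toward it. Resource names are invented.

desired = {"web-1": "running", "web-2": "running", "db-1": "running"}
current = {"web-1": "running", "old-vm": "running"}

def reconcile(desired, current):
    """Return the actions needed to bring the current state to the desired state."""
    actions = []
    for name, state in desired.items():
        if current.get(name) != state:
            actions.append(("create", name))   # missing or in the wrong state
    for name in current:
        if name not in desired:
            actions.append(("destroy", name))  # not declared: remove it
    return actions

print(reconcile(desired, current))
# [('create', 'web-2'), ('create', 'db-1'), ('destroy', 'old-vm')]
```

An "intelligent" variant, as described above, would additionally inspect `current` for pre-existing workloads it should not touch and report conflicts to an administrator instead of blindly destroying them.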


Open source technology gains steam in data center, but challenges loom

When deploying or running open source technology, the lack of professional support can leave IT scrambling. Even after combing through search engine results and discussion boards, admins still might not have an answer for an urgent question. Professional support is lacking with open source tools, and although some vendors offer support services, it often comes at a cost. When a primary driver to switch to open source is the financial aspect, spending money on the necessary support can create a dilemma. Some larger companies have the resources -- both from a financial and staffing standpoint -- to support open source hardware and software in the data center, but smaller organizations often struggle to do so.


Can Armies Of Interns Close The Cybersecurity Skills Gap?

Since cybersecurity is a relatively new field, professionals in the sector tend to pick up expertise on the job. It's only more recently that universities have started seriously ramping up programs. But BullGuard finds that's been happening internationally, not just in the U.S., so it's making moves to tap into those talent pipelines pretty much as soon as they're constructed. With its new Romania-based internship program, Lipman explains, "We took computer science students with cyberexperience in their college studies, and put them into our more innovative projects over the summer. It’s been a real win-win. We get access to new blood [and] fresh thinking, the interns get valuable real-world experience, and we build a relationship with the university." Establishing this ability to "hire straight out of college,"


CQRS for Enterprise Web Development: What's in it for Business?

The CQRS pattern is widely acclaimed by advocates of Domain Driven Design. The approach emphasizes solving business problems first during the implementation of an application. It centers on thorough elaboration of a business domain and the context within which it will function. The ability to focus on the business first, rather than on technical issues, and to work out all the nuances pertinent to a specific domain is achieved through the use of the Ubiquitous Language – a single language understood by the implementation team, business analysts, domain experts and other parties involved. The language helps to share the effort among all team members – business and technical – who define and agree on the use of common business objects to describe the solution’s domain model and a certain context within such a model.
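Stripped to its core, CQRS separates the write model (commands that mutate state) from the read model (queries that never do). A minimal sketch with an invented "order" domain:

```python
# Minimal CQRS sketch: commands mutate the write model, while queries
# read from a separate, denormalized read model. The domain is invented.

write_model = {}          # authoritative state, touched only by commands
read_model = []           # denormalized view, consumed only by queries

def handle_place_order(order_id, item):      # command side: mutates state
    write_model[order_id] = {"item": item, "status": "placed"}
    # Project the change into the read model for cheap queries.
    read_model.append(f"{order_id}: {item} (placed)")

def query_order_list():                      # query side: no mutation
    return list(read_model)

handle_place_order("o-1", "keyboard")
handle_place_order("o-2", "mouse")
print(query_order_list())
```

In a full implementation the projection from write model to read model is usually asynchronous (often event-driven), which is what lets each side be scaled and modeled independently.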


The changing data protection paradigm

The amount of new data available is staggering. As the Harvard Business Review aptly put it, "More data cross the internet every second than were stored in the entire internet just 20 years ago." This data has varying degrees of value and sensitivity, and resides on a variety of systems, including endpoints, removable media, local servers, cloud servers, and cloud-based services like Box and Dropbox. This growth and spread of data has quickly exceeded the ability of most companies to keep track of it, let alone protect it. This massive influx of data, spread out among various locations, has naturally brought with it increasing security exposures, leading to an almost daily data breach crisis.


NIST launches self-assessment tool for cybersecurity

It's designed to walk organizations through the process of figuring out "how to integrate cybersecurity risk management ... into larger enterprise business practices and processes," Matthew Barrett explained to FedScoop. Barrett is the program manager for the NIST Cybersecurity Framework — a document that catalogues the five areas of cybersecurity every company needs to know: identify, protect, detect, respond and recover. ... "The self-assessment criteria are basic enough that they could apply to organizations of any size," said Barrett. But critics aren't so sure. Larry Clinton, founder and CEO of the alliance, called the excellence builder "a pretty sophisticated tool," but added that meant it was really most useful to larger enterprises.


Ideas for Filling the Cybersecurity Skills Gap

Studies over the years show the struggle in building an IT security staff. For example, a GAO survey earlier this year of federal agencies' CISOs reveals their difficulties in recruiting, hiring and retaining security personnel. Wilshusen says the problem of maintaining a sufficient security staff makes it more challenging for agencies to effectively carry out their responsibilities. In building the federal government's cybersecurity workforce, Pritzker suggests the commission consider recommending a centralized system to recruit, train and place federal cybersecurity personnel as well as creating specialized pay scales to compete with the private sector. "We need to rethink recruitment with bold ideas like debt forgiveness for graduates of certified programs, tuition-free community college in return for federal service and cybersecurity apprenticeships within civilian agencies," the Commerce secretary says.


This Is how you stop ignoring the employee voice

In case you forgot, your employees are human. They are all living, breathing, feeling beings who deserve a bit of human interaction. Take the time to meet regularly and face-to-face with your employees. This not only gives you and your team members a chance to catch up on their performance, but also allows employees to share opinions or issues they are facing. Airing those grievances face-to-face lets employees see their manager’s reaction, as well as have an immediate discussion about what can and will be done. Now you might be thinking, “But an email thread is sooo much easier!” It’s also lazier. And might be negatively affecting employee engagement.


Serverless Architectures: The Evolution of Cloud Computing

Serverless architectures are a natural extension of microservices. Similar to microservices, serverless architecture applications are broken down into specific core components. While microservices may group similar functionality into one service, serverless applications delineate functionality into finer grained components. Custom code is developed and executed as isolated, autonomous, granular functions that run in a stateless compute service. ... For a serverless architecture, the “User” service would be separated into more granular functions. In Figure 2, each API endpoint corresponds to a specific function and file. When a “create user” request is initiated by the client, the entire codebase of the “User” service does not have to run; instead only create_user.js will execute.
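The per-endpoint granularity described for create_user.js can be sketched in Python: the function handles exactly one endpoint and keeps no state between invocations. The event and response shapes here are invented illustrations, loosely modeled on cloud function handlers:

```python
# Sketch of one fine-grained serverless function per endpoint.
# The event/response shapes are invented illustrations.

import json

def create_user(event, context=None):
    """Handles only the "create user" endpoint; stateless per invocation."""
    body = json.loads(event["body"])
    if "name" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "name required"})}
    # In a real deployment this would write to a managed data store.
    user = {"id": "u-123", "name": body["name"]}
    return {"statusCode": 201, "body": json.dumps(user)}

resp = create_user({"body": json.dumps({"name": "Ada"})})
print(resp["statusCode"])  # 201
```

Because only this one function is loaded and executed per request, the rest of the "User" service's codebase never runs, which is exactly the cost and cold-start advantage the excerpt describes.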


Why Red Hat is misunderstood amid public cloud worries

Any customer that bets on a cloud stack and uses proprietary APIs is going to have some form of lock-in. That's why OpenStack is such a popular movement. Now let's take those nuances back to Red Hat. "What we're seeing is that large customers see value in running everywhere," said Whitehurst. "These customers want a standard operating environment and want to take Linux with them as they go cloud." Worrywarts about Red Hat will argue that a move to the public cloud means that AWS will get the Linux business. Not necessarily. "As more goes to the public cloud the more relevant we get," Whitehurst argued. "If you are moving to Amazon you have to architect it so you're not locked in. Large enterprises feel burned out about being locked in."



Quote for the day:


"Not all of us can do great things. But we can do small things with great love." -- Mother Teresa


September 22, 2016

Over 6,000 vulnerabilities went unassigned by MITRE's CVE project in 2015

Why does MITRE not have assignments for vulnerabilities identified via other sources? Why haven't the CNAs shared their own disclosures with MITRE so that CVE can reflect the information, instead of leaving entries in RESERVED status, which shows nothing? Why aren’t CNAs assigning IDs to all of the vulnerabilities they disclosed, since some of the unassigned vulnerabilities are in their products? VulnDB shows 14,914 vulnerabilities disclosed in 2015. Within that set, only 8,558 vulnerabilities have CVE-IDs assigned to them. That leaves 6,356 vulnerabilities with no CVE-ID, and likely no representation in a majority of security products. ... While these numbers are bad, what's worse is that the industry has already felt the impact of an attack against a vulnerability that wasn't assigned a CVE-ID.


EastWest Institute Launches Cybersecurity Guide for Technology Buyers

“As cybersecurity vulnerabilities continue to increase, every corporation and government needs guidance to better understand the impact of their purchasing decisions on the security and integrity of their enterprises,” said Steve Nunn, CEO and President, The Open Group. “Every organization should be questioning their suppliers concerning risk management, product development, cyber and supply chain security and best practices. This Buyers Guide supports conformance with international standards and, where appropriate, process-based certification programs that help answer some of these critical questions.”


Lockdown! Harden Windows 10 for maximum security

Windows 10 also introduces Device Guard, technology that flips traditional antivirus on its head. Device Guard locks down Windows 10 devices, relying on whitelists to let only trusted applications be installed. Programs aren’t allowed to run unless they are determined safe by checking the file’s cryptographic signature, which ensures all unsigned applications and malware cannot execute. Device Guard relies on Microsoft’s own Hyper-V virtualization technology to store its whitelists in a shielded virtual machine that system administrators can’t access or tamper with. To take advantage of Device Guard, machines must run Windows 10 Enterprise or Education and support TPM, hardware CPU virtualization, and I/O virtualization. Device Guard relies on Windows hardening such as Secure Boot.
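The whitelisting model can be illustrated with a simplified hash-based allowlist: execution is permitted only when a program's digest appears on the trusted list. Real Device Guard verifies cryptographic signatures against certificate chains rather than bare content hashes, so treat this only as a sketch of the default-deny idea:

```python
# Simplified illustration of execute-only-if-whitelisted. Real code
# integrity checks signatures; this toy uses content hashes instead.

import hashlib

TRUSTED_HASHES = set()

def trust(program_bytes: bytes):
    """Add a vetted program's digest to the whitelist."""
    TRUSTED_HASHES.add(hashlib.sha256(program_bytes).hexdigest())

def may_execute(program_bytes: bytes) -> bool:
    """Default-deny: only programs on the whitelist may run."""
    return hashlib.sha256(program_bytes).hexdigest() in TRUSTED_HASHES

trust(b"signed-and-reviewed-app")
print(may_execute(b"signed-and-reviewed-app"))  # True
print(may_execute(b"unknown-malware"))          # False
```

Flipping the default from "run unless known-bad" (antivirus) to "deny unless known-good" (whitelisting) is the inversion the excerpt describes.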


What do IT administrator skills mean now?

The role of the IT administrator will definitely need to change as data centers hybridize across multiple types of private and public clouds, stacks of infrastructure converge and hyper-converge, and systems management develops sentience. Of course, change is inevitable. But how can old-school IT administrators stay current and continue providing mastery-level value to their organizations? I'd recommend paying attention to current trends and emerging capabilities. Become an expert in how the organization can best use those trends. ... The future of IT is about creating higher-level value individually while leveraging core expertise widely -- developing the deepest insights, but sharing it as widely as needed to get an optimized return on the IT investment that businesses make.


IBM says: ‘Swift is now ready for the enterprise’

With Swift on the Cloud, enterprises will benefit from faster back-end API performance, safer and more reliable transaction and integration support, and the ability to re-purpose Swift developer skills on the client and server side. This integration delivers tangible benefits to enterprise IT. City Furniture was building an app to handle clearance furniture. They had intended to build their front-end apps in Swift, but were able to work with early versions of the tools IBM introduced today to build the back-end code in the same language. “They were able to build that in an incredibly short time, a few weeks,” he said. City Furniture is a perfect example of the kind of small, nimble development teams that will underpin the future of enterprise IT. “They had one developer and we helped them a bit. That one developer was also able to contribute to the project


9 Ways To Ensure Cloud Security

Whether you’ve migrated some or all of your infrastructure to the cloud, or are still considering the move, you should be thinking about security. Too often, organizations assume a certain level of protection from a cloud service provider and don’t take steps to ensure applications and data are just as safe as those housed in the data center. The sheer range of cloud technology has generated an array of new security challenges. From reconciling security policies across hybrid environments to keeping a wary eye on cloud co-tenants, there is no shortage of concerns. An increasingly complex attack landscape only complicates matters and requires security systems that are vigilant and able to adapt. Here are nine tips to consider before, during, and after a cloud migration to stay ahead of the curve when evaluating security solutions for your cloud service.


Cyber Security Threat Detection – The Case for Automation

The good news is that advances in threat detection technology have significantly improved the enterprise’s ability to detect and stop these threats and prevent extensive damage. The challenge, however, is that many of these technologies demand an army of human security analysts to interpret threat indicators and determine the appropriate course of action, including elimination and clean up. With hundreds, if not thousands, of varying levels of threat flags per day, this task is like holding back the tide; it is nearly impossible for security teams to keep up with the flow of information and still perform other ongoing responsibilities in prevention and analysis. Not surprisingly given their frequency, many of these alerts are often ignored.


Taking Risks To Manage Risk: The Life Of The Modern IT Security Executive

Risk isn’t something that many IT security professionals are comfortable with. After all, they’re often employed to reduce the risk of attacks on corporate IT. ... Doing things differently often comes with the risk of failure, which can have negative consequences to a company’s IT security. But the IT security space is dynamic; new technologies, solutions and strategies come out regularly and CISOs need to keep pace with these developments. “The biggest risk at the moment is doing nothing — you’re at risk of becoming irrelevant,” CSIRO CISO and lead architect Angus Vickery said at SINET61. “You have to do something to ensure you’re continually relevant because the horse will bolt without you anyway. “… Modern CISOs need to have an open mind.”


Security framework released for industrial Internet of Things

The security framework goes along with reference architecture, connectivity and other guides previously published by the consortium. This document separates security evaluation into endpoint, communications, monitoring and configuration building blocks, each with implementation best practices. It also breaks the industrial space down into three roles: component builders (who build hardware and software), system builders (better known to readers here as system integrators) and operational users. To ensure end-to-end security, the consortium notes industrial users must assess the level of trustworthiness of a complete system. As for the future, the concluding note in the framework points out that as the sheer volume of data required for managing devices increases, there’s a point where centralized security management ceases to be effective and efficient.


Five Strategies For Creating a Culture of Data Security

When data protection is prioritized and done well, it provides more disciplined operations, increased customer and stakeholder trust, and minimized risk. One of the best ways to protect company information is to create a corporate culture that views information security as a shared responsibility among all employees. This can be done by implementing regular and comprehensive training programs for all employees on the right way to manage, store and destroy physical and digital data. ... Experts suggest that employees may forget 50 percent of training information within one hour of a presentation, 70 percent within 24 hours and an average of 90 percent within a week. When you consider this, it is clear that training once a year or on an ad-hoc basis is insufficient to ensure valuable customer, employee and business data is being protected.




Quote for the day:


"Relative to all the other risks companies face, the cyber risks often aren't as big a deal as we think. It may be bad for you if you are the victim, but it doesn't change the behavior or strategy of a company." -- Sasha Romanosky


September 21, 2016

Five Social Engineering Scams Employees Still Fall For

“Most people are not going to look really closely to know where that email came from, and they click on it and their machine may be taken over by somebody, or infected,” says Ronald Nutter, online security expert and author of The Hackers Are Coming, How to Safely Surf the Internet. “Especially when you’re exchanging files with subcontractors or partners on a project, you really should be using a secure file transfer system so you know where the file came from and that it’s been vetted.” He also cautions recipients to be wary of any file that asks the user to enable macros, which can lead to a system takeover.


How flexible should your infosec model be?

How often to adopt infosec policy changes is a conundrum. Companies need to come up with a way to remain flexible, to ensure that their policies and procedures reflect the current threat landscape, yet they can't hand down so many new rules and restrictions that they frustrate users and inadvertently compel them to consider bypassing corporate rules, explains Kelley Mak, an analyst at Forrester Research. At the same time, companies have to strike a balance between using firefighting tactics to address the most current threats and treating information security policy as a holistic strategy, Mak says. "It's not as simple as taking the data and making a new policy, because you have to make sure information workers aren't upset," he says. "The more restrictions you put in place, the more likely someone is to go around it."


Cybercrime Inc: How hacking gangs are modeling themselves on big business

Like the legitimate software market, cybercrime is now a huge economy in its own right, with people with a range of skillsets working together towards one goal: making money with illicit hacking schemes, malware, ransomware, and more. It's essentially an extension of 'real world' crime into cyberspace, and it's come a long way in recent years as groups have become bigger, more specialized, and more professional. "There's been a substantial amount of improvement and innovation in the way attackers go after networks and, as cybercrime has professionalized, you've seen individuals develop a particular set of skills which fit into a broader network," says Gleicher, now head of cybersecurity strategy at Illumio.


Picking up the pace: The intersection of strategy and agility

Organizational agility, not to be confused with the Agile methodology, is the ability to quickly identify and execute initiatives for opportunities and risks that align with overall strategy. This means that organizations must not only stay aware of changes in their business environments, but also be flexible enough to change direction and implement new initiatives quickly, both to avoid risks and to achieve competitive advantages. APQC and Strategic and Competitive Intelligence Professionals (SCIP) conducted a survey to look at organizational agility and understand what role strategy has in helping organizations be more agile. To that end, the survey investigated organizations’ agility, strategic planning, information assessment, and implementation practices.


Roundtable: What Experts Are Doing to Protect Against Ransomware

What’s different is that your user population needs to know what to do if a ransom message appears on their screen. Do they power off, disconnect from the network or do both? Your user community has to know exactly what to do. By the way, the right answer is to disconnect from the network and not power off—rely instead on whatever mechanism you have to trigger an incident response. Do not power off. So the users have to know that. Assuming that you have the basic hygiene—the incident response plans, the remediation, the patching, the hardening, the configurations—in place, then the only other additional consideration is that if you don’t have a fast, automatic way of detecting and responding to zero-day malware—either at the network level or at the end point level—you need to get one.


The Internet of Things, cyber-security and the role of the CIO

Basically we are inexperienced in creating large platforms with security in mind. This inexperience in deploying mass networks in a secure way could create a recipe for major breaches and security issues. The IoT is a relatively greenfield area in IT. It should offer the chance to design and architect solutions with security integrated right from the start, rather than an additional feature further down the road. Whilst CIOs need to be mindful of this issue for future planning, there is also the opportunity to make sure vendors are building this security into any IT expenditure that the organisation plans to make. Existing security controls may well be able to address these new concerns but they need to be implemented in an agile and effective way to enable them to adapt to the new attack vectors.


Navigating The Muddy Waters Of Enterprise Infosec

Many companies today hope to avoid similar high-profile wakeup calls. After years of news about disastrous breaches, information security has finally gotten the attention of upper management. Two-thirds of 287 U.S. respondents to a survey conducted by CSO, CIO and Computerworld said that senior business executives at their organizations are focusing more attention on infosec than they were in the past. And most of the respondents said they expect that focus to continue. Yet IT leaders still face challenges when it comes to aligning security goals with the needs of business, including justifying costs, defining risks, and clarifying roles and responsibilities.


Artificial intelligence, APIs and the transformation of computer science

Like yesterday’s code libraries, you could try to build A.I. platforms yourself -- if you had a few years and a dozen data scientists to throw at the problem. Or you can access A.I. engines like IBM’s Watson or Google’s TensorFlow “as-a-service,” taking advantage of the planet’s most advanced, fundamental CS work via an API call. When one looks at the world of software in this way, the choice for most companies today is straightforward: spend years of effort and millions of dollars in expense duplicating extremely important -- but ultimately commodity, especially once it’s open-sourced -- computer science work, or instead focus on leveraging that work to develop and improve their own products and intellectual property. For most businesses, the choice is simple.


In a world of free operating systems, can Windows 10 survive?

We can all pretty much agree that Windows has some staying power. That said, when I asked our resident Windows soothsayer Ed Bott about actual numbers of users, he told me, "Given that PC sales are flat or down in recent years and are probably close to the replacement rate, it's likely that the very large Windows installed base is shrinking slowly." The operative word here isn't "shrinking," it's "slowly." There are millions of users out there who have good reason to stick with Windows. Many of them will continue using it because the learning curve for a different operating system is either too much work, or just simply unnecessary. Others will stay with it because Chromebooks, tablets, and other "appliance-like" machines just don't have enough power and flexibility.


How HR and IT departments can join forces to bolster security strategies

Working with IT, HR should establish processes to manage access rights to sensitive data – ensuring that appropriate controls are in place – and preventing employees from accessing data that they don’t need. HR can also support IT in identifying gaps in terms of departments or individuals, like contractors or temporary staff, with permissions that have not been withdrawn or privileges that may need to be re-defined. They can implement processes and technology for managing access rights and ensure that these are regularly audited to close any security gaps. Full co-operation between HR and IT is essential in projects of strategic importance such as IAM (Identity Access Management) deployments. A lack of such co-operation is a common pitfall: without it there can be misunderstandings, or at worst, projects can unravel entirely.



Quote for the day:


"Negativity will derail you from pursuing success, and like attracts like." -- Kathleen Elkins


September 20, 2016

What To Think About When Moving To The Cloud

In either a private or a public cloud, they need applications to behave a certain way. Unfortunately, it's not always possible to move legacy applications. A workaround that will require change over a long period, said Stern, is to put what they can into their private or public cloud until they are able to examine which applications are worth rewriting. Before making the move to the cloud, Alex Hamerstone, GRC practice lead at TrustedSec, said, "Settle on a definition of what the cloud is. It’s really just someone else’s computer. A computer that’s not yours. You should know why you are moving to the cloud. What are the advantages? Is it cost or that it is easier to maintain?"


Crypto backdoors will be nailed shut

Snowden’s revelations about a backdoor have undermined trust in large amounts of U.S.-made infrastructure, and have had lasting impact. The good news is that new thinking and research about encryption is emerging, with new techniques that can nail shut any attempted backdoors. Alex Russell is a professor at the University of Connecticut, and he has been focusing on the problem of how to ensure that a randomly generated number used to generate encryption keys is in fact random. Russell and his team have shown that by taking the output of the random number generator and running it through a hash function such as SHA-256, a new and truly random number is created that can reliably be used to generate encryption keys.
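The idea of conditioning raw RNG output through a hash before using it as key material can be sketched with standard command-line tools. This is only an illustration of the general technique, not Russell's published construction; the byte counts and tool choices here are assumptions.

```shell
# Illustrative sketch: hash raw RNG output through SHA-256 before
# treating it as key material. Sizes and tools are assumptions,
# not the construction from the research described above.
raw_entropy=$(head -c 64 /dev/urandom | od -An -tx1 | tr -d ' \n')
# Condition the raw bytes through SHA-256, yielding 256 bits of
# output that does not directly expose the generator's raw state.
key=$(printf '%s' "$raw_entropy" | sha256sum | awk '{print $1}')
echo "$key"
```

Even if the underlying generator were subtly biased or backdoored, an attacker would have to predict its full output, not just a few bits, to recover the hashed result.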


New path proposed for CPAs in cyber risk management

Evolution of technology and the sophistication of hackers have made cybersecurity one of the most important areas of risk management for businesses. More than 95% of CGMA designation holders participating in a 2015 survey said their companies are concerned with the threat of database breaches, distributed denial of service (DDoS) attacks, phishing scams, and other cyberattacks. ... The proposed frameworks represent an effort by the auditing profession and the AICPA to develop a common foundation for CPAs’ services in response to the growing market demand for information about the effectiveness of cybersecurity risk management programs. “Our primary objective is to propose a reporting framework through which organizations can communicate useful information regarding their cybersecurity risk management programs to stakeholders,” said Sue Coffey.


Looking for data loss in all the wrong places

Part of the problem may be that loss prevention tools can’t stand up to new theft targets, the most common of which are personally identifiable information and protected health data. Organizations may also unwittingly alert hackers to their soft targets. Activities that publicize a new or improved service may signal that the service is not yet well secured, the report stated. Besides new projects or products, hackers look for reorganizations and strategic planning activities. This supports the report’s finding that organizations aren’t monitoring data movement in the right places. For instance, only 37 percent of DPB survey respondents said they use endpoint monitoring on physical media, despite the fact that 40 percent of data losses involve some type of physical media.


Banks Are Turning To The Blockchain In A Big Way

Even central banks across the globe have shown their intent to adopt distributed ledger technology. The Dutch Central Bank has been actively experimenting with blockchain and developing the DNBcoin. In July 2016, the Bank of England issued a paper that discusses the ‘macroeconomics of central bank issued digital currencies’, which reveals some interesting findings. It says, “In a...model calibrated to match the pre-crisis United States, we find that CBDC issuance of 30% of GDP, against government bonds, could permanently raise GDP by as much as 3%, due to reductions in real interest rates, distorting taxes and monetary transaction costs.” Such findings showcase the potential that the blockchain holds for the financial machinery of a country.


Startup Mentality: What Makes It Good or Bad?

The build-measure-learn loop suggests that speed is a critical factor during successful product development. First, you build the product, then you test its usability (gathering user feedback and data), and lastly, you acquire critical insights. The insights you get will determine whether the product is ready for deployment or needs to be pivoted. The process is then repeated until it results in success. The lean startup method is a prime example of a good, productive mindset when approaching the problem of developing and marketing a product. Instead of being limited to your own company’s predictions and focusing only on early consumer adoption, the continuous monitoring of a product’s usability and tweaking of its features to suit users’ needs will ensure success upon completion.


Firmware - The New Cybersecurity Target

There are a few fundamental reasons why firmware can make a realistic target: No upgrade path for firmware: In contrast to software, firmware can be more difficult to update. Update policies may not exist; indeed, the ability to update may not even exist. Add to this the resiliency of these systems—literally devices that may sit around for decades. Changes in security requirements (e.g., updated encryption algorithms) may not be reflected in updated firmware. Even unsophisticated attack techniques are highly likely to work across outdated security mechanisms. Traditional methods don’t apply or can be side-stepped: No matter how many layers of security are built into the OS, ultimately a system relies on the underlying firmware to boot and interact with hardware.


Intel's New PC, IoT Chief Brings Fresh Ideas

Nobody inside Intel is coming anywhere near the kind-of-like fatalistic conclusions about where Moore's Law is. Intel has had a stellar track record in delivering node generation like clockwork. Maybe we've moved from a two-year to a two-and-a-half-year cadence, but we already see light at the end of the tunnel. We will continue to drive process technology and nobody is calling timeout on anything. We're working hard on 7-nanometer, we're talking about pathfinding for 5-nanometer. All of that is in the throes. We made a great announcement on Kaby Lake -- that's using an evolution of 14-nanometer transistor geometry that gave a substantially improved user experience compared to Skylake. We're going to continue to do more of that as we continue to drive process leadership.


Cyberspace, Terrorism and International Law

Counter-terrorism policies underscore the need for resilience—the ability to identify terrorist attacks, control the damage and recover. Cybersecurity policy also highlights resilience as critical, and resilience informs development of national computer incident or emergency response teams and cooperation among them. However, cooperation on cyber resilience before or after an incident happens without international legal obligations, which mirrors international law on terrorism. Apart from obligations on law enforcement cooperation, antiterrorism treaties do not include duties to provide assistance to parties attacked by terrorists. This state of affairs also reflects the lack of legal obligations on states to assist countries hit by natural disasters.


Ransomware's next target: Your car and your home

Unless there is clear separation between the engine control units and other systems, hackers could block out the entire car "so you're not even going to get out of your driveway unless you pay," says Samani. This could be a lucrative option for cybercriminals because, while people might be OK with losing some files if they don't pay the ransom, when it comes to a car, they're going to give in, he added. "Quite frankly, if you're sitting in your driveway in 2021 in a self-driving car, if you have to pay two Bitcoins to get to work, what are you going to do? Are you going to pay? Of course you will. If you've got a $60,000 connected car to drive you to work and you're being charged $200 to move? You'll pay," he says.



Quote for the day:


"We don't have a choice on whether we do social media, the question is how well we do it." -- @equalman


September 19, 2016

Building the data foundations for smart cities

Emergency services will need to be alerted and receive information as quickly as possible in order to help people affected and deal with any potential escalation. In addition, traffic will need to be intelligently redirected to avoid the affected area; communications networks will need to be flexible enough to direct capacity to where it’s needed; and information will need to be collated to inform citizens about the incident. So it’s essential that smart cities make use of cutting-edge data processing and analytics capabilities, and in particular, in-memory processing rather than higher-latency external storage. Typically, in-memory processing is expensive and limited by the amount of data that can be processed.


Cyber-security VCs are holding onto their cash – but that’s OK

“VCs are holding out for companies that are merging to offer more unified-security platforms.” He adds that early-stage companies that were funded in 2015 have since slipped below expectations, with their products quickly shown to be copies or obsolete, or simply with revenues that “were not up to expectations.” Jack Gold, principal analyst and founder at J. Gold Associates, agrees that VCs may have got swept away with market hype. “Here’s the problem… if I as a VC find a nice cool company with a new twist on security and I invest in them, there’s a chance I will find six other companies doing the same thing in the same marketplace. There is an over-abundance of companies trying to get a different bite of the same security meal.”


How ITSM laid the foundation for a cultural transformation

Solving the service management conundrum has allowed Oshkosh to turn its attention to other pressing matters. Schecklman is currently trying to consolidate 15 disparate general ledger systems, including some 20-year-old Mapics and J.D. Edwards ERP software, and operate them under a financial shared services model. Oshkosh is also improving cybersecurity to protect the company's intellectual property, including details about such new machines as the Joint Light Tactical Vehicle, which the Army is using to replace its Humvees. Creating a layered strategy for defending data is crucial because interest in hacking Oshkosh broadens after it wins contracts in foreign countries worldwide, Schecklman says.


CIO Insights: Bring data out of the dark

In this episode of CIO Insights, Martin describes how businesses can use a hybrid analytics architecture to compete effectively, and we discuss how doing data analytics on the cloud is a much safer bet today than ever before. Also, tune in to hear Martin outline the things that CIOs should keep in mind during their transition to the cloud. Don’t miss other episodes in this series of C-level interviews. Also, be sure to set aside time to hear IBM CEO Ginni Rometty, journalist and author Thomas Friedman, IBM Chief Data Officer Inderpal Bhandari and other engaging speakers live at IBM Insight at World of Watson 2016. Martin has a Ph.D. and M.A. in economics from Tufts University and a B.S. cum laude in mathematics from the University of Massachusetts Lowell.


The Cloud Is in for Bubble Trouble

Tech companies are beginning to position themselves in the IoT space, but the smart ones are doing so cautiously for one very important reason: the trend is extremely broad and Wall Street has yet to figure out who the winners will be. But that hasn’t stopped everyone from GE and IBM to Cisco and Intel from jumping on the bandwagon. Intel CEO Brian Krzanich recently made a big splash about the chip giant’s evolution from a PC company to one that “powers the cloud and billions of smart, connected computing devices,” meaning IoT. Unfortunately, the chip giant is not nearly as evolved as its PR would seem to indicate. For one thing, Intel’s cloud computing platform is little more than Xeon processors for servers. Nothing new there. And those billions of smart devices are powered mostly by ARM, which Softbank is acquiring for $32 billion, ...


Why augmented reality means augmented risk to networks

There are three important factors to consider. The first is your mobile device management (MDM) solution, since AR apps like Pokémon GO are focused on the smartphone market. Employee training and awareness is the second, since human error and carelessness are often a key vulnerability for cybercriminals to target. The third key factor in an AR risk mitigation strategy should be visibility of app traffic on your network. To protect against sensitive data being exposed, or malicious data being introduced, you need to ensure that you have comprehensive, real-time visibility into all your network traffic, all the time. A variety of tools and solutions exist that purport to offer such network visibility; what you are looking for is intelligent filtering and distribution, including across Layer 7 application flows and encrypted traffic, at line rate with zero packet loss.


Why your next storage solution may depend on blockchain

Blockchain has become an increasingly hot topic in recent months due to its ability to deliver distributed security. Its impact, however, has mostly evaded storage. The developers at Storj, an open source object store similar to AWS S3 or Microsoft Azure Blob Storage, aim to change that. Storj (pronounced like "storage") hopes to make object storage easier to use through intuitive tooling and documentation, a modern API, and an open source, try-before-you-buy approach. But really, much of the magic derives from blockchain. Think of Storj as a distributed cloud storage network, suitable for static content today but with aims to expand far beyond this in the future. This blockchain-based decentralization allows developers to store data in a secure, performant, and inexpensive way, spreading it across many nodes.


Architecture Is About Tradeoffs

First, any time you adopt a framework for building an application, you are inevitably going to spend time debugging and becoming expert in the framework. So while it is true that our team was able to focus primarily on just writing our business logic in the form of session beans and message-driven beans, it was also true that any time we had an edge case (long running processing, need for a sequence of events to occur in a certain order, complex data updates) we would run into issues with the way we were using the frameworks. This puts a strain on the subset of the team that is expert. Second, migrating to the new version of a Java Enterprise application server is made more complicated by the number of moving parts.


Unix tips: Making troubleshooting with lsof easier

Since lsof has such a huge collection of options, remembering which option to use for what sometimes makes the command hard to use as often or as effectively as you might like. So what we're doing today is looking at several ways to make the use of this very helpful tool a bit easier. We do that by creating useful aliases, by providing something of a "cheat sheet," and by deploying a number of lsof options in a script that makes educated guesses about what you're going after. Both of the aliases below will list whatever files are open on your behalf when you are logged in. I suspect that few sysadmins will want to type “showmyopenfiles.” It might be less of a problem to remember the lsof option or print out a cheat sheet. On the other hand, “showmine” would be somewhat ambiguous – my open files or my processes?
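The alias names above come from the article, but their definitions aren't shown in this excerpt, so the bodies below are plausible guesses rather than the author's actual script. A few cheat-sheet entries of the kind described are included as comments.

```shell
# Hypothetical definitions for the aliases discussed above -- the
# article's exact bodies aren't shown, so these are assumed.
alias showmyopenfiles='lsof -u "$USER"'   # every file open under your account
alias showmine='lsof -u "$USER"'          # shorter, but ambiguous: files or processes?

# Cheat-sheet style one-liners using standard lsof options:
#   lsof -i :22          # what is using TCP/UDP port 22
#   lsof -p 1234         # files opened by process 1234
#   lsof +D /var/log     # open files anywhere under /var/log
```

Putting lines like these in a shell startup file means the long option strings only have to be remembered once, which is exactly the point the article is making.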


Master data management gains ground in UK public sector

“Where we had blockages in getting data out of old legacy systems that were a bit clunky, we just concentrated on another dataset,” says Farina. He adds that if users had tried the system with just some of the data, they might have dismissed it: “I’ve worked on projects before where it’s been done in that phased approach, and users can just lose interest in it.” Camden has found many uses for the system, with more than 300 approved staff in 35 teams having access. The first type of work was data intelligence, which allows the council to produce reports from multiple parts of the organisation with greater ease, such as a government social care return that includes information on education.



Quote for the day:


"No man will make a great leader who wants to do it all himself, or to get all the credit for doing it." -- Andrew Carnegie


September 18, 2016

Telecom APIs could increase African innovation

This has opened up a great avenue for developers to bring innovation into the hands of every African. However, this can only be achieved by companies opening up their APIs to the ecosystem. “Mobile operators already play a central role in nurturing the development of innovative solutions in Africa,” the GSMA report said. “They have traditionally supported various initiatives to identify and develop new talent and solutions, including incubators, accelerators and competitions, mostly through funding and mentorship.” The report also highlighted the interest that mobile firms have had in the tech industry. MTN, Millicom and Orange have acquired equity stakes in Africa Internet Group, the organisation that owns ecommerce giants Jumia and Kaymu.


How smart materials will literally reshape the world around us

Here’s how it would work: An airplane component (like the wing) is made out of a composite material that has been coated with a thin layer of nanosensors. This coating serves as a “nervous system,” allowing the component to “sense” everything that is happening around it — pressure, temperature and so on. When the wing’s nervous system senses damage, it sends a signal to microspheres of uncured material within the nanocrystal coating. This signal instructs the microspheres to release their contents in the damaged area and then start curing, much like putting glue on a crack and letting it harden. Airbus is already doing important research in this area at the University of Bristol’s National Composites Centre, moving us closer to an aviation industry shaped by smart materials.


What’s New in the Economics of Cybersecurity?

Policymakers’ choices can influence cybersecurity in various sectors. In “Economic Impacts of Rules- versus Risk-Based Cybersecurity Regulations for Critical Infrastructure Providers,” Fabio Massacci and his colleagues address the pressing issue of finding an optimal way to alert operators of critical infrastructures about cybersecurity risk. In particular, they compare the US’s rules-based model to the EU’s risk-based approach. A proposed cybersecurity model for public policy in the presence of strategic attackers is calibrated to the National Grid, which operates in the UK and on the East Coast of the US. The model shows that, depending on the combination of incentives, operators will stop investing in risk assessment and care only about compliance, or vice versa.

Now or never - India CEO Outlook 2016

Technology serves both as a trigger as well as an enabler of innovation. CEOs expect that over the next three years, technology is likely to have a huge impact on their growth, next only to global economic factors. They agree that almost every function of their businesses is bound to be influenced, with key focus areas for technology adoption in the near term likely to revolve around customer centricity, efficiency enhancement and employee satisfaction. ... Integration of basic automated business processes with artificial intelligence and cognitive processes remains an important concern for nearly 92 per cent of the surveyed CEOs. Key underlying causes could be the fact that planning for technology in many organisations takes place in silos, rather than at a unified organisational level, paired with a lack of ability to identify the right technology to meet organisational needs.


Bank of England wants next payment system to be blockchain-ready

It's not just about linking up with external blockchains, though: The bank will, it said, also continue to explore the use of distributed ledgers in its own systems, including through its own startup accelerator, which will shortly begin selecting a second round of participants. The first round includes a security assessment service, BitSight; a data anonymization tool, Privitar -- and a blockchain demonstration platform developed by the bank and PwC to explore the possibilities of smart contracts. "The resilience characteristics of the distributed ledger in particular are potentially highly attractive from a financial stability perspective," the bank noted. But it pretty much ruled out the possibility that the new RTGS would be blockchain-based.


Will Fog Computing Hide the Clouds of the Internet of Things?

The OpenFog Reference Architecture is an architectural evolution from traditional closed systems and the burgeoning cloud-only models to an approach that emphasizes computation nearest to the edge of the network when dictated by business concerns or by the critical functional requirements of the system. The OpenFog Reference Architecture consists of putting micro data centers, or even small, purpose-built high-performance data analytics machines, in remote offices and locations in order to gain real-time insights from the data collected, or to promote data thinning at the edge by dramatically reducing the amount of data that needs to be transmitted to a central data center. Without having to move unnecessary data to a central data center, analytics at the edge can simplify and drastically speed analysis while also cutting costs.


The Top 5 Problems with Distributed Teams and How to Solve Them

It’s easy to put the problems in a list of 5. Putting the solutions in such a list is a lot harder. Every setup, company and situation is different. I sometimes have people coming to my training sessions expecting ‘cures’. The only thing I can deliver is a set of ideas, some of which may apply to their situation. It’s about turning a knob here, tweaking a bit there and adjusting step by step. Agile, iterative. ... By creating a positive team spirit and addressing the cultural differences, we avoid the trap of ‘us versus them’ and we create awareness about how each team member behaves given the cultural context. We also define actions to organize around the differences and benefit from the similarities. Implementing a structure or tool to share knowledge about the product or project we’re building helps team members understand what they’re working on.


Singapore's cut-off from the internet is not so crazy

There are good reasons for Singapore's big disconnection, since Asian countries suffer a huge number of targeted attacks on their internet infrastructure. Those attacks are increasingly sophisticated in terms of both the technology employed and the psychological profiling of targets. In fact, Singapore's decision is more a question of philosophy than IT security. Actually, there are two questions: 1. Is it possible to completely secure a system that's connected to the internet?; 2. If not, what are the potential consequences if such a system is compromised? The answer to the first question is a resounding no. No operating system is exploit-free. The same applies to any mail client or web browser. Vulnerabilities may not be widely known yet, but they exist and will be discovered.


Bad migration experiences leave IT bosses gun-shy

IT shops don't think about the web of dependencies and connections between what they are upgrading and other systems, says Arnold. They don't look at storage subsystem compatibility, app dependency and the true dependency of apps on servers. You might have to take down and replace several things all at the same time. But therein lies the challenge. Arnold spoke with one CIO who wanted to tackle the constant change in his environment but found there were too many variables in these migrations to make a move. "There is no one particular issue because everyone has a different experience. I've had people plan these out, thinking they knew how it would go, and had a storage subsystem fail on them," he says.


This Is Why Securing Your Business Is More Important Than Ever

While attackers do continue to target large enterprises more frequently, small businesses are proving to be an emerging gold mine as they store the same valuable information, but have fewer resources to defend themselves against threats. In our most recent survey, we found that despite the majority of small businesses reporting being concerned about cyber attacks, a third were not taking any proactive measures at all to mitigate cyber risks, and only 12 percent had a breach preparedness plan in place. ... Awareness, education, monitoring and response will all play a role in helping you safeguard your company information. There are a number of free, easily accessible resources, like the National Cyber Security Alliance and the Federal Communications Commission, for information on security best practices.



Quote for the day:


"Anticipation is the ultimate power. Losers react; leaders anticipate." -- Tony Robbins


September 17, 2016

What’s Wrong With Using Design Templates?

Contrary to templates, which are designed for a broad business category, custom designs are built by experts to meet specific business needs. A great website is more than stunning visuals and smart widgets. Custom designs are personalized at a conceptual level, delivering great user experience on desktop or mobile, and engaging users to follow your calls-to-action. Founded on the business brand, custom web design allows more control over creative elements, helping businesses to forge a meaningful connection with their audiences. With custom design comes customized support of a designer able to perfectly align the look of the website with specific business needs. If you're opting for a custom design, here are some questions you should ask your web designer.


How the startup world is bringing digital nomadism closer to reality

One of the trickiest things about travel is dealing with different time zones. WTB is a world clock converter and meeting scheduler that lets you schedule personal and professional events at a glance over multiple time zones. With a number of useful features, like Google Calendar integration, WTB is a great new tool for working away from home. WTB has not disclosed funding information. According to reports, growth took a dive around November 2015, but shot back up in December 2015, and has maintained steady growth through 2016. As remote work continues to grow and more people refuse to compromise lifestyle for professional success, the digital nomad lifestyle is now even more attainable. If you’ve ever dreamed of leaving the cubicle behind and hitting the road, these startups provide the insider help you need to make it happen.
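The core conversion a tool like WTB automates is rendering one meeting time in each participant's local zone. A minimal sketch using only the Python standard library (`zoneinfo`, available since Python 3.9); the cities chosen are illustrative, not anything WTB documents.

```python
# Convert a single UTC meeting time into each participant's local wall-clock time.
from datetime import datetime
from zoneinfo import ZoneInfo

def local_times(meeting_utc, zones):
    """Render one timezone-aware meeting time as HH:MM in each given zone."""
    return {z: meeting_utc.astimezone(ZoneInfo(z)).strftime("%H:%M") for z in zones}

meeting = datetime(2016, 9, 20, 14, 0, tzinfo=ZoneInfo("UTC"))
print(local_times(meeting, ["America/New_York", "Europe/Berlin", "Asia/Singapore"]))
```

Because `astimezone` consults the IANA time zone database, daylight-saving shifts are handled automatically, which is exactly the bookkeeping that makes manual scheduling across zones error-prone.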


ViaWest: How Cloud Computing Alters Data Center Design

Now, certain well-known data center customers — Leonard cites Akamai as one example — are moving from a 12-15 kW per rack power usage profile down to about 9 kW/rack. Service providers are capable of making such deliberate changes to their applications to enable this kind of energy efficiency. Suppose a hypothetical SP customer of this same data center is inspired by Akamai, re-architects its application, and lowers its power consumption. “Well, now they can’t use the power that’s in that space,” argues Leonard. “Creating space where power and cooling are irretrievably tied to the floor space that is being delivered on is a really bad idea. When the use of that floor space, power, and cooling changes over time — and there’s a dozen dimensions that can cause it to change — those data centers are rigid and inflexible in their ability to react to those changes.”


Reasoning About Software Quality Attributes

Attribute primitives provide building blocks for constructing architectures. However, they are building blocks with a focus on achieving quality attribute goals such as performance, reliability and modifiability goals. Quality attribute design primitives will be codified in a manner that illustrates how they contribute to the achievement of quality attributes. Therefore each attribute primitive will be described not only in terms of its constituent components and connectors, but also in terms of the qualitative and/or quantitative models that can be used to argue how it affects quality attributes. Consider an example: the client/server attribute primitive. This is a collaboration between the providers and users of a set of services. The attribute primitive separates one collection of responsibilities (the client's) from another (the server's).
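The separation of responsibilities the primitive describes can be sketched in a few lines: the server owns the service logic and data, the client owns presentation and requests, and the reference from client to server is the only coupling between the two. The class and method names are illustrative, not from the paper.

```python
# A minimal sketch of the client/server attribute primitive: two disjoint
# collections of responsibilities joined by a single connector.

class Server:
    """Provider responsibilities: owns the data and the service logic."""
    def __init__(self):
        self._inventory = {"widget": 3}

    def query(self, item):
        return self._inventory.get(item, 0)

class Client:
    """User responsibilities: formulates requests and presents results;
    contains no service logic of its own."""
    def __init__(self, server):
        self._server = server  # the connector: the client's only coupling

    def report(self, item):
        return f"{item}: {self._server.query(item)} in stock"

print(Client(Server()).report("widget"))
```

The modifiability argument the excerpt alludes to falls out of the structure: the server's internals can change freely as long as `query` keeps its contract, because the client touches nothing else.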


Code ownership and software quality

The owner of source code usually refers to the person who implemented the code. However, larger code artifacts, such as files, are usually composed by multiple engineers contributing to the entity over time through a series of changes. Frequently, the person with the highest contribution, in terms of lines written or code changes made, is defined as the code owner and takes responsibility for it. ... Weak ownership means distributing the responsibility for a particular part of software among multiple developers. We speculated that code without a primary owner might have no champion who will take responsibility to maintain and test the code. Without such code owners, knowledge about the inner workings and functionality of code might be limited and, once lost completely, might take time to recover. Over time, such files become more susceptible to bugs.
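The "highest contribution" metric the excerpt describes is easy to make concrete: given per-author line counts for a file (as produced by, say, `git blame`), the author with the largest share is the owner, and a low top share flags weak ownership. The 0.5 threshold below is an illustrative assumption, not a figure from the study.

```python
# A minimal sketch of the contribution-based ownership metric.

def ownership(contributions, weak_threshold=0.5):
    """Return (owner, share, is_weak) from a {author: lines_contributed} map."""
    total = sum(contributions.values())
    owner, lines = max(contributions.items(), key=lambda kv: kv[1])
    share = lines / total
    return owner, share, share < weak_threshold

# alice wrote 120 of 200 lines: a clear primary owner.
print(ownership({"alice": 120, "bob": 45, "carol": 35}))
```

When no author crosses the threshold, the file is the weakly owned case the researchers speculate is more bug-prone: responsibility is spread, and no single champion holds the knowledge.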


How to Keep Your Passwords, Financial & Personal Information Safe

Keeping your passwords, financial, and other personal information safe and protected from outside intruders has long been a priority of businesses, but it's increasingly critical for consumers and individuals to heed data protection advice and use sound practices to keep their sensitive personal information safe and secure. There's an abundance of information out there for consumers, families, and individuals on protecting passwords, adequately protecting desktop computers, laptops, and mobile devices from hackers, malware, and other threats, and best practices for using the Internet safely. But there's so much information that it's easy to get confused, particularly if you're not tech-savvy. We've compiled a list of 101 simple, straightforward best practices and tips for keeping your family's personal information private and protecting your devices from threats.


The end of password expiry

So, should we force users to change passwords, and if so, how often? It's not an easy question to answer and the industry seems divided in its opinion – for some, requiring people to change them often is bad, as it may encourage poor password choices and re-use of passwords on different sites, while others suggest it should be monthly or more for access to corporate applications and systems. In a recent Centrify survey, it was alarming to see how much password sharing between employees happens – often to enable a colleague to do work they can't usually do from their own account. Regular enforced password change would help ensure the person the password is being shared with would be unable to log in if they leave the company, though there would be a window of time when they still could.


Improving security, efficiency, and user experience in digital transformation

Humans can still be bugged or tricked into revealing their passwords. There is malware, or malicious software installed on computers; there is phishing, in which cyber crooks grab login, credit card, and other data in the guise of legitimate-seeming websites or apps; and there are even “zero day” attacks, in which hackers exploit overlooked software vulnerabilities. And of course, old-fashioned human attacks persist, including shoulder-surfing to observe users typing in their passwords, dumpster-diving to find discarded password information, impersonating authority figures to extract passwords from subordinates, discerning information about the individual from social media sources to change their password, and employees selling corporate passwords.


Enterprise-architecture – a changes report

In other words, at this stage – in the mid-2000s – all of this mainstream ‘enterprise’-architecture was still strictly IT-centric. For example, whilst the FEAF PRM briefly mentions ‘Human Capital’ and ‘Other Fixed Assets’, there’s nothing in that specification that actually describes them in any significant detail. For anything more than that, we had to turn to industry-specific frameworks such as eTOM or SCOR, or else roll our own. And where ‘enterprise’ is mentioned at all in the mainstream ‘EA’-frameworks, there’s also an assumption that ‘the organisation’ and ‘the enterprise’ are one and the same. By the time of the launch of TOGAF 9, in 2009, this becomes more overt: there’s an implication that whilst there must be some clients out there somewhere, they’re essentially ‘out-of-scope’ for the architecture, and that the maximum reach we’d need to worry about as enterprise-architects is an ‘inside-out’ view of the business-world.


Dear CIO/CFO: “What is Enterprise Architecture?”

First, Enterprise Architects – real Enterprise Architects – are not “smarter” than everyone else. They do, however, have a very specific skill set and the experience to understand that a systems view (People, Process, and Technology) is more important than a point-solution perspective. I accept that what they produce and provide can be of profound value to an organization that wishes to understand itself and how the organization might improve “efficiencies”. They can provide an incredible amount of corporate intelligence to you, but this is nothing that a CIO or CFO should fear. Real Enterprise Architects are your personal “007”. Remember, James Bond still works under “M”, right? Enterprise Architecture is not, as many CIOs and CFOs seem to constantly tell me, about “pretty network diagrams”. No, that is “Network Architecture”.



Quote for the day:


"Having more data does not always give you the power to make better decisions." -- Jeffrey Fry