
Daily Tech Digest - August 03, 2023

When your teammate is a machine: 8 questions CISOs should be asking about AI

There are many potential benefits that can flow from incorporating AI into security technology, according to Rebecca Herold, an IEEE member and founder of The Privacy Professor consultancy: streamlined work that shortens project timelines, quicker decision-making, and faster problem discovery. But, she adds, many half-baked implementations are being deployed, and buyers "end up diving into the deep end of the AI pool without doing one iota of scrutiny about whether or not the AI they view as the HAL 9000 savior of their business even works as promised." She also warns that when "flawed AI results go very wrong, causing privacy breaches, bias, security incidents, and noncompliance fines, those using the AI suddenly realize that this AI was more like the dark side of HAL 9000 than they had even considered as being a possibility." To avoid having your AI teammate tell you, "I'm sorry, Dave, I'm afraid I can't do that," when you are asking for results that are accurate, non-biased, privacy-protective, and in compliance with data protection requirements, Herold advises that every CISO ask eight questions.


Generative AI needs humans in the loop for widespread adoption

Generative AI by itself has many positives, but it is currently a work in progress, and it will need to work with humans if it is to transform the world - which it is almost certain to do. This blending of man and machine is best described as “AI with humans in the loop”, and it is already being widely adopted by businesses that want to cut operating costs and improve customer service, but also realise that humans will be crucial if these objectives are to be achieved. One of the sectors embracing this new normal is financial journalism. Reuters managing director Sue Brooks announced that AI will be used to cover news stories and will create a “golden age” of news. Crucially, she also said it was vital there “was always a human in the loop to ensure total accuracy”. Reuters content now has automated time-coded transcripts and translation of many languages into English, part of the Reuters Connect service. Brooks went on to say that this meld would “free up brain power to be creative and put all these tools in your toolbox to create magical experiences for readers”.


AI chip adds artificial neurons to resistive RAM for use in wearables, drones

According to Weier Wan, a graduate researcher at Stanford University and one of the authors of the paper, published in Nature yesterday, NeuRRAM has been developed as an AI chip that greatly improves energy efficiency of AI inference, thereby allowing complex AI functions to be realized directly within battery-powered edge devices, such as smart wearables, drones, and industrial IoT sensors. "In today's AI chips, data processing and data storage happen in separate places – computing unit and memory unit. The frequent data movement between these units consumes the most energy and becomes the bottleneck for realizing low-power AI processors for edge devices," he said. To address this, the NeuRRAM chip implements a "compute-in-memory" model, where processing happens directly within memory. It also makes use of resistive RAM (RRAM), a memory type that is as fast as static RAM but is non-volatile, allowing it to store AI model weights. 


The CISO role has changed, and CISOs need to change with it

Perhaps the best way to improve security—and make the CISO’s job a little easier—is not reliant on technology. A change in culture is the best way to truly create an organization where security is top of mind. CISOs, who sit in upper management but are also part of the security team, are uniquely positioned to lead this change, both with other leaders and with those they lead. A security-first culture requires embedding security in everything a business does. Developers should be enabled to create secure code that is free from vulnerabilities and resistant to attacks as soon as it is written, rather than treating security as a consideration much later in the SDLC. Designated security champions from the developer ranks should lead this charge, acting as both coach and cheerleader. This approach means that security is not mandated from above but is part of the team’s DNA, backed up by management. This cannot be an overnight change, and it may be met with resistance. But the threat landscape is too complex, too advanced and too ubiquitous for any one person or even a small team to handle alone.


Hosting Provider Accused of Facilitating Nation-State Hacks

The allegations, whether true or not, are a reminder that cybercrime doesn't operate in a vacuum. Rather, there's a burgeoning service and support ecosystem. Services include initial access brokers who provide on-demand access to victims, botnet owners who facilitate malware-laden phishing attacks, and repacking services that make malware tougher to spot. They also include ransomware-as-a-service operators who lease their code to business partners, the affiliates who use it to infect victims, and cryptocurrency money laundering services that help criminals - operating online or off - convert their ill-gotten gains into cash. Online attackers require infrastructure for launching their attacks. Some make use of bulletproof service providers, which provide VPS and other types of hosting services in return for a promise, typically for a relatively high fee, that customers can do whatever they like. Halcyon's report alleges that Cloudzy functionally operates in a similar manner, due to a lack of proper oversight, including allowing cryptocurrency-using customers to be able to remain anonymous.


The tug-of-war between optimization and innovation in the CIO’s office

The downside of prioritizing optimization is the risk of overlooking opportunities for innovation that could have long-term impacts on the organization’s growth and relevance. Think game-changing new systems, such as AI, that increase supply chain efficiency, or automated steps in manufacturing that speed up productivity and reduce costs at the same time. Usually, the value of a business is directly defined by the innovations that drive it. Think about the services we use now, from food delivery to home sharing, with the draw being better customer experiences through innovation. Emphasizing innovation enables companies to stay ahead of the curve, attracting customers with cutting-edge products and services. ... These mistakes will kill a company. Taking resources away from innovation and spending them on making things work as they should removes business value. I think we’re going to see a great many businesses spend so much money to fix past mistakes that they’ll end up throwing in the towel.


Flight to cloud drives IaaS networking adoption

IDC describes IaaS cloud networking as a foundational networking layer that allows large enterprises and technology providers to connect data centers, colocation environments, and cloud infrastructure. With IaaS networking, the network infrastructure and services are scalable and available on-demand, provisioned and consumed just like any other cloud service. That makes this infrastructure more scalable and agile than traditional approaches to networking, according to IDC. Direct cloud connects/interconnects make up the largest segment of IaaS networking, accounting for more than half of all IaaS networking revenue. The four other major segments of the IaaS networking market are cloud WAN (transit), IaaS load balancing, IaaS service mesh, and cloud VPNs (to IaaS clouds), according to IDC. Cloud WAN, which includes cloud middle-mile and core transit networks, is the fastest-growing segment of IaaS networking, with a forecasted five-year compound annual growth rate of 112%, says IDC. IaaS service meshes are also expected to see strong growth, with a forecasted five-year compound annual growth rate of 68%.
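Those CAGR figures compound dramatically over the forecast window; a quick sketch of the arithmetic (the formula is standard, the rates are IDC's):

```python
# Compound annual growth: final = initial * (1 + rate) ** years
def growth_multiplier(cagr: float, years: int) -> float:
    """Total revenue multiplier implied by a CAGR over a number of years."""
    return (1 + cagr) ** years

# IDC's forecast rates quoted above
cloud_wan = growth_multiplier(1.12, 5)      # 112% CAGR over 5 years
service_mesh = growth_multiplier(0.68, 5)   # 68% CAGR over 5 years

print(f"Cloud WAN multiplier over 5 years:     {cloud_wan:.1f}x")
print(f"Service mesh multiplier over 5 years:  {service_mesh:.1f}x")
```

A 112% CAGR compounds to roughly a 43x revenue multiple in five years, which underlines how early-stage IDC considers this segment to be.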


The rise of Generative AI in software development

AI is accelerating the process of going from zero to one – it jumpstarts innovation, releasing developers from the need to start from scratch. But the 1-to-n problem remains – they start faster but will quickly have to deal with issues like security, governance, code quality, and managing the entire application lifecycle. The largest cost of an application isn't creating it – it's maintaining it, adapting it, and ensuring it will last. And if organisations were already struggling with tech debt (code left behind by developers who quit, or by vendors who sunset apps, creating monstrous workloads to take care of), now they'll also have to handle massive amounts of AI-generated code that their developers may or may not understand. As tempting as it may be for CIOs to assume they can train teams on how to prompt AI and use it to get any answers they need, it might be more efficient to invest in technologies that help you leverage Gen AI in ways that you can actually see, control and trust. This is why I believe that in the future, fundamentally, everything will be delivered on top of AI-powered low-code platforms.


Will law firms fully embrace generative AI? The jury is out | The AI Beat

On one hand, gen AI is shaking up the legal industry, with companies like Everlaw adding options to their product portfolio, while Thomson Reuters can integrate with Microsoft 365 Copilot to power legal content generation directly in Word. On the other hand, lawyers tend to be a conservative bunch — and in this case, attorneys would likely be wise to be cautious, with headlines like “New York lawyers sanctioned for using fake ChatGPT cases in legal brief” going viral. Another problem is that their clients may not feel comfortable with law firms using gen AI — a new survey found that one-third of consumer respondents said they’re against any use of gen AI in the legal field. ... But with Everlaw’s new gen AI now available in beta, lawyers can go beyond just clustering data at the aggregate level to querying, summarizing and otherwise extracting details from documents to get what they need. For example, the company says that while it typically takes hours for a legal professional to compose a statement of facts, it can now happen in about 10 seconds, delivering legal teams a rough draft to edit and fact check. 


Vulnerability Management: Best Practices for Patching CVEs

In a perfect world, you would analyze all CVEs first to determine the priority order for patching. But this just isn’t scalable due to the sheer number of vulnerabilities and how frequently CVEs are discovered. In reality, only a handful of CVEs actually affect your software. Of course, there’s no way to know for certain how a CVE affects your application until it has been analyzed, but because there are so many, including those from transitive dependencies, it is nearly impossible to analyze them all before new CVEs are discovered or within the window of a tight release schedule. Instead, we recommend you start by patching all critical and high-severity CVEs without analysis. ... Preventing, detecting and patching CVEs needs to be a shared responsibility between developers and security teams. It is not sustainable for security teams to bear the responsibility of managing and patching CVEs alone. Development teams can often be hesitant to push frequent updates for fear that updates to software libraries will create bugs in their software.
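The severity-first triage described above can be sketched in a few lines; the record fields here are illustrative, not any particular scanner's output format:

```python
# Severity-first triage: patch critical/high CVEs immediately without
# analysis, and queue everything else for later review.
PATCH_NOW = {"CRITICAL", "HIGH"}

def triage(cves):
    """Split CVE findings into an immediate-patch list and an analysis queue."""
    patch_immediately, analyze_later = [], []
    for cve in cves:
        if cve["severity"] in PATCH_NOW:
            patch_immediately.append(cve["id"])
        else:
            analyze_later.append(cve["id"])
    return patch_immediately, analyze_later

findings = [
    {"id": "CVE-2023-0001", "severity": "CRITICAL"},
    {"id": "CVE-2023-0002", "severity": "LOW"},
    {"id": "CVE-2023-0003", "severity": "HIGH"},
]
now, later = triage(findings)
print(now)    # critical and high findings go straight to patching
print(later)  # the rest wait for analysis
```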



Quote for the day:

"Our greatest battles are with our own minds." -- Jameson Frank

Daily Tech Digest - June 15, 2020

Can I read your mind? How close are we to mind-reading technologies?

Technology nowadays is already heavily progressing in artificial intelligence, so it doesn’t seem too farfetched. Humans have already developed brain-computer interface (BCI) technologies that can safely be used on humans. ... How would the government play a role in these mind-reading technologies? How would it affect who is eligible to use the technology? Don’t you think some unethical play would be prevalent? Because I sure do. I’m not inclined to believe these companies aren’t sending our data to other companies without our consent. I found the term “Neurorights” in a Vox article, “Brain-reading tech is coming. The law is not ready to protect us,” written by Sigal Samuel. It’s a good read, and I think she explores in depth how this would impact society from a privacy standpoint. She discusses having four new core rights protected within the law: the right to cognitive liberty, mental privacy, mental integrity, and psychological continuity. She mentions, “brain data is the ultimate refuge of privacy”. Once it’s collected, I believe you can’t get it back. There need to be strict laws enforced if this were to become a ubiquitous technology.


It's The End Of Infrastructure-As-A-Service As We Know It: Here's What's Next

Containers are the next step in the abstraction trend. Multiple containers can run on a single OS kernel, which means they use resources more efficiently than VMs. In fact, on the infrastructure required for one VM, you could run a dozen containers. However, containers do have their downsides. While they're more space efficient than VMs, they still take up infrastructure capacity when idle, running up unnecessary costs. To reduce these costs to the absolute minimum, companies have another choice: Go serverless. The serverless model works best with event-driven applications — applications where a finite event, like a user accessing a web app, triggers the need for compute. With serverless, the company never has to pay for idle time, only for the milliseconds of compute time used in processing a request. This makes serverless very inexpensive when a company is getting started at a small volume while also reducing operational overhead as applications grow in scale. Transitioning to containerization or a serverless model requires major changes to your IT teams' processes and structure and thoughtful choices about how to carry out the transition itself.
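The billing difference described above is easy to see with a back-of-the-envelope sketch; the prices below are hypothetical placeholders, not any provider's actual rates:

```python
# Hypothetical prices for illustration only -- real cloud pricing varies.
CONTAINER_HOURLY = 0.04            # $/hour, billed whether busy or idle
SERVERLESS_PER_MS = 0.0000000167   # $/ms of compute actually consumed

def container_monthly_cost(hours: float = 730) -> float:
    """An always-on container accrues cost every hour, idle or not."""
    return CONTAINER_HOURLY * hours

def serverless_monthly_cost(requests: int, avg_ms_per_request: float) -> float:
    """Serverless bills only for the milliseconds spent handling requests."""
    return SERVERLESS_PER_MS * requests * avg_ms_per_request

# A low-volume service: 100k requests/month at 50 ms each
print(f"Container:  ${container_monthly_cost():.2f}/month")
print(f"Serverless: ${serverless_monthly_cost(100_000, 50):.4f}/month")
```

At small volumes the serverless bill is a rounding error next to the always-on container, which is exactly the "never pay for idle time" point the excerpt makes; at sustained high volume the comparison can flip.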


9 Future of Work Trends Post-COVID-19

Before COVID-19, critical roles were viewed as roles with critical skills, or the capabilities an organization needed to meet its strategic goals. Now, employers are realizing that there is another category of critical roles — roles that are critical to the success of essential workflows. To build the workforce you’ll need post-pandemic, focus less on roles — which group unrelated skills — than on the skills needed to drive the organization’s competitive advantage and the workflows that fuel that advantage. Encourage employees to develop critical skills that potentially open up multiple opportunities for their career development, rather than preparing for a specific next role. Offer greater career development support to employees in critical roles who lack critical skills. ... After the global financial crisis, global M&A activity accelerated, and many companies were nationalized to avoid failure. As the pandemic subsides, there will be a similar acceleration of M&A and nationalization of companies. Companies will focus on expanding their geographic diversification and investment in secondary markets to mitigate and manage risk in times of disruption. This rise in complexity of size and organizational management will create challenges for leaders as operating models evolve.


South African bank to replace 12m cards after employees stole master key

"According to the report, it seems that corrupt employees have had access to the Host Master Key (HMK) or lower level keys," the security researcher behind Bank Security, a Twitter account dedicated to banking fraud, told ZDNet today in an interview. "The HMK is the key that protects all the keys, which, in a mainframe architecture, could access the ATM PINs, home banking access codes, customer data, credit cards, etc.," the researcher told ZDNet. "Access to this type of data depends on the architecture, servers and database configurations. This key is then used by mainframes or servers that have access to the different internal applications and databases with stored customer data, as mentioned above." "The way in which this key and all the other lower-level keys are exchanged with third-party systems has different implementations that vary from bank to bank," the researcher said. The Postbank incident is one of a kind: bank master keys are a bank's most sensitive secret, are guarded accordingly, and are very rarely compromised, let alone outright stolen.
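To see why a compromised master key cascades, here is a generic sketch of a key hierarchy using HMAC-based derivation; it illustrates the structural risk the researcher describes, not the actual HSM/mainframe scheme Postbank or any bank uses:

```python
import hashlib
import hmac

# Illustrative only: every lower-level key is derived from the master
# key, so whoever holds the master key can re-derive everything
# beneath it -- PIN keys, card keys, and so on.
def derive_key(master_key: bytes, purpose: str) -> bytes:
    """Derive a purpose-specific subordinate key from a master key."""
    return hmac.new(master_key, purpose.encode(), hashlib.sha256).digest()

master = b"host-master-key-material"   # in reality this lives in an HSM
atm_pin_key = derive_key(master, "atm-pin-verification")
card_key = derive_key(master, "card-issuance")

# An attacker holding `master` needs nothing else to recover any
# subordinate key:
assert derive_key(master, "atm-pin-verification") == atm_pin_key
```

This is why rotating a leaked master key forces re-issuing everything derived from it, as in Postbank's 12-million-card replacement.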


What matters most in an Agile organizational structure

An Agile organizational strategy that works for one organization won't necessarily work for another. The chapter excerpt includes a Spotify org chart, which the authors describe as, "Probably the most frequently emulated agile organizational model of all." But an Agile model that serves as a standard of success won't necessarily replicate well in another organization. Agile software developers aim to better meet customer needs. To do so, they need to prioritize, release and adapt software products more easily. Unlike the Spotify-inspired tribe structure, Agile teams should remain located close to the operations teams that will ultimately support and scale their work, according to the authors. This model, they argue in Doing Agile Right, promotes accountability for change and willingness to innovate on the business side. Any Agile initiative should follow the sequence of "test, learn, and scale." People at the top levels must accept new ideas, which will drive others to accept them as well. Then, innovation comes from the opposite direction. "Agile works best when decisions are pushed down the organization as far as possible, so long as people have appropriate guidelines and expectations about when to escalate a decision to a higher level."


What is process mining? Refining business processes with data analytics

Process mining is a methodology by which organizations collect data from existing systems to objectively visualize how business processes operate and how they can be improved. Analytical insights derived from process mining can help optimize digital transformation initiatives across the organization. In the past, process mining was most widely used in manufacturing to reduce errors and physical labor. Today, as companies increasingly adopt emerging automation and AI technologies, process mining has become a priority for organizations across every industry. Process mining is an important tool for organizations that are committed to continuously improving IT and business processes. Process mining begins by evaluating established IT or business processes to find repetitive tasks that can be automated using technologies such as robotic process automation (RPA), artificial intelligence and machine learning. By automating repetitive or mundane tasks, organizations can increase efficiency and productivity — and free up workers to spend more time on creative or complex projects. Automation also helps reduce inconsistencies and errors in process outcomes by minimizing variances. Once an IT or business process is developed, it’s important to consistently check back to ensure the process is delivering appropriate outcomes — and that’s where process mining comes in.
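The core idea, reconstructing how a process actually flows from event-log data, can be sketched minimally; the log and activity names are invented for the example, and real process-mining tools build far richer models:

```python
from collections import Counter

# A tiny event log: each case is an ordered list of recorded activities.
event_log = {
    "case-1": ["receive", "approve", "ship"],
    "case-2": ["receive", "approve", "ship"],
    "case-3": ["receive", "reject"],
}

# Process discovery in miniature: count how often each activity
# directly follows another across all cases.
transitions = Counter()
for activities in event_log.values():
    for a, b in zip(activities, activities[1:]):
        transitions[(a, b)] += 1

for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n}")
```

Even this crude frequency map reveals the dominant path (receive, approve, ship) and the exception path (receive, reject), which is the kind of objective visualization the excerpt describes.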


How to improve cybersecurity for artificial intelligence

One of the major security risks to AI systems is the potential for adversaries to compromise the integrity of their decision-making processes so that they do not make choices in the manner that their designers would expect or desire. One way to achieve this would be for adversaries to directly take control of an AI system so that they can decide what outputs the system generates and what decisions it makes. Alternatively, an attacker might try to influence those decisions more subtly and indirectly by delivering malicious inputs or training data to an AI model. For instance, an adversary who wants to compromise an autonomous vehicle so that it will be more likely to get into an accident might exploit vulnerabilities in the car’s software to make driving decisions themselves. However, remotely accessing and exploiting the software operating a vehicle could prove difficult, so instead an adversary might try to make the car ignore stop signs by defacing them in the area with graffiti, so that the computer vision algorithm can no longer recognize them as stop signs. This process, by which adversaries cause AI systems to make mistakes by manipulating inputs, is called adversarial machine learning.
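A toy sketch against a linear classifier shows the mechanism: a small, targeted change to the input flips the output. The weights and features are made up for illustration, and real attacks on vision models are far more involved:

```python
# Toy illustration of adversarial inputs (in the spirit of the fast
# gradient sign method), not an attack on any real vision model.
def sign(v: float) -> float:
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def predict(w, b, x) -> int:
    """Linear classifier: 1 = 'stop sign', 0 = 'not a stop sign'."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else 0

w, b = [1.0, -2.0, 0.5], 0.1   # made-up trained weights
x = [2.0, 0.5, 1.0]            # a clean input the model classifies correctly

# Nudge each feature against the direction supporting the current
# prediction -- the input barely changes, the prediction flips.
eps = 1.5
x_adv = [xi - eps * sign(wi) for xi, wi in zip(x, w)]

print(predict(w, b, x))      # 1: recognized
print(predict(w, b, x_adv))  # 0: same scene, prediction flipped
```

Graffiti on a stop sign plays the role of `x_adv` here: a perturbation engineered to push the classifier's score across its decision boundary.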


Using a DDD Approach for Validating Business Rules

For modeling commands that can be executed by clients, we need to identify them by assigning them names. For example, it can be something like MakeReservation. Notice that we are moving these design definitions towards a middle point between software design and business design. It may sound trivial, but when it’s specified, it helps us to understand a system design more efficiently. The idea connects with the HCI (human-computer interaction) concept of designing systems with a task in mind; the command helps designers to think about the specific task that the system needs to support. The command may have additional parameters, such as date, resource name, and description of the usage. ... Production rules are the heart of the system. So far, the command has traveled through different stages which should ensure that the provided request can be processed. Production rules specify the actions the system must perform to achieve the desired state. They deal with the task a client is trying to accomplish. Using the MakeReservation command as a reference, they make the necessary changes to register the requested resource as reserved.
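A minimal sketch of the MakeReservation command with a validation rule and a production rule; the class and field names are assumptions for the example, not from a real system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class MakeReservation:
    """A named command with its parameters: resource, date, description."""
    resource_name: str
    day: date
    description: str

class ReservationBook:
    def __init__(self):
        self._reserved = set()

    def handle(self, cmd: MakeReservation) -> str:
        # Validation rule: the command must reference a free resource.
        key = (cmd.resource_name, cmd.day)
        if key in self._reserved:
            return "rejected: already reserved"
        # Production rule: register the requested resource as reserved.
        self._reserved.add(key)
        return "reserved"

book = ReservationBook()
cmd = MakeReservation("room-a", date(2020, 6, 20), "team meeting")
print(book.handle(cmd))  # reserved
print(book.handle(cmd))  # rejected: already reserved
```

Naming the command makes the client's task explicit, and the handler separates checking whether the request can be processed from the state change that fulfills it.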


7 Ways to Reduce Cloud Data Costs While Continuing to Innovate

This is a difficult time for enterprises, which need to tightly control costs amid the threat of a recession while still investing sufficiently in technology to remain competitive. ... This is especially true of analytics and machine learning projects. Data lakes, ideally suited for machine learning and streaming analytics, are a powerful way for businesses to develop new products and better serve their customers. But with data teams able to spin up new projects in the cloud easily, infrastructure must be managed closely to ensure every resource is optimized for cost and every dollar spent is justified. In the current economic climate, no business can tolerate waste. But enterprises aren’t powerless. Strong financial governance practices allow data teams to control and even reduce their cloud costs while still allowing innovation to happen. Creating appropriate guardrails that prevent teams from using more resources than they need and ensuring workloads are matched with the correct instance types to optimize savings will go a long way to reducing waste while ensuring that critical SLAs are met.
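A guardrail of the kind described, matching workloads to approved instance types, can be as simple as an allowlist check; the workload classes and instance names here are assumptions for illustration:

```python
# Financial-governance guardrail sketch: each workload class may only
# provision instance types its team has justified for cost and SLA.
ALLOWED = {
    "batch-analytics": {"m5.large", "m5.xlarge"},
    "ml-training": {"p3.2xlarge"},
}

def check_request(workload: str, instance_type: str) -> bool:
    """Return True if the requested instance type is within the guardrail."""
    return instance_type in ALLOWED.get(workload, set())

print(check_request("batch-analytics", "m5.large"))    # within guardrail
print(check_request("batch-analytics", "p3.2xlarge"))  # flag for review
```

In practice such checks run in provisioning pipelines or cloud policy engines, but the principle is the same: deny by default, approve per workload.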


Who Should Lead AI Development: Data Scientists or Domain Experts?

To lead these efforts ethically and effectively, Chraibi suggested data scientists such as himself should be the driving force. “The data scientists will be able to give you an insight into how bad it will be using a machine-learning model” if ethical considerations are not taken into account, he said. But Paul Moxon, senior vice president for data architecture at Denodo Technologies, said his experience working with AI development in the financial sector has given him a different perspective. “The people who raised the ethics issues with banks—the original ones—were the legal and compliance team, not the technologists,” he said. “The technologists want to push the boundaries; they want to do what they’re really, really good at. But they don’t always think of the inadvertent consequences of what they’re doing.” In Moxon’s opinion, data scientists and other technology-focused roles should stay focused on the technology, while risk-centric roles like lawyers and compliance officers are better suited to considering broader, unintended effects. “Sometimes the data scientists don’t always have the vision into how something could be abused. Not how it should be used but how it could be abused,” he said.



Quote for the day:

"Only the disciplined ones in life are free. If you are undisciplined, you are a slave to your moods and your passions." -- Eliud Kipchoge

Daily Tech Digest - February 25, 2020

5G's impact: Advanced connectivity, but terrifying security concerns


Despite the enthusiasm, professionals are also concerned about some of the negative aspects of 5G, specifically security and cost. The top barriers to adopting 5G in the next three years included security concerns (35%) and upfront investment (31%), the report found. The relationship between 5G and security is complex. Overall, the majority of respondents (68%) do believe 5G will make their businesses more secure. However, security challenges are also inherent to the network infrastructure, according to the report. These concerns involve user privacy (41%), the number of connected devices (37%), service access (34%), and supply chain integrity (29%). On the connected devices front, some 74% of respondents said they are worried that having more connected devices will bring more avenues for data breaches. With that said, the same percentage of respondents understand that adopting 5G means they will need to redefine security policies and procedures. To prepare for both security and cost challenges associated with 5G, the report recommended users seek external help. The partners businesses will most likely work with include software and services companies (44%), cloud companies (43%), and equipment providers (31%). 


What if 5G fails? A preview of life after faith in technology


"If it gets to a point where it's a broad decoupling of the developed from the emerging economies," said Sec. Lew, "that's not good for anyone. The growth of emerging economies would not be very impressive if they didn't have very active, robust trading relationships with developed economies. And the costs in developed economies would go up considerably, which means that the impact on consumers would be quite dramatic." "We know, from the early days when there was CDMA and GSM," remarked Greg Guice, senior vice president at Washington, DC-based professional consultancy McGuireWoods, "that made it very difficult to sell equipment on a global basis. That not only hurt consumers, but it hurt the pace of technology." He continued: "I think what the companies that are building the equipment, and seeking to deploy the equipment, are trying to figure out is, in a world where there may be fragmentation, how do we manage this? I don't see people Balkanizing into their own camps; I think everybody is trying to preserve, as best they can, international harmonization of a 5G platform. Those efforts are in earnest."


Greenpeace takes open-source approach to finish web transformation


“The vision is to help people take action on behalf of the planet,” said Laura Hilliger, a concept architect at Greenpeace who is a leading member of the Planet 4 project. “We want to provide a space that helps people understand how our ecological endeavours are successful, and to show that Greenpeace’s work is successful because of people working collectively.” She met Red Hat representatives after work was already underway on the project in May 2018, which culminated in consultants, technical architects and designers from the company coming in to do a “design sprint” with Greenpeace exactly a year later. This helped Red Hat better understand Planet 4 users and how they interact with the platform, as well as the challenges of integration and effectively visualising data. Hilliger said variations in the tech stacks deployed across Greenpeace’s 27 national and regional offices, on top of its 50-plus websites and platforms, had created a complex data landscape that made integrations difficult.


Evolution of the data fabric


Personally, the fabric concept also began to change my thinking about infrastructure design. For too long, the focus was on technology, infrastructure and location, which would then be delivered to a business upon which it would place its data. However, the issue with this was that the infrastructure could then limit how we used our data to solve business challenges. Data fabric changes that focus, building our strategy based on our data and how we need to use it: a focus on information and outcomes, not technology and location. Over time, as our data strategies evolved with more focus on data and outcomes, it became clear that a consistent storage layer, while a crucial part of a modern data platform design, does not in itself deliver all we need. A little while ago I wrote a series of articles about Building a Modern Data Platform, which described how a platform is multi-layered, requiring not just consistent storage but also the intelligence to understand our data as it is written, provide insight, apply security, and do these things immediately across our enterprise.


Legal Tech May Face Explainability Hurdles Under New EU AI Proposals


Horrigan noted the transparency language in the European Commission’s proposal is similar to the transparency principles outlined in the EU’s General Data Protection Regulation (GDPR). While the European Commission is still drafting its AI regulations, legal tech companies have fallen under the scope of the GDPR since mid-2018. Legal tech companies have also fielded questions regarding predictive coding’s accuracy and transparency with technology-assisted review (TAR), Horrigan added. TAR has become increasingly accepted by courts after then-U.S. Magistrate Judge Andrew Peck of the Southern District of New York granted the first approval of TAR in 2012. In Peck’s order, he discussed predictive coding’s transparency that provides clarity regarding AI-powered software’s “black box.” “We’ve addressed the black box before with technology-assisted review and we will do it again with other forms of artificial intelligence. The black box issue can be overcome,” Horrigan said. However, Hudek disagreed. While Hudek said the proposed regulation doesn’t make him hesitant to develop new AI-powered features to his platform, it does make it more challenging.


Thinking About ‘Ethics’ in the Ethics of AI

Ethics by Design is “the technical/algorithmic integration of reasoning capabilities as part of the behavior of [autonomous AI]”. This line of research is also known as ‘machine ethics’. The aspiration of machine ethics is to build artificial moral agents, which are artificial agents with ethical capacities that can thus make ethical decisions without human intervention. Machine ethics thus answers the value alignment problem by building autonomous AI that by itself aligns with human values. To illustrate this perspective with the examples of AVs and hiring algorithms: researchers and developers would strive to create AVs that can reason about the ethically right decision and act accordingly in scenarios of unavoidable harm. Similarly, the hiring algorithms are supposed to make non-discriminatory decisions without human intervention. Wendell Wallach and Colin Allen classified three types of approaches to machine ethics in their seminal book Moral Machines.


Cisco goes to the cloud with broad enterprise security service

Cisco describes the new SecureX service as offering an open, cloud-native system that will let customers detect and remediate threats across Cisco and third-party products from a single interface. IT security teams can then automate and orchestrate security management across enterprise cloud, network and applications and end points. “Until now, security has largely been piecemeal with companies introducing new point products into their environments to address every new threat category that arises,” wrote Gee Rittenhouse, senior vice president and general manager of Cisco’s Security Business Group, in a blog about SecureX. “As a result, security teams that are already stretched thin have found themselves managing massive security infrastructures and pivoting between dozens of products that don’t work together and generate thousands of often conflicting alerts. In the absence of automation and staff, half of all legitimate alerts are not remediated.” Cisco pointed to its own 2020 CISO Benchmark Report, also released this week, as more evidence of the need for better, more tightly integrated security systems.


Evolution of Infrastructure as a Service


Some would say that IaaS, SaaS, and PaaS are part of a family tree. SaaS is one of the more widely known as-a-service models, in which cloud vendors host business applications and deliver them to customers online. It enables customers to take advantage of the service without maintaining the infrastructure required to run software on-premises. In the SaaS model, customers pay for a specific number of licenses and the vendor manages the behind-the-scenes work. The PaaS model is more focused on application developers and providing them with a space to develop, run, and manage applications. PaaS models do not require developers to build additional networks, servers or storage as a starting point for developing their applications. ... IaaS is now enabling more disruption across all markets and industries, as the same capabilities available to larger companies are now also available to the smallest startup in a garage. This includes advances in AI and Machine Learning (as a service), data analytics, serverless technologies, IoT and much more. It is also pushing large companies to be as agile as a startup.


AI Regulation: Has the Time Arrived?


Karen Silverman, a partner at international business law firm Latham & Watkins, noted that regulation risks include stifling beneficial innovation, selecting business winners and losers without any basis, and making it more difficult for start-ups to achieve success. She added that ineffective, erratic, and uneven regulatory efforts or enforcement may also lead to unintended ethics issues. "There's some work [being done] on transparency and disclosure standards, but even that is complicated, and ... to get beyond broad principles, needs to be done on some more industry- or use-case specific basis," she said. "It’s probably easiest to start with regulations that take existing principles and read them onto new technologies, but this will leave the challenge of regulating the novel aspects of the tech, too." On the other hand, a well-designed regulatory scheme that zeroes in on bad actors and doesn't overregulate the technology would likely mark a positive change for AI and its supporters, Perry said.


Functional UI - a Model-Based Approach


User interfaces are reactive systems which are specified by the relation between the events received by the user interface application and the actions the application must undertake on the interfaced systems. Functional UI is a set of implementation techniques for user interface applications which emphasizes clear boundaries between the effectful and purely functional parts of an application. User interface behavior can be modeled by state machines that, on receiving events, transition between the different behavior modes of the interface. A state machine model can be visualized intuitively and economically in a way that is appealing to diverse constituencies (product owners, testers, developers), and surfaces design bugs earlier in the development process. Having a model of the user interface makes it possible to auto-generate both the implementation and the tests for the user interface, leading to more resilient and reliable software. Property-based testing and metamorphic testing leverage the auto-generated test sequences to find bugs without having to define the complete and exact response of the user interface to a test sequence. Such testing techniques have found 100+ new bugs in two popular C compilers (GCC and LLVM).
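As a minimal sketch of the approach described above (the toggle states, events, and action names are invented for illustration, not taken from the article), the state machine is a pure transition function, while an effectful shell executes the actions it returns:

```python
# Hypothetical two-state UI model: a pure (state, event) -> (state, actions)
# transition table, kept separate from the code that performs effects.

TRANSITIONS = {
    ("off", "toggle"): ("on",  ["render_on"]),
    ("on",  "toggle"): ("off", ["render_off"]),
}

def step(state, event):
    """Pure core: no I/O, just a lookup; unknown events leave state unchanged."""
    return TRANSITIONS.get((state, event), (state, []))

def run(events, state="off"):
    """Effectful shell: feeds events through the pure core, collects actions."""
    actions = []
    for event in events:
        state, acts = step(state, event)
        actions.extend(acts)  # a real UI would execute these effects here
    return state, actions

# Because `step` is pure, test sequences can be generated and replayed freely.
print(run(["toggle", "toggle", "toggle"]))
```

Keeping `step` free of effects is what lets property-based tests feed it arbitrary event sequences without touching a real display.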




Quote for the day:


"There is no 'one' way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer


Daily Tech Digest - February 03, 2020

Why UK's Huawei decision leaves the fate of global 5G wireless in US hands

"The UK has been doing business with Huawei for a long time through Openreach. They had been operating, with oversight, in the country for years," noted Doug Brake, who directs broadband and spectrum policy for Washington, DC-based Information Technology & Innovation Foundation. Openreach, to which Brake refers, is the division of top British telco BT responsible for deploying fiber optic infrastructure. It had been partnering mainly with Huawei until last November, when it began an evaluation process in search for additional partners. "So for the UK to come out and publicly brand them as a high-risk vendor, cordon them off to only 35 percent of the access network — not even let them into the core network," said Brake, "really puts Huawei in a tight box." For its part, Huawei did what it could Tuesday to thwart any possible interpretation of tightness, or a box. Omitting any mention of security or exploiting back doors in the infrastructure, Huawei Vice President Victor Zhang issued a statement, reading in part: "This evidence-based decision will result in a more advanced, more secure, and more cost-effective telecoms infrastructure that is fit for the future..."



Lex: An Optimizing Compiler for Regular Expressions

This perhaps isn't the fastest C# NFA regex engine around yet, but it does support Unicode and lazy expressions, and is getting faster due to the optimizing compiler. A Pike Virtual Machine is a technique for running regular expressions that relies on input programs to dictate how to match. The VM is an interpreter that runs the bytecode that executes the matching operation. The bytecode itself is compiled from one or more regular expressions. Basically, a Pike VM is a little cooperatively scheduled concurrent VM that runs that bytecode. It has some cleverness in it to avoid backtracking. It's potentially extremely powerful and very extensible, but this one is still a baby and very much a work in progress. The VM itself is solid, but the regular expressions could use a little bit of shoring up, and it could use some more regex features, like anchors.
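To make the idea concrete, here is a hedged, minimal sketch of a Pike-style VM in Python (the instruction set and program encoding are invented for illustration; the actual engine is written in C# and is far more featureful). Threads advance over the input in lockstep, which is what avoids backtracking:

```python
# Minimal Pike-style VM sketch. The "bytecode" is a list of tuples; threads
# are just program counters, advanced together one input character at a time.

CHAR, SPLIT, JMP, MATCH = range(4)

def follow(program, threads):
    """Close a thread set over the epsilon instructions SPLIT and JMP."""
    stack, seen = list(threads), set()
    while stack:
        pc = stack.pop()
        if pc in seen:
            continue
        seen.add(pc)
        op = program[pc]
        if op[0] == JMP:
            stack.append(op[1])
        elif op[0] == SPLIT:
            stack.extend([op[1], op[2]])
    # Keep only states that consume input or report a match.
    return {pc for pc in seen if program[pc][0] in (CHAR, MATCH)}

def run(program, text):
    """Return True if the bytecode program matches a prefix of text."""
    threads = {0}
    for ch in text:
        threads = follow(program, threads)
        if any(program[pc][0] == MATCH for pc in threads):
            return True
        # Every surviving thread consumes exactly this character: no backtracking.
        threads = {pc + 1 for pc in threads
                   if program[pc][0] == CHAR and program[pc][1] == ch}
    threads = follow(program, threads)
    return any(program[pc][0] == MATCH for pc in threads)

# Hand-compiled bytecode for the regex a+b: one or more 'a', then 'b'.
prog = [
    (CHAR, 'a'),     # 0
    (SPLIT, 0, 2),   # 1: loop back for another 'a', or fall through
    (CHAR, 'b'),     # 2
    (MATCH,),        # 3
]
print(run(prog, "aab"))
```

Because the thread set is bounded by the program size, the worst-case running time is linear in the input, unlike a backtracking matcher.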


Google launches open-source security key project, OpenSK


FIDO is a standard for secure online access via a browser that goes beyond passwords. There are three modern flavours of it: Universal Second Factor (U2F), Universal Authentication Factor (UAF), and FIDO2. UAF handles biometric authentication, while U2F lets people authenticate themselves using hardware keys that you can plug into a USB port or tap on a reader. That works as an extra layer on top of your regular password. FIDO2 does away with passwords altogether, pairing a hardware key with an authentication protocol called WebAuthn. This uses the digital token on your security key to log straight into a compatible online service. To date, Yubico and Google have both been popular providers of FIDO-compatible keys, but they’ve done so using their own proprietary hardware and software. Google hopes that by releasing an open-source version of FIDO firmware, it will accelerate broader adoption of the standard. Google has designed the OpenSK firmware to work on a Nordic dongle, which is a small uncased board with a USB connector on it.


Early use of AI for finance focused on operations, analytics


Anecdotal evidence suggests AI excels at financial processes that involve repetitive operations on large volumes of data. "It will eliminate the need for people to do a lot of the boring, repetitive work that they're doing today," Kugel said. "It will make it possible for systems to wrap themselves around the habits and requirements of the user, as opposed to the user having to adapt how they work within the limitations of technology." Data quality will also improve and, with it, the quality of analytics as AI gets better at flagging errors for people to correct, Kugel said. AI is also helping with tedious accounts payable tasks, such as confirming that goods were received and that an invoice contains the right items, Tay said. Companies that use automated payments are deploying machine learning to scan payment patterns for deviations. "If the machine learning algorithm tells them there is a high probability that the goods have been received and everything is in order with that specific invoice, they'll pay it immediately," Tay said.
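As a purely illustrative sketch (the threshold, data, and function names are assumptions, not from the article), the deviation check described above might amount to flagging invoice amounts far outside a vendor's historical pattern before auto-paying:

```python
# Hypothetical payment-pattern check: approve automatic payment only when the
# amount sits within a few standard deviations of the vendor's past invoices.
from statistics import mean, stdev

def auto_pay_ok(history, amount, threshold=3.0):
    """Return True if `amount` fits the historical pattern in `history`."""
    if len(history) < 2:
        return False  # not enough history to judge; route to a human
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount == mu  # perfectly regular vendor: exact match only
    return abs(amount - mu) / sigma <= threshold

history = [102.0, 98.5, 101.0, 99.5, 100.0]
print(auto_pay_ok(history, 101.5))  # close to the usual pattern
print(auto_pay_ok(history, 950.0))  # a sharp deviation, held for review
```

Production systems would use richer features than a single z-score, but the shape of the decision, score the deviation and gate the automatic payment, is the same.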


SaaS, PaaS, IaaS: The differences between each and how to pick the right one

In theory, PaaS, IaaS and SaaS are designed to do two things: cut costs and free organizations from the time and expense of purchasing equipment and hosting everything on-premises, DiDio said. "However, cloud computing services are not a panacea. Corporate enterprises can't just hand everything off to a third-party cloud provider and forget about them. There's too much at stake." Internal IT departments must remember what DiDio calls the "three Cs: communication, collaboration and cooperation," which she said are all essential for successful business outcomes and uninterrupted, smooth, efficient daily operational transactions. "When properly deployed and maintained, IaaS is highly flexible and scalable," DiDio said. "It's easily accessed by multiple users. And it's cost effective." IaaS is beneficial to businesses of all types and sizes, she said. "It provides complete and discretionary control over infrastructure… Many organizations find that they can slash their hardware costs by 50% or more using IaaS." However, IaaS "requires a mature operations model and rigorous security stacks including understanding cloud provider technologies," noted Vasudevan. IaaS also "requires skill and competency in resource management."


Startup uses machine learning to support GDPR’s right to be forgotten

“Every user has over 350 companies holding sensitive data on them, which is quite shocking,” says Ringel. “Not only that, but this number is growing by eight new companies a month, which means our personal footprint is highly dynamic and changing all the time.” According to Ringel, the conversation about data privacy needs to focus much more on data ownership. “Privacy is all about putting fences around us, preventing our personal information being shared with other people,” he says. “But the problem with that is that we miss out on the fun – every day we use online services and share our data with companies because it is convenient and efficient. Now, with GDPR, we can actually take our data back whenever we choose.” Once users know where their data is, Mine helps them reclaim it by submitting automated right-to-be-forgotten requests to the companies with the click of a button. For users on the trial version of Mine, the startup will email the request to the company and copy the user in on follow-up communications.


Serverless Cloud Computing Will Drive Explosive Growth In AI-Based Innovation

As cloud computing has advanced, more companies have made the transition to the cloud-based platform-as-a-service model (PaaS), which delivers computing and software tools over the internet. PaaS can be scaled up or down as needed, which reduces up-front costs and allows you to focus on developing software applications instead of dealing with hardware-oriented tasks. To support this shift toward the PaaS cloud, public cloud companies have begun heavily investing in building or acquiring serverless components that have pre-built unit functionality. These out-of-the-box tools allow organizations to test new concepts, iterate and evaluate without taking on high risk or expense. In the past, only large companies with considerable resources could afford to experiment with AI-based innovation. Now startups or small teams within larger enterprises have access to cloud-based, prepackaged algorithms offering different AI models that can fast-track innovation. Let’s explore practical examples of how this trend helps democratize innovation in artificial intelligence by minimizing the time, money and resources needed to get started.


The Past, Present And Future Of Oracle’s Multi-Billion Dollar Cloud Bet

Larry had more confidence than I did. He was sure of it. I was more cautiously optimistic. We started running our little business on QuickBooks because we hadn’t built our system yet. When our system got to the point where we could run our own business on it, I imported our QuickBooks file and saw our business in a browser at home. I was at home looking at all the key metrics of how we were spending, and how we were growing, and who our employees were, all there in the browser. That’s when I was sure it was going to work because I knew we were first to do that. I felt that with Larry’s strong backing we’d be able to reach a lot of companies, and that’s what happened. He was sure from the very beginning. It really was his idea to do it as a web-based application. He was the pioneer, and this was before Salesforce.com started, which he was also involved with. He wanted to do accounting, and I encouraged us to move beyond just accounting, and together we came up with this concept of the suite, and thus the name of the company, ultimately, became NetSuite.


Rogue IoT devices are putting your network at risk from hackers


Security standards for IoT devices aren't as stringent as they are for other products such as smartphones or laptops, so in many cases IoT manufacturers have been known to ship highly insecure devices – and sometimes these products never receive any sort of patch, either because the user isn't aware of how to apply it or because the company never issues one. A large number of connected devices are also easily discoverable with the aid of IoT search engine Shodan. Not only does this leave IoT products potentially vulnerable to being compromised and roped into a botnet, but insecure IoT devices connected to corporate networks could enable attackers to use something as trivial as a fitness tracker or a smartwatch as an entry point into the network, and use it as a means of further compromise. "Personal IoT devices are easily discoverable by cybercriminals, presenting a weak entry point into the network and posing a serious security risk to the organisation. Without a full view of the security policies of the devices connected to their network, IT teams are fighting a losing battle to keep the ever-expanding network perimeter safe," said Malcolm Murphy, Technical Director for EMEA at Infoblox.


Europe’s new API rules lay groundwork for regulating open banking


The EU and the U.K. have both passed laws that explicitly require their banks to create application programming interfaces and open those APIs to third-party developers. And banks in the U.S. should take notice. These new laws are paving the way to standardization for open banking, which could lead to rapid innovation and a competitive advantage for the European banking system. These new laws are also friendlier to fintech companies, as they streamline access to a growing network of bank data. Fintechs within the U.S. must create individual data sharing agreements with each bank partner, and the negotiations for each partnership can be resource intensive. However, in the EU a fintech can get access to all bank APIs by registering as an account information service provider (AISP) or payment initiation service provider (PISP). This could create a situation where the U.S. may lose out on technology investments and see innovative financial professionals leave the nation to work in the rapidly advancing open-banking environment within the EU.



Quote for the day:


"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr


Daily Tech Digest - January 17, 2020

Dell Optiplex 7070 Ultra: Modularity at a price


The main trick with the Optiplex 7070 Ultra, and the reason it is designed as a thin brick, is that it fits in a specially designed monitor stand that attaches to Dell monitors. This feature is touted as being a desktop space saver, which it certainly is, but do not think that it is a cableless affair. We tested this Optiplex with a Dell UltraSharp 24 USB-C monitor -- which is a serviceable, thin-bezel 1920x1080 monitor that retails for AU$340, and if it had a higher resolution, it would be outstanding -- and found the Optiplex to be a half-way house between a regular desktop and an all-in-one. For instance, a USB-C cable was still needed to make the connection between the unit and the monitor, both devices needed their own power cables and bricks, and connecting headphones meant reaching behind the monitor to find the audio jack and hoping they have enough lead to allow you to relax in your seat. Consolidating things like power connections would put it much closer to the realm of an all-in-one, while probably making it increasingly complex, but simple changes like adding reachable ports and audio jacks into the stand to face the user would help with everyday usability.



Silicon’s Final Days? An Exclusive Chat With Nobel Prize Winner Sir Konstantin Novoselov

Novoselov, who grew up in a very heavy engineering environment, adds that the Nobel has opened opportunities in terms of collaboration within the industry itself and has “promoted huge interest”. “As we see now that interest paid back in terms of creation of new applications.” Today, graphene powers many disruptive technologies and holds the potential to open up many more new markets, particularly next-generation electronics: faster transistors, semiconductors, bendable phones, to name a few. But what is graphene, you ask? Graphene was originally observed in electron microscopes in 1958 and as Novoselov explains, it’s both an interesting and very simple material. “It’s only carbon atoms,” he explains. “Carbon is one of the lightest, and one of the simplest atoms you can think about.” Graphene is, to date, the strongest and thinnest material known to science. In fact, it is 100 times stronger than steel despite its almost 100% transparency and flexibility. The material has also proved to be a good thermal and electrical conductor, and is known to have unique quantum properties.


Scottish police roll out controversial data extraction technology


“We’re committed to providing the best possible service to victims and witnesses of crime. This means we must keep pace with society. People of all ages now lead a significant part of their lives online and this is reflected in how we investigate crime and the evidence we present to courts,” said deputy chief constable Malcolm Graham. He added that digital devices are increasingly involved in investigations, placing ever higher demand on digital forensic examination teams. “Current limitations, however, mean the devices of victims, witnesses and suspects can be taken for months at a time, even if it later transpires that there is no worthwhile evidence on them,” said Graham. “By quickly identifying devices which do and do not contain evidence, we can minimise the intrusion on people’s lives and provide a better service to the public.”


How to protect your organization and employees from conversation hijacking

Cybercriminals use a variety of tricks to try to convince unsuspecting users to reveal sensitive and valuable information. Phishing is a well-known and general method. A more specific and direct technique gaining traction is conversation hijacking. By impersonating employees or other trusted individuals and inserting themselves in a message thread, criminals try to obtain money or financial information. But there are ways to protect your company and employees from this type of attack, according to a new report from Barracuda Networks. Here's how the process typically works, according to Barracuda. Cybercriminals start by impersonating an organization's domain. Through domain impersonation or spoofing, attackers send emails to employees with phony domain names that appear legitimate or create websites with altered names. Phony domain names can be concocted and registered by slightly adjusting certain characters in the actual name or changing the Top-Level-Domain (TLD), for example, replacing .com with .net.
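As an illustrative sketch (the trusted domain and distance threshold are assumptions, not from Barracuda's report), a mail gateway could flag the two impersonation tricks described above, slightly adjusted characters and swapped TLDs, with a simple edit-distance check:

```python
# Hypothetical lookalike-domain check for inbound mail: flag sender domains
# that are near-misses of our own, e.g. "exarnple.com" or "example.net".

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def is_suspicious(sender_domain, trusted="example.com", max_dist=2):
    """True for a near-miss of the trusted domain that is not the domain itself."""
    if sender_domain == trusted:
        return False
    name = sender_domain.rpartition(".")[0]
    trusted_name = trusted.rpartition(".")[0]
    if name == trusted_name:  # same name, different TLD (.com -> .net)
        return True
    return edit_distance(sender_domain, trusted) <= max_dist

print(is_suspicious("exarnple.com"))  # character swap: rn imitates m
print(is_suspicious("example.net"))   # TLD swap
print(is_suspicious("example.com"))   # the real domain is fine
```

Real gateways combine checks like this with sender authentication (SPF, DKIM, DMARC); the edit-distance heuristic alone is only a first filter.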


Network automation with Python, Paramiko, Netmiko and NAPALM


Network automation with Python and automation libraries can enable simplified communication with network devices. In this article, we take a look at three network automation libraries: Paramiko, Netmiko and NAPALM, or Network Automation Programmability Abstraction Layer with Multivendor support. Each library builds on its predecessor to provide greater layers of abstraction that enable users to build more efficient automation systems. Paramiko is a low-level Secure Shell (SSH) client library. We can use it to programmatically control connecting to a network device's command-line interface (CLI) over a secure SSH connection. With the library, users send commands a person would normally type and parse the results of each command's execution, also known as screen scraping. The Python script below uses the Paramiko library to query a Cisco Catalyst 3560 switch for its Address Resolution Protocol (ARP) table. It is the first step of a script to identify the switch port where a device is connected.
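The original script is not reproduced here, so the following is a hedged sketch of what such a Paramiko script might look like (the host, credentials, and the exact "show ip arp" output format are assumptions for illustration). The screen-scraping step is just parsing the command's text output:

```python
# Sketch of a Paramiko ARP-table query. The parser is separated from the SSH
# call so it can be exercised on captured output without a live device.
import re

def parse_arp_table(output):
    """Parse Cisco 'show ip arp'-style output into (ip, mac, interface) tuples."""
    entries = []
    for line in output.splitlines():
        m = re.match(
            r"Internet\s+(\d+\.\d+\.\d+\.\d+)\s+[\d-]+\s+"
            r"([0-9a-f]{4}\.[0-9a-f]{4}\.[0-9a-f]{4})\s+ARPA\s+(\S+)",
            line,
        )
        if m:
            entries.append(m.groups())
    return entries

def fetch_arp_table(host, username, password):
    """Connect over SSH and screen-scrape the ARP table from the device."""
    import paramiko  # imported here so the parser is usable without it
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=username, password=password)
    _, stdout, _ = client.exec_command("show ip arp")
    output = stdout.read().decode()
    client.close()
    return parse_arp_table(output)

# Captured sample output (illustrative) lets us test the parser offline.
sample = (
    "Protocol  Address          Age (min)  Hardware Addr   Type   Interface\n"
    "Internet  10.1.1.1                0   0011.2233.4455  ARPA   Vlan1\n"
    "Internet  10.1.1.20              12   aabb.ccdd.eeff  ARPA   Vlan1\n"
)
print(parse_arp_table(sample))
```

Note that some IOS devices only expose an interactive shell, in which case `client.invoke_shell()` and manual prompt handling replace the single `exec_command` call; Netmiko exists largely to hide that difference.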


Artificial Intelligence System Learns the Fundamental Laws of Quantum Mechanics

In Chemistry, AI has become instrumental in predicting the outcomes of experiments or simulations of quantum systems. To achieve this, AI needs to be able to systematically incorporate the fundamental laws of physics. An interdisciplinary team of chemists, physicists, and computer scientists led by the University of Warwick, and including the Technical University of Berlin, and the University of Luxembourg have developed a deep machine learning algorithm that can predict the quantum states of molecules, so-called wave functions, which determine all properties of molecules. The AI achieves this by learning to solve fundamental equations of quantum mechanics as shown in their paper ‘Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions’ published in Nature Communications. Solving these equations in the conventional way requires massive high-performance computing resources (months of computing time) which is typically the bottleneck to the computational design of new purpose-built molecules for medical and industrial applications.


California’s IoT cybersecurity bill: What it gets right and wrong

The most significant issue to be addressed is the law’s ambiguity: it requires all connected devices to have “a reasonable security feature” (appropriate to the nature of the device and the information it collects) that is designed to protect the user’s data from unauthorized access, modification, or disclosure. Beyond that vague prescription, the law only specifically states that each connected device must also come with a unique hard-wired password, or it must otherwise require a user to set their own unique password before using the device. Some experts maintain that meeting the password requirements is all that’s needed to satisfy the regulation; in effect, the password is the “reasonable security feature.” If this interpretation is validated, it’s wholly insufficient for securing the IoT – especially for those connected systems that reside in our appliances, vehicles, and municipal infrastructures.


Facial recognition is real-life ‘Black Mirror’ stuff, Ocasio-Cortez says

Because facial recognition is being used without our consent or knowledge, she suggested, we may be mistakenly accused of a crime and have no idea that the technology has been used as the basis for the accusation. That’s right, the AI Now Institute’s Whittaker said, and there’s evidence that the use of facial recognition is often not disclosed. That lack of disclosure is compounded by our “broken criminal justice system,” Ocasio-Cortez said, where people often aren’t allowed to access the evidence used against them. Case in point: the Willie Lynch case in Florida. A year ago, Lynch, from Jacksonville, Florida, asked to see photos of other potential suspects after being arrested for allegedly selling $50 worth of crack to undercover cops. The police search had relied on facial recognition: the cops had taken poor-quality photos of the drug dealer with a smartphone camera and then sent them to a facial recognition technology expert who matched them to Lynch.


Enterprises spend more on cloud IaaS than on-premises data-center gear

The major segments with the highest growth rates over the decade were virtualization software, Ethernet switches and network security. Server share of the total data center market remained steady, while storage share declined. "The decade has seen a dramatic increase in computer capabilities, increasingly sophisticated enterprise applications and an explosion in the amount of data being generated and processed, pointing to an ever-growing need for data center capacity," said John Dinsdale, chief analyst at Synergy Research Group, in a statement. However, more than half of the servers now being sold are going into cloud providers’ data centers and not those of enterprises, Dinsdale added. "Over the last ten years we have seen a remarkable transformation in the IT market. Enterprises are now spending almost $200 billion per year on buying or accessing data center facilities, but cloud providers have become the main beneficiaries of that spending."


Microsoft opens up Rust-inspired Project Verona programming language on GitHub


As Parkinson explained, Project Verona aims to help secure code in unsafe languages like C and C++ that still exists in a lot of Microsoft's legacy code, which Microsoft can't afford to waste but would like to protect better. "We're going to run some C and C++, stuff we don't trust," Parkinson said at the talk. "We're going to put it in a box and we know there is this region of objects, we have to be very careful with it, but there's a group of things going on there and we can build some pervasive sandboxing there. So there can be sandboxed libraries that we can embed in our sandboxed Verona program." The GitHub page for Project Verona outlines some of the high-level questions the group is working on that will be fleshed out in forthcoming peer-reviewed articles. ... "Project Verona is a research project that is not affecting engineering choices in the company," it states. "The Project Verona team is connected to the people using all the major languages at the company, and want to learn from their experience, so we can research the problems that matter."



Quote for the day:


"Real leadership is being the person others will gladly and confidently follow." -- John C. Maxwell


Daily Tech Digest - October 22, 2019

Agile Development: How to Pick the Most Valuable User Stories

A goal that's big enough to be worthwhile will usually have multiple actors involved -- these correspond to the roles of the standard user story template. In a retail environment, improving customer loyalty would involve not only the customer but also the parts of the company that the customer interacts with (shipping, ordering, marketing). This is the second level of the hierarchy: the actors involved in achieving the goal. For any actor, there are probably multiple "impacts" that need to be achieved. For example, if we want to improve customer loyalty, then we want customers to be more satisfied with us, to order more frequently from us and to buy more stuff when they do order from us. These impacts form the third level of an impact map and correspond to both the needs and reasons portions of the standard user template. This means that an impact isn't a deliverable: "Improving shipping" isn't an impact; instead "improving shipping" is a deliverable that might contribute to achieving that "improved customer satisfaction" impact. A good impact also has a measure associated with it that allows the organization to tell when it's been achieved.
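The hierarchy above, goal, then actors, then impacts with measures, then deliverables, can be sketched as a simple data structure; every name here is illustrative, not from the article:

```python
# Hypothetical impact map for the retail example: each deliverable stays
# traceable to a measured impact, which is what makes prioritization possible.

impact_map = {
    "goal": "Improve customer loyalty",
    "actors": {
        "customer": {
            "impacts": {
                "orders more frequently": {
                    "measure": "orders per customer per quarter",
                    "deliverables": ["one-click reordering"],
                },
                "is more satisfied": {
                    "measure": "customer satisfaction score",
                    "deliverables": ["improved shipping"],
                },
            }
        },
        "shipping": {"impacts": {}},  # actors can exist before impacts are defined
    },
}

def deliverables(imap):
    """Collect every deliverable; each one links back to a measured impact."""
    out = []
    for actor in imap["actors"].values():
        for impact in actor["impacts"].values():
            out.extend(impact["deliverables"])
    return out

print(deliverables(impact_map))
```

Walking the map bottom-up answers the prioritization question directly: a candidate user story that does not attach to any impact's deliverables list has no demonstrated path to the goal.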


The Data Breach Game: The 9 Worst IT Security Practices


Do you have one service account for all of your production servers? Or worse -- I saw this at a client once -- do you have linked servers between all of your database servers, and have those accounts logging in as the system admin? To take this a step further, in your personal life, do you reuse passwords across Web sites? It's not good to do that, even at sites that don't impact your finances. Those small sites are the most likely to get breached, and then the password you used at your favorite cat-grooming message board and your bank is now out there on the dark Web. ... As always, be careful what you click on, especially in e-mail. E-mail is one of our biggest productivity tools, but it's also the biggest security vulnerability in any organization. To be helpful, use a managed e-mail service like Office 365 or commercial Gmail and multifactor authentication (MFA). Bonus points if you don't use text messages for MFA. Finally, make sure that your network is segmented in a way that your CEO opening an e-mail can't infect your domain controllers or database servers.


New mainframe uses: Blockchain, containerized apps

Forrester's research found mainframes continue to be considered a critical piece of infrastructure for the modern business – and not solely to run old technologies. Of course, traditional enterprise applications and workloads remain firmly on the mainframe, with 48% of ERP apps, 45% of finance and accounting apps, 44% of HR management apps, and 43% of ECM apps staying on mainframes. But that's not all. Among survey respondents, 25% said that mobile sites and applications were being put into the mainframe, and 27% said they're running new blockchain initiatives and containerized applications. Blockchain and containerized applications benefit from the integrated security and massive parallelization inherent in a mainframe, Forrester said in its report. "We believe this research challenges popular opinion that mainframe is for legacy," said Brian Klingbeil, executive vice president of technology and strategy at Ensono, in a statement. "Mainframe modernization is giving enterprises not only the ability to continue to run their legacy applications, but also allows them to embrace new technologies such as containerized microservices, blockchain and mobile applications."


Sodinokibi Ransomware Gang Appears to Be Making a Killing

The group behind Sodinokibi appears to have had a head start on its success. While it's not clear what relationship the GandCrab and Sodinokibi gangs might have, researchers report seeing a clear code overlap in their malware. Security firm Secureworks says that based on multiple clues it believes that the threat groups behind GandCrab and Sodinokibi - aka Sodin and REvil - "overlap or are linked." In other words, one or more developers may not have retired with GandCrab, but helped set up a new operation. Like GandCrab, a customized version of Sodinokibi gets supplied to each individual affiliate, who infects systems with the malware and then shares a cut of the proceeds with organizers. Some affiliates appear to be more technically skilled than others. Coveware, a Connecticut-based ransomware incident response firm, says that at least one affiliate group specializes in hacking IT service providers as well as managed security service providers. Doing so enables the affiliate to distribute the ransomware to hundreds or thousands of endpoints managed by the service provider.


IaaS vs. PaaS options on AWS, Azure and Google Cloud Platform


Many early PaaS providers restricted which technologies they supported, and their software tools were compatible only with their own hosting platforms. It was difficult to migrate from one PaaS offering to another, or adapt a PaaS-based development pipeline to run on a generic IaaS instead. As businesses increasingly sought freedom from cloud lock-in, PaaS became more software-agnostic. Open source options, such as Docker containers orchestrated by Kubernetes, replaced some proprietary tooling. As a result, cloud computing vendors that originally specialized in IaaS added PaaS offerings, and increased compatibility with their respective IaaS offerings. For example, some versions of AWS CodePipeline, a continuous delivery service that forms part of a PaaS framework in the AWS cloud, can deploy applications to virtual machines or containers that run on AWS' IaaS.


Top cloud security controls you should be using

The misconfigured WAF apparently had permission to list all the files in any of the AWS data buckets and to read the contents of each file. The misconfiguration allowed the intruder to trick the firewall into relaying requests to a key back-end resource on AWS, according to the Krebs On Security blog. That resource “is responsible for handing out temporary information to a cloud server, including current credentials sent from a security service to access any resource in the cloud to which that server has access,” the blog explained. The breach affected about 100 million US citizens, compromised about 140,000 Social Security numbers and 80,000 bank account numbers, and could eventually cost Capital One up to $150 million. ... “The challenge exists not in the security of the cloud itself, but in the policies and technologies for security and control of the technology,” according to Gartner. “In nearly all cases, it is the user, not the cloud provider, who fails to manage the controls used to protect an organization’s data.” Gartner adds that “CIOs must change their line of questioning from ‘Is the cloud secure?’ to ‘Am I using the cloud securely?’”
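One control relevant to this class of attack (a minimal sketch of the general defense, not Capital One's actual fix) is to refuse to relay server-side requests to the cloud metadata service, the link-local address that an SSRF attack abuses to harvest temporary credentials:

```python
import ipaddress
from urllib.parse import urlparse

# Address ranges that host cloud instance-metadata services and other
# internal endpoints. AWS's metadata service lives at 169.254.169.254;
# a relayed request to it can return temporary IAM credentials.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("169.254.0.0/16"),  # link-local / metadata
    ipaddress.ip_network("127.0.0.0/8"),     # loopback
]

def is_safe_target(url: str) -> bool:
    """Return False if the URL points at a blocked address range.

    Illustration only: a production check must also resolve hostnames
    to IP addresses before testing, and re-check after every redirect.
    """
    host = urlparse(url).hostname
    if host is None:
        return False
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        # Hostname rather than a literal IP; resolve it before
        # deciding (DNS resolution omitted in this sketch).
        return True
    return not any(addr in net for net in BLOCKED_NETWORKS)

print(is_safe_target("http://169.254.169.254/latest/meta-data/"))  # False
print(is_safe_target("http://93.184.216.34/"))                     # True
```

AWS has since introduced IMDSv2, which requires a session token before the metadata service will respond, closing off the simple one-shot SSRF path this check guards against.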


Smart Cities are Made Smart by Planning and Strategy

Sleman Saliba underscored that the real “smart” piece of smart cities comes from the data and insights generated by making those connections. “The important stuff is not only on one premise but really interconnections between companies, between cities, between buildings, and leveraging this information that you get by using technologies across companies and factors,” he adds. Markus John provided a real-world example of where this approach is making a difference, citing the work of city officials in Mannheim, Germany to take 180 hectares of prime land in the city (made available by the 2013 departure of the US Army) and use it to accelerate the city's development as a Smart City. He said that communities making this kind of change have a unique opportunity, both in the center of the city and, in particular, its suburbs. “Communities are thinking about how can we manage it smartly - in a modern way?” he said. “It’s not just about delivering power from outside, but about how to make this suburb intelligent, in the way of buffering energy with batteries, maybe solar panels on the roof, optimizing energy consumption in the suburbs. ...”


Why compliance concerns are pushing more big companies to the cloud

In the current climate, and looking into the future, we are seeing an acceleration of workloads moving from on-prem(ises) infrastructure and on-prem applications into the cloud. That trend is clearly established; I don't think it's going to change, it's only accelerating. But one thing that might appear a little counterintuitive: on the classical adoption bell curve, from early adopters to mainstream before it starts tailing off, we're seeing that adoption has gone past the early adopters and is now mainstream. And one of the interesting trends I'm seeing is that two issues are popping up together. On one side there's an inexorable push to move your workloads to the cloud, your data to the cloud, your applications to the cloud, for all the reasons the cloud has become popular: nobody wants to manage hardware or maintenance, and there's the capital efficiency of the CAPEX-to-OPEX transformation. But along with that, there's an almost exponential rise in concern around data security, data privacy, and regulatory compliance.


An indication of this trend is that European startups are struggling to transform into $1 billion-valued unicorn companies. While a few exceptions share the spotlight, overall this upscaling happens at only half the rate seen in the US. Beyond a lack of funding, this also comes down to the fact that Europe is made up of many distinct countries and that, despite its efforts to unite around a single market, fragmentation is still part of its identity. In this context, companies face the challenge of scaling up across a continent with many different national regulations and structures – a task far more complex than in large homogeneous markets like China or the US. And so, in the past 20 years, Europe's share of "superstar companies" – the top 10% of companies with more than $1 billion in annual revenue – has all but halved. But while the effects of fragmentation are evident, Hjartar explained, Europe could leverage its national differences and use them as a strength. "We have pockets of leaders spread out across the continent," he said. "We have 5.7 million software developers – that compares to 4.4 million in the US. We have all the building blocks to be successful, but now the biggest hurdle is to link them together with an ambitious vision."


The answer is, you can’t — at least, not all at once. It’s next to impossible to pull a giant organization, with hundreds of ingrained processes, divisions, and stakeholders, into a new digital and competitive landscape in one go. Many companies facing this challenge today invest deeply in research and development, thinking that knowledge alone will help right their course. But there is no correlation between increased spending on research and development and stellar performance. It’s not surprising, then, that you feel overwhelmed: you feel you’re on the precipice of profound change, yet you simply can’t move your company in a way that will achieve your goals or realize your vision. The secret to overcoming these challenges lies in the humble compass. Invented in the second century BC, the compass offers principles of directional guidance that can be used today to create a road map to the future, one that breaks choices down into actionable, impactful steps and delivers tangible results.



Quote for the day:


"A leader should demonstrate his thoughts and opinions through his actions, not through his words." -- Jack Weatherford