
Daily Tech Digest - February 12, 2025


Quote for the day:

“If you don’t have a competitive advantage, don’t compete.” -- Jack Welch


Security Is Blocking AI Adoption: Is BYOC the Answer?

Enterprises face unique hurdles in adopting AI at scale. Sensitive data must remain within secure, controlled environments, avoiding public networks or shared infrastructures. Traditional SaaS models often fail to meet these stringent data sovereignty and compliance demands. Beyond this, organizations require granular control, comprehensive auditing and full transparency to trace every AI decision and data access. This ensures vendors cannot interact with sensitive data without explicit approval and documentation. These unmet needs create a significant gap, preventing regulated industries from deploying AI solutions while maintaining compliance and security. ... The concept of Bring Your Own Cloud (BYOC) isn’t new. It emerged as a middle ground between traditional SaaS and on-premises deployments, promising to combine the best of both worlds: the convenience of managed services with the control and security of on-premises infrastructure. However, its history in the industry has been marked by both successes and cautionary tales. Early BYOC implementations often failed to live up to their promises. Some vendors merely deployed their software into customer cloud accounts without proper architectural planning, resulting in what were essentially remotely managed on-premises environments.


The Importance of Continuing Education in Data and Tech

Continuing education plays a vital role in workforce development and career advancement within the tech industries, where rapid technological advancements and evolving market demands necessitate a culture of lifelong learning. As businesses increasingly rely on sophisticated data analytics, artificial intelligence (AI), and cloud technologies, professionals in these fields must continuously update their skills to remain competitive. Continuing education offers a pathway for individuals to acquire new capabilities, adapt to emerging technologies, and gain proficiency in specialized areas that are in high demand. By engaging in ongoing learning opportunities, tech professionals can enhance their expertise, making them more valuable to their current employers and more attractive to potential future ones. ... Professional certifications and competency-based education have become significant avenues for career advancement in the data and tech field. As the landscape of technology rapidly evolves, organizations increasingly seek professionals who possess validated skills and up-to-date knowledge. Professional certifications serve as tangible proof of one’s expertise in specific areas such as data governance, analytics, cybersecurity, or cloud computing. These certifications, offered by leading industry bodies and tech companies, are designed to align with current industry standards and demands.


Agents, shadow AI and AI factories: Making sense of it all in 2025

“Agentic AI” promises “digital agents” that learn from us, and can perceive, reason problems out in multiple steps and then make autonomous decisions on our behalf. They can solve multilayered questions that require them to interact with many other agents, formulate answers and take actions. Consider forecasting agents in the supply chain predicting customer needs by engaging customer service agents, and then proactively adjusting warehouse stock by engaging inventory agents. Every knowledge worker will find themselves gaining these superhuman capabilities backed by a team of domain-specific task agent workers helping them tackle large complex jobs with less expended effort. ... However, the proliferation of generative, and soon agentic AI, presents a growing problem for IT teams. Maybe you’re familiar with “shadow IT,” where individual departments or users procure their own resources, without IT knowing. In today’s world we have “shadow AI,” and it’s hitting businesses on two fronts. ... Today’s enterprises create value through insights and answers driven by intelligence, setting them apart from their competitors. Just as past industrial revolutions transformed industries — think about steam, electricity, internet and later computer software — the age of AI heralds a new era where the production of intelligence is the core engine of every business. 
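The supply-chain handoff described above can be sketched in a few lines; the agents, demand figures, and buffer policy below are illustrative assumptions, not any vendor's framework.

```python
# Illustrative multi-agent flow: a forecasting agent consults a customer-service
# agent for demand signals, then delegates a stock adjustment to an inventory
# agent. All names, numbers, and rules are hypothetical.

def customer_service_agent() -> dict:
    """Reports demand signals gathered from customer interactions."""
    return {"widget": 120, "gadget": 40}

def inventory_agent(current: dict, demand: dict) -> dict:
    """Adjusts warehouse stock toward forecast demand plus a safety buffer."""
    buffer = 1.2  # keep 20% extra stock (assumed policy)
    return {item: max(current.get(item, 0), round(qty * buffer))
            for item, qty in demand.items()}

def forecasting_agent(current_stock: dict) -> dict:
    """Orchestrates: gather demand signals, then delegate the adjustment."""
    demand = customer_service_agent()
    return inventory_agent(current_stock, demand)

new_stock = forecasting_agent({"widget": 100, "gadget": 60})
print(new_stock)  # {'widget': 144, 'gadget': 60}
```

The point of the sketch is the delegation chain: each agent owns one decision and exposes it to the others, which is what lets the forecasting agent "engage" the rest.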


Is VMware really becoming the new mainframe?

“CIOs can start to unwind their dependence on VMware,” he says. “But they need to know it may not have any material reduction in their spend with Broadcom over multiple renewals. They’re going to have to get completely off Broadcom.” Still, Warrilow recommends that CIOs running VMware consider alternatives over the long term. They should also look for exit strategies for other market-dominant IT products they use, given that Broadcom has seen early success with VMware, he says. “The cautionary tale for CIOs is that this is just the beginning,” he says. “Every tech investment firm is going to be saying, ‘I want what Broadcom has with their share price.’” ... “The comparison works a bit, maybe from a stickiness perspective, because customers have built their applications and workload using virtualization technology on VMware,” he says. “When they have to do a mass refactoring of applications, it’s very, very hard.” But the analogy has its limitations because many users think of mainframes as a legacy technology, while VMware’s cloud-based products address future challenges, he adds. “The cloud is the future for running your AI workload,” Shenoy says. “Customers have trusted us for the last 20 to 25 years to run their business-critical applications, and the interesting part right now is we are seeing a lot of growth of these AI workloads and container workloads running on VMware.”


Deep Learning – a Necessity

It is essential in architecture that we realize that a skill set is not an arbitrary thing. It isn’t learn one skill and you are done. It also isn’t learn any skill from any background and you’re in. It is the application of all of the identified and necessary skills combined that makes a distinguished architect. It is also important to understand the purpose and context of mastery. Working in a startup is very different from working in a large corporation. Industry can change things significantly as well. Always remember that the profession’s purpose has to be paramount in the learning. For example, both doctors and lawyers have to deal with clients and need human interaction skills to be successful. Yet, the nature and implementation of these differ drastically. We will explore this point in a further article. However, do not underestimate the impact of changing the meaning of the profession while claiming similar skills. The current environment is rife with this kind of co-opting of the terminology and tools to alter the whole purpose of architecture fundamentally. ... In medicine and other professions, an individual studies and practices for 7+ years to become fully independent, and they never stop learning. This learning is tracked by both mentors and the profession. Because medicine is so essential to humans it is important that professionals are measured and constantly update and hone their competencies.


Crawl, then walk, before you run with AI agents, experts recommend

The best bet for percolating AI agents throughout the organization is to keep things as simple as possible. "Companies and employees that have already found ways to operationalize intelligent agents for simple tasks are best placed to exploit the next wave with agentic AI," said Benjamin Lee, professor of computer and information science at the University of Pennsylvania. "These employees would already be engaging generative AI for simple tasks and they would be manually breaking complex tasks into simpler tasks for the AI. Such employees would already be seeing productivity gains from using generative AI for these simple tasks." Rowan agreed that enterprises should adopt a crawl, walk, run approach: "Begin with a pilot program to explore the potential of multiagent systems in a controlled, measurable environment." "Most people say AI is at the toddler stage, whereas agentic AI is like a tween," said Ben Sapp, global practice lead of intelligence at Digital.ai. "It's functional and knows how to execute certain functions." Enterprises and their technology teams "should socialize the use of generative AI for simple tasks within their organizations," Lee continued. "They should have strategies for breaking complex tasks into simpler ones so that, when intelligent agents become a reality, the sources of productivity gains are transparent, easily understood, and trusted."


Growth of digital wallet use shaking up payment regulations and benefits delivery

Australian banks are calling on the government to pass legislation that accommodates payments with digital wallets within the country’s regulatory framework. A release from the Australian Banking Association (ABA) argues that with the country’s residents making $20 billion worth of payments across 500 million transactions each month with mobile wallets, all players within the payment ecosystem should be under the remit of the Reserve Bank of Australia. ... Digital wallets are by far the most popular method of making cross-border payments, according to a new report from Payments Cards & Mobile. The How Digital Wallets Are Transforming Cross-Border Transactions report shows digital wallets are chosen for international transactions by 42.1 percent of people. That makes them more popular than the next two most popular methods, money transfer services (16.8 percent) and bank accounts (14.8 percent), combined. Transactions with digital wallets are much faster than wire transfers, are available to people who don’t possess bank accounts, and have lower fees than bank transfers, the report says. Interoperability remains a challenge, and regulations and infrastructure limitations could pose barriers to adoption, but the report authors only expect the dominance of digital wallets to increase in the years ahead.


My vision is to create a digital twin of our entire operations, from design and manufacturing to products and customers

We approach this transformation from three dimensions. First is empathy – truly understanding not just who our customers are, but their emotions. This is where the concept of creating a ‘digital twin’ of the customer comes in. Second is innovation – not just adopting new technologies but ensuring that our processes are lean, digitised, and seamless throughout the customer journey, from research to purchase, service, and brand loyalty. The goal is to provide a consistent and empathetic experience across all touchpoints.  ... The first challenge is identifying our customers. For example, if a distributor in one business also buys from another or if a consumer connects with one of our industrial projects, it’s hard to track. To address this, we launched a customer UID project, which has been in progress for months. It helps us identify customers across channels while keeping an eye on privacy and adhering to upcoming data protection regulations. The second part involves gathering all customer-related data in one place. Over the past three years, we unified all customer interactions into a single platform with a one CRM strategy, which was complex but essential. Now, with AI solutions like social listening combined with sentiment analysis, we can understand what our customers are saying about us and where we need to improve, both in India and globally. 
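A customer UID effort like the one described typically starts with identity resolution: linking records from different channels that share a stable key. A minimal sketch, with hypothetical field names and data:

```python
# Toy identity resolution: assign one UID to records that share an email or
# phone number across channels. Field names and data are hypothetical.
import uuid

records = [
    {"channel": "retail",      "email": "a@x.com", "phone": "111"},
    {"channel": "distributor", "email": "a@x.com", "phone": None},
    {"channel": "projects",    "email": None,      "phone": "222"},
]

key_to_uid: dict[str, str] = {}
for rec in records:
    keys = [k for k in (rec["email"], rec["phone"]) if k]
    # Reuse an existing UID if any key is already known, else mint a new one.
    uid = next((key_to_uid[k] for k in keys if k in key_to_uid), None) or str(uuid.uuid4())
    for k in keys:
        key_to_uid[k] = uid
    rec["uid"] = uid

assert records[0]["uid"] == records[1]["uid"]  # matched on shared email
assert records[2]["uid"] != records[0]["uid"]  # no shared key, distinct customer
```

Real projects layer fuzzy matching and consent handling on top, but the core is this key-to-UID mapping.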


Will AI Chip Supply Dry Up and Turn Your Project Into a Costly Monster?

CIOs and other IT leaders face tremendous pressure to quickly develop GenAI strategies in the face of a potential supply shortage. Given the cost of individual units, spending can easily reach into the multi-million-dollar range. But it wouldn’t be the first time companies have dealt with semiconductor shortages. During the COVID-19 pandemic, a spike in PC demand for remote work combined with global shipping disruptions to create a chip drought that impacted everything from refrigerators to automobiles and PCs. “One thing we learned was the importance of supply chain resiliency, not being overly dependent on any one supplier and understanding what your alternatives are,” Hoecker says. “When we work with clients to make sure they have a more resilient supply chain, we consider a few things … One is making sure they rethink how much inventory do they want to keep for their most critical components so they can survive any potential shocks.” She adds, “Another is geographic resiliency, or understanding where your components come from and do you feel like you’re overly exposed to any one supplier or any one geography.” Nvidia’s GPUs, she notes, are harder to find alternatives for -- but other chips do have alternatives. “There are other places where you can dual-source or find more resiliency in your marketplace.”


WTF? Why the cybersecurity sector is overrun with acronyms

Imagine an organization is in the midst of a massive hack or security breach, and employees or clients are having to Google frantically to translate company emails, memos or crisis plans, slowing down the response. When these acronyms inevitably migrate into a cybersecurity company’s external marketing or communications efforts, they’re almost guaranteed to cause the general public to tune out news about issues and innovations that could have a far-reaching impact on how people live their lives and conduct their businesses. This is especially true as artificial intelligence (AI!) and machine learning (ML!) technologies expand and new acronyms emerge to keep pace with developments. Acronyms can also have unfortunate real-life connotations — point of sale, to name just one example. When shortened to POS, it can suggest something is… well, crappy. ... So, what’s behind the tendency to shorten terms to a jumble of often incomprehensible acronyms and abbreviations? “On the one hand, acronyms, abbreviations and jargon are used to achieve brevity, standardization and efficiency in communication, so if a profession is steeped in complex and technical language, it will likely be flowing with acronyms,” says Ian P. McCarthy, a professor of innovation and operations management at Simon Fraser University in Burnaby, British Columbia.

Daily Tech Digest - November 06, 2022

Best Practices for Enterprise Application Architecture

The iterative approach is a more practical method of building enterprise application architectures, where you start small and build out your architecture in small, incremental steps. This approach is particularly useful for enterprises with limited resources that can’t afford to build a full-scale architecture from scratch. Instead of starting with a full-scale architecture, design and implement a series of smaller “proof-of-concept” applications that prove the feasibility of your ideas. Once these applications are ready, you can scale them into an enterprise-level solution. ... The Agile adoption process is a critical step for any enterprise application, and the implementation of agile methodology can be daunting for organizations that have not done it before. However, there are many benefits to adopting a more agile development approach, which includes delivering software faster and at less cost. ... EA governance is the process involved in managing and maintaining an EA. This includes identifying and defining an EA’s goals, objectives, and key performance indicators (KPIs). It also involves establishing a governance framework that supports the EA’s development, management, and maintenance.


Technology leader shall be open to accepting changes

As systems evolve over time, so does their complexity. Maintaining such systems or components will always be a challenge. Since people will move on, there should be an inventory of all systems and components, along with actively maintained documentation – a practice many organisations don’t follow. Maintainability should be a key consideration while designing and building systems. ... Technology leaders must demonstrate consistent delivery of high-quality services, which necessitates the implementation of appropriate systems and processes. Simultaneously, such systems and processes must not be a barrier to adapting to a changing business and technology ecosystem. ... Like GDPR, many countries are coming up with regulations for data privacy and cyber security requirements. Coping with such demands is necessary, but it is difficult because it is complex, dynamic, and ever-changing. To add to that, establishing a return on investment for security solutions is very challenging.


7 hard truths of business-IT alignment

Some CIOs treat IT-business alignment as their own responsibility. That’s a mistake, experts say. “The leadership team below the CIO also needs to be customer facing. It needs to be able to help solve problems. It shouldn’t just be going away and writing code,” Pettinato says. “To make it scalable, you need to take it beyond just one individual.” “I clearly can’t, myself, be involved in every organizational conversation,” Barchi says. “That is where trusting your team helps. There are many leaders on my team who are in these meetings day in and day out, solving problems in real time.” In fact, he says, creating a team that’s capable of doing this is the most important part of his job. “As I’ve grown as a leader, I’ve recognized that my contribution is not my own technical skill and my ability to make decisions. It’s my ability to create a team that can do all of that,” Barchi says. “I think CIOs do well when we know that our job is not to be involved in every technology decision — it’s to create the environment where that can happen and create a team that can do that.”


Teaching is complicated. But technology in the classroom doesn't have to be

When the pandemic forced schoolchildren to learn from home and adapt to digital learning, educators lost their students' attention, and learning suffered. But once schools opened up, digital learning didn't disappear; for many, it became the norm. Seage says technology should never be the driver of the classroom. "The technology has to complement what you do. It complements all different teaching styles," he says. As a former student, Seage recalls the difficulty teachers faced in finding novel ways to engage students and wanted to offer a solution. Interactive whiteboards offer a low learning curve for teachers and students while also promoting interaction and collaboration, he says. In the school's gymnasium, for instance, boards serve as an enhanced coaching tool, allowing coaches and players to re-watch game footage during practice, or strategize game plays for future matches. Micah Shippee, director of education technology consulting and solutions at Samsung, is a former educator who now works with schools to adopt Samsung technology.


How to Choose the Best Software Architecture for Your Enterprise App?

Patterns in architecture are reusable solutions to common design problems. Their structure makes it easier to reuse code and keeps applications maintainable for longer. Beyond being scalable, flexible, and easy to maintain, the software must be able to handle a wide range of requests without degrading. Overcomplicating the software works against these goals, because complexity makes it less likely that people will adopt the software and use it well. The architecture therefore needs to be flexible enough to adapt to each user’s needs. ... Event-driven architecture (EDA) is a way of building software that uses events to pass messages between modules. It breaks an application into small, independently runnable modules that share data with a small number of other modules using standard protocols. The two most important roles in an EDA system are the event producer and the event consumer: the producer emits an event when something happens, and the consumer watches for that event and reacts to it.
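The producer/consumer relationship can be shown with a minimal in-process event bus; real EDA systems route events through a broker, but the roles are the same (class and event names here are illustrative):

```python
# Minimal in-process event bus: producers publish named events, consumers
# subscribe with handlers. Production EDA uses a broker (Kafka, RabbitMQ, ...),
# but the producer/consumer roles look the same.
from collections import defaultdict
from typing import Callable

class EventBus:
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[dict], None]) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: dict) -> None:
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("order.created", lambda p: received.append(p["id"]))  # consumer
bus.publish("order.created", {"id": 42})                            # producer
print(received)  # [42]
```

Note the decoupling: the producer never names the consumer; modules only agree on the event name and payload shape.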


Is there a cyber conflict happening behind the scenes?

There’s a global digital dependency happening right now, accelerated even further by the pandemic driving a need for remote services in nearly every industry. While this adaptation is an overall benefit to progressive societies, it opens new and innovative ways for cyber attackers to target organizations and consumers alike. Even those who aren’t connected are inadvertently impacted by the digital world and cyberattacks, which has people around the world asking: is there a cyber battle going on? ... At the beginning of the Russian-Ukrainian conflict earlier this year, Russians attacked a satellite provider in Ukraine, affecting countries including Germany and France and bricking edge devices across the continent. This affected both civilian and military communication, hindering war efforts on the Ukraine side and evacuation efforts for fleeing citizens. These attacks aren’t just being carried out by high-level nation-state actors, they’re also being carried out by hacktivists and volunteers. Even simple distributed denial-of-service (DDoS) attacks can generate damage with the right amount of devices. 


IT Ops 4.0: Operations Architecture For The Industry Automation Age

A structured communication effort is essential to gaining support and motivating employees. Every organization has its own way of getting started, but most approaches involve at least three elements. First, engage people in alignment with the vision: be transparent about where we are (or where we started), where we’re heading, and how we’re getting there as a team, to demonstrate the value of transformation for both the organization and its employees. Second, let employees experience the problems, challenges, and innovative approaches first-hand. Employees gain a better understanding by learning from the companies leading the way; tours enable them to think outside the box, hear stories of change, and discuss the challenges that come with it. Third, build momentum. When people know that change is coming and understand why it is happening, they tend to want to learn as much as possible about it, so employers should make it as easy as possible to access information and resources on new technologies and approaches.


Why Wasm is the future of cloud computing

Wasm is already very capable. And with the new technologies and standards that are on the way, Wasm will let you do even more. ... WASI will provide a standard set of APIs and services that can be used when Wasm modules are running on the server. Many standard proposals are still in progress, such as garbage collection, network I/O, and threading, so you can’t always map the things that you’re doing in other programming languages to Wasm. But eventually, WASI aims to provide a full standard that will help to achieve that. In many ways, the goals of WASI are similar to POSIX. Wasm as it now stands also doesn’t address the ability to link or communicate with other Wasm modules. But the Wasm community, with support from members of the computing industry, is working on the creation of something called the component model. This aims to create a dynamic linking infrastructure around Wasm modules, defining how components start up and communicate with each other (similar to a traditional OS’s process model).


Making the case for security operation automation

Security teams must be able to scale operations to deal with the increasing volume of everything coming at them. Faced with a global cybersecurity skills shortage, CISOs need alternatives to hiring their way out of this quagmire. ... When it comes to security operations process automation, one might equate this activity with security orchestration, automation, and response (SOAR) technology. In some cases, this is a correct assumption, as 37% of organizations use some type of commercial SOAR tools. Interestingly, more than half (53%) of organizations eschew SOAR, using security operations process automation functionality within other security technologies instead – security information and event management (SIEM), threat intelligence platforms (TIPs), IT operations tools, or extended detection and response (XDR), for example. Those organizations using SOAR admit that it is no day at the beach – 80% agree that using SOAR was more complex and time consuming than they anticipated. Technology aside, security professionals acknowledge that there are a few major impediments to security operations process automation. 
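The kind of step this automation handles, whether inside a SOAR platform or built into a SIEM or XDR tool, can be sketched as alert enrichment and triage; the intel feed, rules, and statuses below are stand-ins, not any product's API:

```python
# Sketch of an automated triage step: enrich raw alerts with threat intel and
# auto-close known-benign ones, so analysts only see what needs judgment.
# The intel feed and rules are hypothetical stand-ins.
KNOWN_BENIGN_IPS = {"10.0.0.5"}           # e.g. an internal vulnerability scanner
THREAT_INTEL = {"203.0.113.9": "botnet"}  # hypothetical intel feed lookup

def triage(alert: dict) -> dict:
    ip = alert["src_ip"]
    alert["intel"] = THREAT_INTEL.get(ip)  # enrichment
    if ip in KNOWN_BENIGN_IPS:
        alert["status"] = "auto-closed"
    elif alert["intel"]:
        alert["status"] = "escalated"
    else:
        alert["status"] = "analyst-review"
    return alert

alerts = [{"src_ip": "10.0.0.5"}, {"src_ip": "203.0.113.9"}, {"src_ip": "198.51.100.7"}]
print([triage(a)["status"] for a in alerts])
# ['auto-closed', 'escalated', 'analyst-review']
```

Even this trivial rule set shows why the ESG numbers above make sense: the logic fits naturally inside a SIEM or XDR pipeline without a dedicated SOAR product.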


Gender has no bearing on your abilities in tech industry

According to statistical information, there is clearly not an equal representation of women in technology. ... The first is stereotyping and conscious and unconscious bias, which occur when people believe that being a woman may have a negative impact on performance, intelligence, or aptitude. I believe it began when women were not encouraged to pursue STEM courses – Science, Technology, Engineering and Math. Nowadays, there are concerted efforts and interventions to encourage women to pursue STEM degrees. Secondly, there aren’t enough role models, advocates, and people who are challenging the status quo, although overall things have improved significantly in recent years. I do not recall knowing any Nigerian woman in Data Analytics or Business Intelligence when I started my career. I never met them or heard about them. I seriously doubt they existed at the time, which says a lot. The lack of role models at the time was a major factor, but I am glad things are improving now.



Quote for the day:

"Setting an example is not the main means of influencing others, it is the only means." -- Albert Einstein

Daily Tech Digest - January 12, 2021

What industries need to avoid transformation limitations?

Already in 2020, we’ve seen dramatic change thanks to changing consumer habits, a year of online-shopping, a variety of item delivery, pick-up and return models, and store closures. These changes show no signs of slowing down in the years ahead. Likewise, another two industries that are going to undergo a sustained period of innovation-led change are the insurance and transportation industries, respectively. All three will be absorbed by broader, horizontal ecosystems. Although this change will be dramatic and may cause some unrest at first, ultimately, it will mean happier and more loyal customers and corporate leaders who are not under constant strain to reimagine the business. This change is just the tip of the iceberg. Today’s successful CEO would be wise to look at this trio of disappearing industries as canaries in the mineshaft. The evolution from vertically-oriented industries to horizontal ecosystems, constructed from a complex value chain of partners, has begun. Transportation, insurance, and retail represent the three first industries changing at a faster pace than other verticals. Any number of sweeping technological breakthroughs — artificial intelligence (AI), blockchain, the internet of things, and the data-crunching power of advanced analytics — will have a similar impact on other industries.


Addressing the lack of knowledge around pen testing

Pen testing will only be truly effective if it is implemented with the right processes, including both preparation and follow-up. Before carrying out the test, it is important to have the scope and boundaries thoroughly documented. This includes safeguards and processes to cover any issues that might result in discovery, particularly when social engineering and physical security are involved. We provide our team with Get Out of Jail Free cards that explain their purpose and who to contact at the business to avoid a scenario like the Iowa arrest. However, while someone at the organization must be aware of everything the pen testers may be doing, it would be ideal that as few people as possible know about it. It’s also important to have a clear strategy for following up once the pen test results are in. Organizations are often fixated on the number of issues a pen test uncovers (usually a greater number than they were expecting). This information alone is useless, and priority should be given to implementing a plan of action to close those gaps. Given the huge variation of potential threats, the results of a pen test can feel overwhelming and dispiriting. 


Data Science Learning Roadmap for 2021

A significant part of data science work is centered around finding apt data that can help you solve your problem. You can collect data from different legitimate sources — scraping (if the website allows), APIs, databases, and publicly available repositories. Once you have data in hand, an analyst will often find themselves cleaning dataframes, working with multi-dimensional arrays, using descriptive/scientific computations, and manipulating dataframes to aggregate data. Data are rarely clean and formatted for use in the “real world”. Pandas and NumPy are the two libraries that are at your disposal to go from dirty data to ready-to-analyze data. ... Data engineering underpins the R&D teams by making clean data accessible to research engineers and scientists at big data-driven firms. It is a field in itself and you may decide to skip this part if you want to focus on just the statistical algorithm side of the problems. Responsibilities of a data engineer comprise building an efficient data architecture, streamlining data processing, and maintaining large-scale data systems. Engineers use Shell (CLI), SQL, and Python/Scala to create ETL pipelines, automate file system tasks, and optimize database operations to make them high-performance.
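The dirty-data-to-aggregate workflow described above looks roughly like this; in practice you would reach for Pandas and NumPy, but a standard-library sketch with made-up records shows the shape of the work:

```python
# Clean messy records (normalize text, cast types, drop missing values), then
# aggregate. In practice pandas/NumPy do this at scale; the data is invented.
from statistics import mean

raw = [
    {"city": "Pune ", "temp": "31.4"},
    {"city": "pune",  "temp": "33"},
    {"city": "Delhi", "temp": None},   # missing value, dropped below
    {"city": "Delhi", "temp": "29.0"},
]

# Clean: strip/normalize the key column, cast strings to floats, drop nulls.
clean = [{"city": r["city"].strip().title(), "temp": float(r["temp"])}
         for r in raw if r["temp"] is not None]

# Aggregate: mean temperature per city.
summary = {c: round(mean(r["temp"] for r in clean if r["city"] == c), 1)
           for c in {r["city"] for r in clean}}
print(summary)  # {'Pune': 32.2, 'Delhi': 29.0} (key order may vary)
```

With Pandas the same pipeline collapses to a `str.strip().str.title()`, an `astype(float)` after `dropna()`, and a `groupby("city").mean()`.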


Donkey: A Highly-Performant HTTP Stack for Clojure

Clojure makes writing concurrent applications easy. It frees the developer from the implications of sharing state between threads. It does so by using immutable data structures, as described by Rich Hickey in his talk The Value of Values: If you have a value, if you have an immutable thing, can you give that to somebody else and not worry? Yes, you don't have to worry. Do they have to worry about you now because they both now refer to the same value? Anybody have to worry? No. Values can be shared. Because all objects are immutable, they can be concurrently accessed from multiple threads without fear of lock contention, race conditions, the need for careful synchronization, and all the other “fun” stuff that makes writing concurrent programs so difficult to get right. The downside is that every mutating operation produces a new object with an updated state. An inefficient implementation would cause a great deal of CPU time to be wasted on copying and creating new objects and, as a result, longer and more frequent GC cycles. Fortunately, Clojure uses a Hash Array Mapped Trie (HAMT) to model its data structures internally. By sharing structures that do not change, and copying only what does, it maintains immutability and thread-safety - and does so at a minimal cost.
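Structural sharing can be shown in miniature with a persistent linked list: "updating" builds one new cell and shares the entire tail with the old version, which is the same trick Clojure's HAMT plays at tree scale (a toy sketch, not Clojure's actual implementation):

```python
# Persistent cons-list: prepending never mutates, it allocates one cell and
# shares the rest. Two "versions" of the list reuse the same tail in memory.
from typing import Any, Optional

class Cell:
    __slots__ = ("head", "tail")
    def __init__(self, head: Any, tail: Optional["Cell"]):
        self.head, self.tail = head, tail

def cons(value: Any, lst: Optional[Cell]) -> Cell:
    return Cell(value, lst)

base = cons(2, cons(3, None))   # the list (2 3)
v1 = cons(1, base)              # the list (1 2 3)
v2 = cons(9, base)              # the list (9 2 3); base is untouched

assert v1.tail is v2.tail is base   # both versions share the same tail object
assert v1.head == 1 and v2.head == 9
```

Both versions can be read from any thread without locks, because nothing reachable from either is ever written again; a HAMT generalizes this by sharing all unchanged branches of a wide trie and copying only the path to the changed leaf.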


The UK’s struggle with digital schooling

“There is a huge digital divide and it is getting worse with schools being shut down due to Covid-19. Teachers and school leaders are trying their best to continue with online teaching by providing resources, virtual check-ins and recorded lessons,” said EdTech adviser and consultant Joysy John, who added that many children cannot access these services due to a lack of technology or connectivity. “There are many new initiatives like Oak National Academy, National Tutoring Programme and free resources from Edtech companies, but these benefit those who already have digital access. So the digital divide is going to get wider unless the government thinks of a more holistic approach and provides disadvantaged parents with additional financial and educational support.” Once the lockdown was announced, education secretary Gavin Williamson outlined a number of plans for remote education, including the mandate for schools to provide a set number of hours of “high-quality remote education for pupils”. This is of no help to those without access to online learning, so the government has tried to address the digital divide causing disparity in home schooling during pandemic lockdowns by giving laptops to those from under-privileged backgrounds – something it began doing in the UK’s first lockdown.


SolarWinds Hack Lessons Learned: Finding the Next Supply Chain Attack

It is interesting to note that FireEye's initial detection of the SolarWinds compromise didn't find complex lateral movement, or even data exfiltration. What triggered FireEye's deeper investigation, according to reports, was an unusual remote user login from a previously unknown device with an IP address in a suspect location. It was only upon further review that FireEye discovered the intrusion and ultimately traced it back to SolarWinds. This scenario, now all too real for thousands of enterprises around the world, underscores the importance -- if not necessity -- of having behavioral analytics as a key component of contemporary enterprise cybersecurity product architectures. Behavioral analytics supercharges threat detection by not only analyzing event input based on activity from users and devices, but also by using machine learning, statistical analysis and behavioral modeling to correlate and enrich events. World-class behavioral analytics technology can factor in a wide variety of data points -- such as peer groups, IP association, personal email addresses, and kinetic identifiers like badge reader activity -- to identify a malicious intrusion by stitching together a half dozen or more events that, by themselves, would seem benign.
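As a toy illustration of the first step in that pipeline — flagging a login from a previously unseen device or location for a given user, the very signal that reportedly triggered FireEye's investigation — here is a hedged Python sketch. The event fields and scoring rule are invented for illustration; real behavioral analytics products enrich this with peer groups, statistical baselines, and machine-learned models.

```python
from collections import defaultdict

class LoginBaseline:
    """Tracks (device, country) pairs seen per user and scores new logins."""

    def __init__(self) -> None:
        self.seen = defaultdict(set)  # user -> set of (device, country) pairs

    def score(self, user: str, device: str, country: str) -> int:
        """Return 0 (familiar), 1 (new device OR location), or 2 (both new)."""
        risk = 0
        if device not in {d for d, _ in self.seen[user]}:
            risk += 1
        if country not in {c for _, c in self.seen[user]}:
            risk += 1
        self.seen[user].add((device, country))  # update the baseline
        return risk

baseline = LoginBaseline()
assert baseline.score("alice", "laptop-1", "US") == 2   # first ever login: all new
assert baseline.score("alice", "laptop-1", "US") == 0   # familiar pattern
assert baseline.score("alice", "unknown-pc", "RU") == 2 # the FireEye-style trigger
```

A single score of 2 would rarely justify an alert on its own; the article's point is that stitching a half dozen such weak signals together is what surfaces the intrusion.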


How IT must adapt to the emerging hybrid workplace

The implications for IT are many: extended support desk hours; remote-support and remote-management tools; work-specific user training; cloud enablement of all software possible; appropriate security for distributed work; enabling multiple forms of collaboration and related activities like scheduling, whiteboarding, and availability tracking; provisioning equipment to home-based workers and/or supporting employee-provided equipment; aiding Facilities in modernizing building technologies to avoid touch-heavy surfaces; and partnering more closely with HR for policy enablement and enforcement and for appropriate monitoring. ... The workforce will not all work in the traditional office or company location, nor will they all be remote. Many people will work from home, but many people still need to work in a corporate facility, such as a production line, data center, retail store, shipping center, lab, or even traditional office. And there are employees whose work is location-agnostic but who can’t work at home due to lack of space or insufficient internet access. Gartner’s Adnams estimates that — although it varies by industry — about half of the workforce in advanced economies will need to work in a corporate facility, 25% to 30% will work permanently at home


Spotlight on home-office connectivity intensifies in 2021

"As the pandemic wears on, we are seeing organizations solidifying their plans for remote working, including adding more sophisticated hardware and software for work from home, with primary drivers including security and productivity," said Neil Anderson, senior director of network solutions at World Wide Technology, a technology and supply chain services company. "For IT, this means quickly assessing and deploying new cloud-based security models and building trust quickly in a solution," Anderson said. "We're also seeing a lot of interest in experience monitoring and optimizing software to put better analytics in place around what the home-office employee app performance is like and how to make it better." While individuals have limited options to speed up their home-office connectivity, IT can step in to provide enterprise-grade services to high-value workers for whom every minute with clients, customers, and coworkers counts, wrote Jean-Luc Valente, Cisco vice president, product management, enterprise routing and SD-WAN, in a blog post about the future of home office connectivity. "The high-value workforce needs superior connectivity that makes working at home just as fluid as being in the office with consistent connectivity and performance. ... " Valente stated.


Competition and Markets Authority battles with cookies and privacy

The CMA said it had been considering how best to address legitimate privacy concerns without distorting competition in discussions of the proposals with the Information Commissioner’s Office (ICO), through the Digital Regulation Cooperation Forum. As part of this work, the CMA said it had been in discussions with Google to gain a greater understanding of the proposed browser changes. The current investigation will provide a framework for the continuation of this work, and, potentially, a legal basis for any solution that emerges. Andrea Coscelli, chief executive of the CMA, said: “As the CMA found in its recent market study, Google’s Privacy Sandbox proposals will potentially have a very significant impact on publishers like newspapers, and the digital advertising market. But there are also privacy concerns to consider, which is why we will continue to work with the ICO as we progress this investigation, while also engaging directly with Google and other market participants about our concerns.” The CMA said it has an open mind and has not reached any conclusions at this stage as to whether competition law has been infringed.


Verizon CEO Talks 5G, Drones, and Compute at the Edge at CES

The move to the higher capacity broadband standard has been trumpeted by others as the beginning of a new frontier with huge amounts of data moving wirelessly. Vestberg said the speed of 5G would reveal new possibilities that transform the world, from playing video games to receiving deliveries. “Mobile edge compute will allow businesses to get things done more quickly and easily,” he said. Vestberg talked up the upload and download speeds of Verizon’s 5G Ultra Wideband network, which he said sees peak throughputs at least 10 times faster than the 4G standard and more than 4 gigabits per second under ideal conditions. The extremely low lag of 5G, Vestberg said, could eventually make extremely delicate procedures such as remote surgery possible. He also expects the new broadband standard to ramp up the population of connected wireless devices. “In the future, 5G could support more devices than ever before,” Vestberg said. “Up to one million per square kilometer.” The wireless connections could also be supported on devices moving at more than 500 km/h, he said, allowing users to maintain signal on high-speed vehicles such as commuter trains, aerial drones, or self-driving cars.



Quote for the day:

"Authority without wisdom is like a heavy ax without an edge -- fitter to bruise than polish." -- Anne Bradstreet

Daily Tech Digest - January 15, 2019

Coding, cloud skills are most in demand for network pros

The premise of incorporating development know-how with operations skills isn’t new and often falls under the umbrella of DevOps, a process methodology in which software development and IT operations teams work more closely together from design to production. The benefits are said to include software that works better and as expected on the production network, because the operations team shared insights with developers. ... Another network-specific security skill is traffic scrubbing. This quality-of-service prioritization puts filters in place to find offensive traffic, mitigate it and protect the remaining network without losing access to the internet, CompTIA’s Stanger explains. Network professionals are being tasked by their CIOs to fulfill security roles in part due to trends such as IoT and cloud. Another driver, one that network managers understand better than anyone, is the impending reality of IPv6.



Tick-tock: The year-long Windows 7 countdown by the numbers

36, the percentage of all Windows PCs that will run Windows 7 at its retirement, based on a rolling 12-month average of change tracked by Net Applications; that average was then projected into the future. The number has fluctuated significantly over the last two years, from a low of 29% to nearly 40%. It's also the maximum number of months Microsoft will offer corporate customers "Windows 7 Extended Security Updates" (ESU) after the January 2020 support retirement. The extended support will be available only for PCs running Windows 7 Professional or Windows 7 Enterprise, and then only if those operating systems were obtained via a volume licensing deal. Microsoft will discount ESU for customers who also have Software Assurance plans in place for Windows or have subscriptions to Windows 10 Enterprise or Windows 10 Education, including the Microsoft 365 subscription. Windows 7 ESU will be sold in 12-month increments, with as many as two extensions of the additional-support plan.


What you must know about moving ERP to the cloud

The migration of critical business applications is happening right now for several reasons. First, hardware leases are up for renewal, or upgrades need to occur to move to the next generation of ERP or other critical applications. So the ERP providers are showing up with new software and ever-growing compute requirements, which means more hardware procurement and data center space for IT. That cost is becoming prohibitive. With today’s public cloud alternatives, the issue is no longer whether you think the cloud is safe; it’s that you can’t afford the on-premises alternative. Second, the sky has not fallen. A few years ago, naysayers predicted outages, breaches, the Zombie Apocalypse, and so forth as a consequence of cloud migration—none of which happened at a noticeable scale. So, those who pushed back on cloud computing based on the impending-doom argument are no longer listened to, or they were moved out of IT leadership.


Robust data governance is key for machine learning success

Industry pundits speculate that machine learning algorithms could become a ‘black box’, primarily due to scepticism around trusting an ecosystem that exhibits limited transparency in its data compliance and decision-making processes. The global data analyst community has helped design semi- or fully-automated analytics systems that are AI- or ML-driven. However, the core and often-niggling issue of data quality may always prevail. Add to this the multifarious and disparate data sources, immense data volumes, and unstructured data types that augment the already existing data management problems, especially those relating to data governance. As ML gains momentum and continues to be at the forefront of transforming the way organizations operate, it may be advisable to exercise some caution. In the absence of robust data governance processes, the zeal to allow ML to take over the decision-making process entirely has the potential to unleash some critical issues – unreliable and misleading information, and unexpected expense overheads.


Tech usage in school more likely in UK than Germany


Darren Fields, regional director of UK and Ireland at Citrix, said the UK is making progress in the promotion of science, technology, engineering and mathematics (Stem) subjects, but that more needs to be done to keep ahead of growing skills gaps. “As a nation, it’s critical that we continue to invest in future generations, encouraging greater engagement with technology and creating a culture whereby young people are eager to get involved with and learn more about Stem subjects,” he said. “Employers currently report a significant tech skills gap, and the next generation of tech-savvy workers will be vital in helping to close this.” Fields highlighted the need for ensuring the UK is producing the technology talent to match the UK’s technology “ambitions”, and said education is the “start of the pathway” for ensuring this outcome.


How Employees of the Future Will Be Different


"The employee of the future might have many careers, skill sets and expertise she wants to pursue--all at the same time," says Wong. "Her 'boundless self' means that she might be more interested in creating an increasingly complex, non-linear career journey, filled with her many interests and experiences. [...She] may no longer imagine herself in one role, company or career track for the rest of her life; instead, she might look to reinvent both herself and her career continuously, often at the same time." ... "Traditionally, employees have experienced much less flexibility in how they work, collaborate, and communicate. But for the employee of the future, endless choice in how, where, and with whom they work will increasingly be the norm. This can create anxiety about how to make the right choice. So, companies should seek to help these employees navigate the sea of options by offering clear and simple guidelines or ways to navigate to the best decision."


5 Important Augmented And Virtual Reality Trends For 2019 

Computer vision – an AI (artificial intelligence) technology that allows computers to understand what they are “seeing” through cameras – is essential to the operation of AR, allowing objects in the user's field of vision to be identified and labeled. We can expect the machine learning algorithms that enable these features to become increasingly sophisticated and capable. The Snapchat and Instagram filters we are used to – overlaying bunny ears and cat whiskers on selfies, for example – are a very consumer-facing application of AI tech combined with AR. Their popularity in these and various other image-enhancement applications isn’t likely to dwindle in 2019. For more scientific use cases, there’s Google’s machine learning-enabled microscope to look forward to, which can highlight tissue that it suspects could be a cancerous tumor growth while a pathologist is looking at samples through the viewfinder. VR is about putting people inside virtual environments, and those environments – and their inhabitants – are likely to become increasingly intelligent over the next year.
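The compositing step behind such a filter — drawing a transparent sprite (the bunny ears) over a camera frame at the position the computer vision model reports — can be sketched in a few lines of NumPy. This is an illustrative alpha-blending routine written for this article, not any app's actual pipeline:

```python
import numpy as np

def overlay_rgba(frame: np.ndarray, sprite: np.ndarray, x: int, y: int) -> np.ndarray:
    """Alpha-blend an RGBA sprite onto an RGB frame, top-left corner at (x, y)."""
    h, w = sprite.shape[:2]
    region = frame[y:y + h, x:x + w].astype(float)
    alpha = sprite[..., 3:4].astype(float) / 255.0        # per-pixel opacity in [0, 1]
    blended = alpha * sprite[..., :3] + (1.0 - alpha) * region
    frame[y:y + h, x:x + w] = blended.astype(frame.dtype)
    return frame

frame = np.zeros((4, 4, 3), dtype=np.uint8)               # black 4x4 "camera frame"
sprite = np.full((2, 2, 4), 255, dtype=np.uint8)          # opaque white 2x2 sprite
overlay_rgba(frame, sprite, 1, 1)
assert frame[1, 1].tolist() == [255, 255, 255]            # sprite pixel drawn
assert frame[0, 0].tolist() == [0, 0, 0]                  # background untouched
```

In a real filter the (x, y) anchor comes from a face-landmark detector running on each frame; the blend itself is exactly this per-pixel weighted sum.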


Artificial Intelligence: Bright Future or Dark Cloud?


There is a fierce debate on campuses and in boardrooms about the life-altering effects of AI. Elon Musk has warned of a “fleet of artificial intelligence-enhanced robots capable of destroying mankind”, while Larry Page of Google and Alphabet foresees advancements in human progress. I believe there is merit in both arguments, and the good news is that we have time to shape AI in a positive direction. In human terms, we are in the toddler stage in the development of AI--a period of rapid neurogenesis. A child’s early years are shaped by external stimuli like pictures, music, language, and of course, human interaction. The result of this neurogenesis will determine a person’s intelligence, compassion, thoughtfulness and, importantly, capacity for empathy. Similarly, for AI to evolve in a positive direction, we need to involve the humanities, law, ethics as well as engineering. We need diversity of thought amongst the people working on these solutions. I know others share this view.


API integration becomes an enterprise priority


The pre-built integration templates in API integration products bring quick connectivity between previously siloed cloud applications. These packaged integrations also help with self-service deployment for line-of-business employees, increasing the speed and reducing labor costs of integration. Those attributes led Humantelligence, which offers an AI-driven recruiting and culture-analytics platform, to adopt API integration. Juan Luis Betancourt, Humantelligence's CEO, sought automated integration capabilities to connect the company's app environments with customers' cloud and homegrown apps, particularly their applicant-tracking applications. After evaluating five products, Betancourt implemented Jitterbit Harmony iPaaS. This API integration platform helps his company quickly connect SaaS, on-premises and cloud applications. "The iPaaS solution provides the built-in integrations and automated tools we need to navigate the complexities of API integration," he said.


Insider threats will dominate cybersecurity trends in 2019

The proliferation of SaaS applications is giving insiders more ways to exfiltrate data, and this trend shows no signs of slowing down – in fact, SaaS spending is expected to double by 2020. Insiders, accidental and purposeful alike, will take advantage of multiple new channels to exfiltrate data and hide their tracks. ... Insider threat statistics from the Ponemon Institute show that two out of three insider threat incidents happen by accident. While malicious insider threats tend to capture more of the headlines, far too many incidents are accidental and could have been prevented. Organizations will take more initiative to gain insight into the context behind insider threat incidents, including user intent. This level of context can help cybersecurity teams stop user mistakes before they become full-blown breaches. As such, more organizations will adopt ongoing insider threat training as a company-wide cybersecurity awareness initiative.



Quote for the day:


"No persons are more frequently wrong, than those who will not admit they are wrong." -- François de La Rochefoucauld


Daily Tech Digest - November 11, 2018

Web applications are the most visible front door to any enterprise and are often designed and built without strong security in mind. Stressing out over hardware vulnerabilities like Spectre or Meltdown is fun and trendy, but while you're digging a moat around your castle someone is prancing across the drawbridge using SQL injection (SQLi) or cross-site scripting (XSS). The OWASP Broken Web Applications Project comes bundled in a virtual machine (VM) that contains a large collection of deliberately broken web applications with tutorials to help students master the various attack vectors. From trivial to more difficult, the project is designed to lead the user to a better understanding of web application security. The OWASP Broken Web Applications Project includes the appropriately named Damn Vulnerable Web Application, deliberately broken for your pentesting enjoyment. For maximum lulz, download OWASP Zed Attack Proxy, configure a local browser to proxy traffic through ZAP, and get ready to attack some damn vulnerable web applications.
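The gap between a broken query and a fixed one is easy to demonstrate. Here is a minimal, self-contained Python sketch of the SQL injection class that Damn Vulnerable Web Application teaches, using an in-memory SQLite table rather than DVWA's actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, secret TEXT)")
con.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "nobody' OR '1'='1"      # classic SQLi payload from an attacker

# Vulnerable: the payload is spliced into the SQL text itself, so the
# OR clause rewrites the query and it matches every row in the table.
leaked = con.execute(
    f"SELECT secret FROM users WHERE name = '{payload}'"
).fetchall()
assert leaked == [("s3cret",)]     # the attacker reads alice's secret

# Fixed: a parameterized query treats the payload as an opaque string.
safe = con.execute(
    "SELECT secret FROM users WHERE name = ?", (payload,)
).fetchall()
assert safe == []                  # no user is literally named that
```

Tools like OWASP ZAP automate the discovery of the first pattern; the second pattern, parameter binding, is the standard fix in every database driver.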



Emotional skill is key to success

According to Susan David, emotional agility is about adaptability, facing emotions and moving on from them. It is also the ability to master the challenges life throws at us in an increasingly complex world. She added that while emotional intelligence is not values-focused, emotional agility is. "Women do have some advantages in the domain of emotional agility," she said. "When I go into organisations and look at hotspots or business units that are extremely high functioning, what we find is that the most important predictor of enabling these units is what I call 'individualised considerations'. That means leaders who are able to see the individual as an individual and this has diversity at its core. "These leaders do not stereotype or exclude," she added. "Of course, this doesn't work always in practice and there is a lot of work to be done in this regard in organisations and businesses."


Hybrid Blockchain- The Best Of Both Worlds

The hybrid blockchain is best defined as a blockchain that attempts to use the best parts of both private and public blockchain solutions. In an ideal world, a hybrid blockchain means controlled access and freedom at the same time. A hybrid blockchain is distinguished by the fact that it is not open to everyone, yet it still offers blockchain features such as integrity, transparency, and security. A hybrid blockchain is also entirely customizable: its members can decide who can participate in the blockchain and which transactions are made public. This brings the best of both worlds and ensures that a company can work with its stakeholders in the best possible way. We hope this gives you a clear view of the hybrid blockchain definition. To get a much better picture, we recommend checking out some hybrid blockchain projects.


How universities should teach blockchain


The core issue is that blockchain is really hard to teach correctly. There’s no established curriculum, few textbooks exist, and the field is rife with misinformation, making it hard to know what is credible. Protocols are evolving at a rapid pace, and it’s tough to tell the difference between a white paper and reality. Having so much attention around blockchain specifically frames it as a miraculous and novel development rather than an outgrowth of decades of computer science research. Matt Blaze, an associate professor at the University of Pennsylvania and a cyber-security researcher, points out that the push for degree programs in blockchain is part of a trend of overspecialization by some engineering schools. The concepts sound good on paper but don’t live up to their promise. Despite the best of intentions, trends change, and students get stuck in narrow career paths. In order to avoid these pitfalls, universities will have to take an approach they’re not used to.


Experience an RDP attack? It’s your fault, not Microsoft’s

If you are compromised because of RDP, the problem is you or your organization. It isn’t a problem with Microsoft or RDP. You don’t need to put a VPN around RDP to protect it. You don’t need to change default network ports or some other black magic. Just use the default security settings or implement the myriad other security defenses you should have already been using. If you’re getting hacked because of RDP, you’re not doing a bunch of things that any good computer security defender should be doing. There are many ransomware programs, like SamSam and CrySis, as well as cryptominers, that attempt brute-force guessing attacks against accessible RDP services. So many companies have had their RDP services compromised that the FBI and Department of Homeland Security (DHS) have issued warnings. The warning should be, “Your security sucks!” It isn’t like the malware programs are conducting a zero-day attack against some unpatched vulnerability.


Data as a Driver of Economic Efficiency

The General Data Protection Regulation (GDPR) became enforceable on May 25, 2018. The regulation aims to protect data by ‘design and default,’ whereby firms must handle data according to a set of principles. GDPR mandates opt-in consent for data collection and assigns substantial liability risks and penalties for data flow and data processing violations. GDPR’s enactment is particularly likely to influence technology ventures, given an increasing need for the use of data as a core product input. Specifically, data has become a key factor in technology-driven innovation and production, spanning industry sectors from pharmaceuticals and healthcare, to automotive, smart infrastructure, and broader decision making. This report presents economic analyses of the consequences of data regulation and opt-in consent requirements for investment in new technology ventures, for consumer prices, and for economic welfare.


A Two-Minute Guide To Quantum Computing

Most of us aren't clued up on the art of harnessing elementary particles like electrons and photons, so to understand how quantum computing works, meet Scottish startup M Squared. The company’s bread and butter is making some of the most accurate lasers in the world, using pure light and precise wavelengths. Such lasers can be used like a scalpel, one atom wide, to carve out the transistors of a silicon chip.  Typically the chip or brain in your smartphone is a centimeter square. It has a small section in the middle made up of around 300 million transistors, with connections spreading out like fingers to talk to the screen, the camera, the battery and more.  But imagine a chip with no transistors at all, and instead a small chamber that’s controlling the processes and energy levels inside of atoms. This is quantum computing, the next frontier of machines that think not in bytes but in powerful qubits. It sounds cutting-edge, but scientists have been studying the theory of quantum computing for 30 years, and some say the first mainstream applications are just around the corner.
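Those "powerful qubits" can be made concrete with a two-line state-vector calculation. This NumPy sketch applies a Hadamard gate to a qubit prepared in |0⟩, putting it in an equal superposition of 0 and 1 — the basic resource that quantum algorithms exploit:

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)    # Hadamard gate
ket0 = np.array([1.0, 0.0])             # qubit prepared in |0>

psi = H @ ket0                          # state after the gate: (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2                # measurement probabilities

assert np.allclose(probs, [0.5, 0.5])   # equal chance of reading 0 or 1
assert np.allclose(H @ psi, ket0)       # applying H again undoes it
```

Simulating n qubits this way needs a vector of 2^n amplitudes, which is exactly why classical machines run out of steam and dedicated quantum hardware becomes interesting.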


How Do Self-Driving Cars See? (And How Do They See Me?)


We’ll start with radar, which rides behind the car’s sheet metal. It’s a technology that has been going into production cars for 20 years now, and it underpins familiar tech like adaptive cruise control and automatic emergency braking. ... The cameras—sometimes a dozen to a car and often used in stereo setups—are what let robocars see lane lines and road signs. They only see what the sun or your headlights illuminate, though, and they have the same trouble in bad weather that you do. But they’ve got terrific resolution, seeing in enough detail to recognize your arm sticking out to signal that left turn. ... If you spot something spinning, that’ll be the lidar. This gal builds a map of the world around the car by shooting out millions of light pulses every second and measuring how long they take to come back. It doesn’t match the resolution of a camera, but it should bounce enough of those infrared lasers off you to get a general sense of your shape. It works in just about every lighting condition and delivers data in the computer’s native tongue: numbers.
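The lidar ranging described in that last paragraph is a time-of-flight calculation: distance is the speed of light times half the round-trip time of the pulse. A quick sketch, with an illustrative round-trip time rather than real sensor data:

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to a target from a laser pulse's round-trip time."""
    return C * round_trip_s / 2.0   # halve it: the pulse travels out and back

# A pedestrian ~30 m away reflects the pulse in about 200 nanoseconds.
d = lidar_range_m(200e-9)
assert abs(d - 29.979) < 0.01       # roughly 30 meters
```

Repeating this millions of times per second across a spinning field of view is what turns single ranges into the 3D point cloud the car reasons over.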



Facial recognition's failings: Coping with uncertainty in the age of machine learning

The shortcomings of publicly available facial-recognition systems were further highlighted in summer this year, when the American Civil Liberties Union (ACLU) tested the AWS Rekognition service. The test found that 28 members of the US Congress were falsely matched with mug shots from publicly available arrest photos. Professor Chris Bishop, director of Microsoft's Research Lab in Cambridge, said that as machine learning technologies were deployed in different real-world locales for the first time it was inevitable there would be complications. "When you apply something in the real world, the statistical distribution of the data probably isn't quite the same as you had in the laboratory," he said. "When you take data in the real world, point a camera down the street and so on, the lighting may be different, the environment may be different, so the performance can degrade for that reason. "When you're applying [these technologies] in the real world all these other things start to matter."
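Bishop's point about shifting statistical distributions can be shown numerically. In this synthetic Python sketch — no real faces, just the shape of the failure — a threshold classifier tuned on "laboratory" data loses accuracy when a lighting-like shift moves the deployment data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Lab data: a single feature, class 0 ~ N(0, 1), class 1 ~ N(3, 1).
lab0 = rng.normal(0.0, 1.0, 10_000)
lab1 = rng.normal(3.0, 1.0, 10_000)
threshold = 1.5                         # tuned for the lab distribution

def accuracy(c0: np.ndarray, c1: np.ndarray) -> float:
    """Balanced accuracy of 'predict class 1 iff feature >= threshold'."""
    return ((c0 < threshold).mean() + (c1 >= threshold).mean()) / 2

lab_acc = accuracy(lab0, lab1)

# Deployment: different lighting shifts every measurement by +1.5.
street_acc = accuracy(lab0 + 1.5, lab1 + 1.5)

assert lab_acc > 0.9                    # excellent in the lab...
assert street_acc < lab_acc - 0.1       # ...noticeably worse on the street
```

Nothing about the classifier changed between the two measurements; only the data did, which is exactly why lab benchmarks overstate real-world facial-recognition performance.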


Robots Have a Diversity Problem


It is well-documented that A.I. programs of all stripes inherit the gender and racial biases of their creators on an algorithmic level, turning well-meaning machines into accidental agents of discrimination. But it turns out we also inflict our biases onto robots. A recent study led by Christoph Bartneck, a professor at the Human Interface Technology Lab at the University of Canterbury in New Zealand, found that not only are the majority of home robots designed with white plastic, but we also actually have a bias against the ones that are coated in black plastic. The findings were based on a shooter bias test, in which participants were asked to perceive threat level based on a split-second image of various black and white people, with robots thrown into the mix. Black robots that posed no threat were shot more than white ones. “The only thing that would motivate their bias [against the robots] would be that they would have transferred their already existing racial bias to, let’s say, African-Americans, onto the robots,” Bartneck told Medium. “That’s the only plausible explanation.”



Quote for the day:


"Remember this: Anticipation is the ultimate power. Losers react; leaders anticipate." -- Tony Robbins


Daily Tech Digest - April 29, 2018

Institutional Innovation: How blockchain could transform student ROI


Colleges and universities are recognizing that degrees are much like currency. They are sheets of paper that serve as an exchange with employers to signal the graduate has the types of skills that are necessary for the job. The better the degree, the more value a student may have in the workforce. By moving degrees into a form of digital record where the student can own it as a type of currency, rather than the institution holding it, they can put that currency into a massive decentralized network, much like bitcoin. This would allow employers to see students' records more easily. Feng Hou, CIO of Central New Mexico Community College, explained that his institution's decision to look into blockchain technology came from an initiative to convert college-owned technology into student-owned technology — with one of those areas being digital credentials and transcripts. Central New Mexico Community College, working with a vendor called “Learning Machine,” developed an open-source platform where digital diplomas could be recorded and shared in major professional networks.
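The tamper-evidence behind such a student-owned record rests on hash-linking: each new credential record commits to the hash of the one before it, so editing any past record is detectable. A toy Python sketch of that mechanism (an illustrative format, not Learning Machine's actual platform):

```python
import hashlib
import json

def add_record(chain: list, payload: dict) -> list:
    """Append a record that commits to the hash of the previous record."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain: list) -> bool:
    """Recompute every link; any edited record breaks the chain."""
    prev_hash = "0" * 64
    for rec in chain:
        body = {"payload": rec["payload"], "prev": rec["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev_hash or rec["hash"] != digest:
            return False
        prev_hash = rec["hash"]
    return True

chain = add_record([], {"student": "A. Student", "degree": "BSc Computer Science"})
chain = add_record(chain, {"student": "B. Learner", "degree": "MSc Data Science"})
assert verify(chain)
chain[0]["payload"]["degree"] = "PhD"   # forge the first diploma...
assert not verify(chain)                # ...and verification fails
```

A real deployment adds digital signatures and anchors the hashes in a public network such as bitcoin, so that employers can verify a diploma without trusting the issuing institution's database.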



Google Co-Founder Sergey Brin Warns Of AI's Dark Side

AI tools might change the nature and number of jobs, or be used to manipulate people, Brin says—a line that may prompt readers to think of concerns around political manipulation on Facebook. Safety worries range from “fears of sci-fi style sentience to the more near-term questions such as validating the performance of self-driving cars,” Brin writes. All that might sound like a lot for Google and the tech industry to contemplate while also working at full speed to squeeze profits from new AI technology. Even some Google employees aren’t sure the company is on the right track—thousands signed a letter protesting the company’s contract with the Pentagon to apply machine learning to video from drones. Brin doesn’t mention that challenge, and wraps up his discussion of AI’s downsides on a soothing note. His letter points to the company’s membership in industry group Partnership on AI, and Alphabet’s research in areas such as how to make learning software that doesn’t cheat, and AI software whose decisions are more easily understood by humans.


3 Innovative Ways Blockchain Will Build Trust In The Food Industry

Just look to Chipotle. After a major E. coli breakout in late 2015, the company’s profits dropped 44% compared with the same quarter the previous year. It has since given out millions of coupons to lure customers back with free food, but the company still hasn’t fully restored customer trust. There is a way to increase trust in the food industry. Blockchain solutions are already up and running in other industries like pharma and gold production—and they are ready to be applied to the food space. Every year, one in 10 people around the world become ill due to foodborne diseases, and approximately 420,000 of them die. Part of the reason we still see statistics like this is because it takes far too long to isolate product recall or contamination issues in the supply chain. Right now, IBM and Walmart are working on a solution for this. They’re improving Walmart’s food tracking abilities in China. Under the company’s current system, the pair estimated that it took days—even weeks—for Walmart to track a package of mangos from the farm to the store.


Your next coworker soon may be an avatar humanoid robot


Avatar robots are still experimental, but if the market for collaborative robots is any indication, there could be significant demand. Also known as cobots, collaborative robots are covered with soft materials and can work alongside people in assembly and other jobs. The market for cobots is expected to grow to $12 billion by 2025, according to Barclays Equity Research. Remote operation of robots for work outside the factory, however, is already well established. Intuitive Surgical, for instance, has sold over 4,200 of its da Vinci surgical robots, which reproduce a surgeon's hand motions through small incisions in a patient's body during operations such as hysterectomies; benefits may include shorter recoveries. Many workers around the world may be concerned about losing their jobs to automation, but the risk varies from country to country. A recent OECD study estimates that 33 percent of jobs in Slovakia are "highly automatable", but only 6 percent in Norway, though the authors caution that "the actual risk of automation is subject to significant variation."


Why A Per-App Approach to Application Services Matters

The problem is that most of these application services are delivered in a shared infrastructure model. Each application gets its own “virtual representation,” but it physically resides on a shared piece of software or hardware. This can cause real problems, and it is in part a source of the friction that remains between IT and app dev. It’s this shared nature of systems that brought us change windows, review boards, and Saturday-night deploys (with pizza, to keep us placated), the processes that slow down development and make deployment a frustrating experience for all involved. We’re no longer deploying monolithic monster apps. Even if we haven’t decomposed our apps into hundreds of little microservices, we still have more apps on more frequent deployment schedules: apps developed in week-long sprints rather than year-long projects, which need to push updates faster and more frequently. That, ultimately, is much of the reason the public cloud has been so successful: it’s my app and my infrastructure, and I don’t have to wait for Bob or Alice or John before I push an update.
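The per-app idea can be reduced to a simple contrast: instead of one shared service instance whose configuration couples every team’s release schedule, each app owns a dedicated instance it can reconfigure independently. A hypothetical Python sketch (the names are illustrative, not any vendor’s API):

```python
from dataclasses import dataclass, field

@dataclass
class AppService:
    """A dedicated service instance (e.g., a load-balancing policy) owned by one app."""
    app: str
    settings: dict = field(default_factory=dict)

    def update(self, **changes):
        # Only this app's traffic is affected; no shared change window required.
        self.settings.update(changes)

# Per-app model: each team deploys and tunes its own instance.
services = {name: AppService(name, {"timeout_s": 30})
            for name in ["checkout", "search"]}

# The checkout team tightens its timeout without touching anyone else.
services["checkout"].update(timeout_s=5)
print(services["checkout"].settings["timeout_s"],
      services["search"].settings["timeout_s"])  # → 5 30
```

In the shared model, the equivalent change would edit one config object used by every app, which is exactly what forces the review boards and Saturday-night deploys the excerpt describes.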


Three Ways Machine Learning Is Improving The Hiring Process


Technology’s advance into all industries and jobs tends to send ripples of worry with each evolution. It started with computers and continues with artificial intelligence, machine learning, IoT, big data and automation. There are conflicting views on how new technology will impact the future of jobs. But it's becoming clear that humans will need to work with technology to be successful -- especially as it relates to the hiring process. There’s a great example of this explained by Luke Beseda and Cat Surane, talent partners for Lightspeed Ventures. On a recent Talk Talent To Me podcast episode, they spoke with the talent team at Hired, where I work, about why it's critical to understand why a candidate is pursuing a given job. They concluded that machines can’t properly manage the qualitative aspect of hiring. For example, machines can’t tell if a candidate is seeking higher compensation or leveraging a job offer to negotiate new terms with their current employer. Humans can. However, machines are better at making processes more efficient.


Data and digital infrastructure key to genomic sequencing success, say MPs

Giving evidence to the committee, Professor Sian Ellard of the South West NHS Genomic Medicine Centre said it was unrealistic to expect “all of the planned infrastructure to be in place” for the launch of the genomic medicine service. “Significant digital infrastructure is needed to support routine genomic medicine, and it is welcome that some centres and hospitals already have solutions in place. However, the wider programme to improve NHS infrastructure is running to a later timeframe than the planned genomic medicine service,” the committee’s report said. “The digital infrastructure in place should be one consideration involved in decisions on providing whole genome sequencing in place of conventional alternative diagnostic tests, to avoid attempting to roll out a Genomic Medicine Service at a speed that cannot be delivered.” Committee chair Norman Lamb said that the new service “could dramatically improve the health outcomes of UK citizens”, but that the committee is concerned its potential is threatened by delays to digital projects.


How Intel's 8th-gen CPUs will affect budget gaming laptops

Intel’s 8th-gen “Coffee Lake” mobile CPUs arrived en masse this month, packing more cores and higher performance than ever before. What does that mean for budget gaming laptops? If you’ve been waiting for gaming laptop prices to plunge now that next-gen processors are here, prepare to be a little disappointed. Prices of older laptops generally don’t drop much when the next big thing shows up. The reasons vary, but PC vendors typically manage inventories fairly tightly to avoid being left with a lot full of Oldsmobiles when the new models come in. That’s not always the case, though, and sometimes you’ll find nice deals if you know where and when to look. Discounts on older hardware aren’t the only way Intel’s 8th-gen CPUs will affect budget gaming laptops, however. It’s also worth keeping in mind that with the 8th generation of Intel processors, you’re essentially getting yesteryear’s Core i7 performance in today’s Core i5 chips, and at Core i5 prices too.


How to Increase Backup and Recovery? – Rubrik Briefing Note

Most data protection solutions today comprise two distinct components: the backup software and the backup hardware. The software moves data from production storage to backup storage; it also manages critical functions such as ensuring online backups of applications, locating protected data when necessary, and performing rapid recoveries. Data protection hardware typically focuses on cost-effectively storing data for an extended time frame. Ironically, other than the move from tape to disk, most data protection hardware solutions have not invested in making the recovery process fast. While some backup software vendors have come out with backup appliances, these solutions are typically just pre-installed versions of their software on a set piece of hardware, with seldom any optimization that takes advantage of the hardware’s specific capabilities. IT needs a new approach: one that more seamlessly integrates backup hardware and software into a single solution, where the software takes full advantage of the hardware and creates an environment purpose-built for data protection.
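The “locating protected data” job that the software side handles is essentially a catalog: an index of which snapshot holds which version of a file, so a restore starts with a lookup instead of a scan through backup media. A minimal sketch of that idea (purely illustrative, not Rubrik’s or any vendor’s implementation):

```python
from collections import defaultdict

class BackupCatalog:
    """Index mapping each protected path to the snapshots that contain it."""

    def __init__(self):
        # path -> list of (snapshot_id, offset), in backup order
        self._versions = defaultdict(list)

    def index(self, snapshot_id, path, offset):
        # Called by the backup job as it writes data to backup storage.
        self._versions[path].append((snapshot_id, offset))

    def locate(self, path):
        # Latest version wins; a restore reads directly from that snapshot.
        return self._versions[path][-1] if path in self._versions else None

cat = BackupCatalog()
cat.index("snap-001", "/db/orders.db", 0)
cat.index("snap-002", "/db/orders.db", 4096)
print(cat.locate("/db/orders.db"))  # → ('snap-002', 4096)
```

An integrated solution of the kind the briefing describes would keep such a catalog on fast media co-designed with the backup hardware, which is what makes rapid recovery possible.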


What Will Our Society Look Like When Artificial Intelligence Is Everywhere?

Imagine you are a woman in search of romance in this new world. You say, “Date,” and your Soulband glows; the personal AI assistant embedded on the band begins to work. The night before, your empathetic AI scoured the cloud for three possible dates. Now your Soulband projects a hi-def hologram of each one. It recommends No. 2, a poetry-loving master plumber with a smoky gaze. Yes, you say, and the AI goes off to meet the man’s avatar to decide on a restaurant and time for your real-life meeting. Perhaps your AI will also mention what kind of flowers you like, for future reference. After years of experience, you’ve found that your AI is actually better at choosing men than you. It predicted you’d be happier if you divorced your husband, which turned out to be true. Once you made the decision to leave him, your AI negotiated with your soon-to-be ex-husband’s AI, wrote the divorce settlement, then “toured” a dozen apartments on the cloud before finding the right one for you to begin your single life.



Quote for the day:


"Many people think great entrepreneurs take risks. Great entrepreneurs mitigate risks." -- Jal Tucher