
Daily Tech Digest - May 29, 2024

Algorithmic Thinking for Data Scientists

While data scientists with computer science degrees will be familiar with the core concepts of algorithmic thinking, many increasingly enter the field with other backgrounds, ranging from the natural and social sciences to the arts; this trend is likely to accelerate in the coming years as a result of advances in generative AI and the growing prevalence of data science in school and university curriculums. ... One topic that deserves special attention in the context of algorithmic problem solving is that of complexity. When comparing two different algorithms, it is useful to consider the time and space complexity of each algorithm, i.e., how the time and space taken by each algorithm scales relative to the problem size (or data size). ... Some algorithms may manifest additive or multiplicative combinations of the above complexity levels. For example, a for loop followed by a binary search entails an additive combination of linear and logarithmic complexities, attributable to sequential execution of the loop and the search routine, respectively.
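
As a minimal illustration of that additive case (the function and data below are our own, not from the article): a Python routine that makes one linear pass over sorted data and then runs a binary search costs O(n) + O(log n), which the linear term dominates.

```python
from bisect import bisect_left

def total_and_membership(sorted_values, target):
    """Illustrative sketch: a linear pass (O(n)) followed by a
    binary search (O(log n)), i.e. an additive O(n + log n) overall."""
    # Linear-time loop: touches every element once.
    total = 0
    for v in sorted_values:                      # O(n)
        total += v

    # Logarithmic-time binary search on the already-sorted input.
    i = bisect_left(sorted_values, target)       # O(log n)
    found = i < len(sorted_values) and sorted_values[i] == target
    return total, found

print(total_and_membership([1, 3, 5, 8, 13], 8))  # (30, True)
```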


Job seekers and hiring managers depend on AI — at what cost to truth and fairness?

The darker side to using AI in hiring is that it can bypass potential candidates based on predetermined criteria that don’t necessarily take all of a candidate’s skills into account. And for job seekers, the technology can generate great-looking resumes, but often they’re not completely truthful when it comes to skill sets. ... “AI can sound too generic at times, so this is where putting your eyes on it is helpful,” Toothacre said. She is also concerned about the use of AI to complete assessments. “Skills-based assessments are in place to ensure you are qualified and check your knowledge. Using AI to help you pass those assessments is lying about your experience and highly unethical.” There’s plenty of evidence that genAI can improve resume quality, increase visibility in online job searches, and provide personalized feedback on cover letters and resumes. However, concerns about overreliance on AI tools, lack of human touch in resumes, and the risk of losing individuality and authenticity in applications are universal issues that candidates need to be mindful of regardless of their geographical location, according to Helios’ Hammell.


Comparing smart contracts across different blockchains from Ethereum to Solana

Polkadot is designed to enable interoperability among various blockchains through its unique architecture. The network’s core comprises the relay chain and parachains, each playing a distinct role in maintaining the system’s functionality and scalability. ... Developing smart contracts on Cardano requires familiarity with Haskell for Plutus and an understanding of Marlowe for financial contracts. Educational resources like the IOG Academy provide learning paths for developers and financial professionals. Tools like the Marlowe Playground and the Plutus development environment aid in simulating and testing contracts before deployment, ensuring they function as intended. ... Solana’s smart contracts are stateless, meaning the contract logic is separated from the state, which is stored in external accounts. This separation enhances security and scalability by isolating the contract code from the data it interacts with. Solana’s account model allows for program reusability, enabling developers to create new tokens or applications by interacting with existing programs, reducing the need to redeploy smart contracts, and lowering costs.


3 things CIOs can do to make gen AI synch with sustainability

“If you’re only buying inference services, ask them how they can account for all the upstream impact,” says Tate Cantrell, CTO of Verne, a UK-headquartered company that provides data center solutions for enterprises and hyperscalers. “Inference output takes a split second. But the only reason those weights inside that neural network are the way they are is because of massive amounts of training — potentially one or two months of training at something like 100 to 400 megawatts — to get that infrastructure the way it is. So how much of that should you be charged for?” Cantrell urges CIOs to ask providers about their own reporting. “Are they doing open reporting about the full upstream impact that their services have from a sustainability perspective? How long is the training process, how long is it valid for, and how many customers did that weight impact?” According to Sundberg, an ideal solution would be to have the AI model tell you about its carbon footprint. “You should be able to ask Copilot or ChatGPT what the carbon footprint of your last query is,” he says. 


EU’s ChatGPT taskforce offers first look at detangling the AI chatbot’s privacy compliance

The taskforce’s report discusses this knotty lawfulness issue, pointing out ChatGPT needs a valid legal basis for all stages of personal data processing — including collection of training data; pre-processing of the data (such as filtering); training itself; prompts and ChatGPT outputs; and any training on ChatGPT prompts. The first three of the listed stages carry what the taskforce couches as “peculiar risks” for people’s fundamental rights — with the report highlighting how the scale and automation of web scraping can lead to large volumes of personal data being ingested, covering many aspects of people’s lives. It also notes scraped data may include the most sensitive types of personal data (which the GDPR refers to as “special category data”), such as health info, sexuality, political views etc, which requires an even higher legal bar for processing than general personal data. On special category data, the taskforce also asserts that just because it’s public does not mean it can be considered to have been made “manifestly” public — which would trigger an exemption from the GDPR requirement for explicit consent to process this type of data.


Avoiding the cybersecurity blame game

Genuine negligence or deliberate actions should be handled appropriately, but apportioning blame and meting out punishment must be the final step in an objective, reasonable investigation. It should certainly not be the default reaction. So far, so reasonable, yes? But things are a little more complicated than this. It’s all very well saying, “don’t blame the individual, blame the company”. Effectively, no “company” does anything; only people do. The controls, processes and procedures that let you down were created by people – just different people. If we blame the designers of controls, processes and procedures… well, we are just shifting blame, which is still counterproductive. ... Managers should use the additional resources to figure out how to genuinely change the work environment in which employees operate and make it easier for them to do their job in a secure practical manner. Managers should implement a circular, collaborative approach to creating a frictionless, safer environment, working positively and without blame.


The decline of the user interface

The Ok and Cancel buttons played important roles. A user might go to a Settings dialog, change a bunch of settings, and then click Ok, knowing that their changes would be applied. But often, they would make some changes and then think “You know, nope, I just want things back like they were.” They’d hit the Cancel button, and everything would reset to where they started. Disaster averted. Sadly, this very clear and easy way of doing things somehow got lost in the transition to the web. On the web, you will often see Settings pages without Ok and Cancel buttons. Instead, you’re expected to click an X in the upper right to make the dialog close, accepting any changes that you’ve made. ... In the newer versions of Windows, I spend a dismayingly large amount of time trying to get the mouse to the right spot in the corner or edge of an application so that I can size it. If I want to move a window, it is all too frequently difficult to find a location at the top of the application to click on that will result in the window being relocated. Applications used to have a very clear title bar that was easy to see and click on.
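
A minimal sketch of the pattern being described (the Python class and setting names are purely illustrative): the dialog edits a working copy, so Ok commits everything at once and Cancel simply throws the draft away.

```python
import copy

class SettingsDialog:
    """Sketch of the classic Ok/Cancel pattern: edits go to a draft,
    and nothing touches the live settings until Ok is pressed."""

    def __init__(self, settings):
        self._settings = settings                    # the live settings
        self._draft = copy.deepcopy(settings)        # what the dialog edits

    def set(self, key, value):
        self._draft[key] = value                     # changes stay in the draft

    def ok(self):
        self._settings.update(self._draft)           # commit everything at once
        return self._settings

    def cancel(self):
        self._draft = copy.deepcopy(self._settings)  # discard the draft
        return self._settings

settings = {"theme": "light", "autosave": True}
dlg = SettingsDialog(settings)
dlg.set("theme", "dark")
dlg.cancel()
print(settings)   # {'theme': 'light', 'autosave': True} -- disaster averted
```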


Lawmakers paint grim picture of US data privacy in defending APRA

At the center of the debate is the American Privacy Rights Act (APRA), the push for a federal data privacy law that would either simplify a patchwork of individual state laws – or run roughshod over existing privacy legislation, depending on which state is offering an opinion. While harmonizing divergent laws seems wise as a general measure, states like California, where data privacy laws are already much stricter than in most places, worry about its preemptive clauses weakening their hard-fought privacy protections. Rodgers says APRA is “an opportunity for a reset, one that can help return us to the American Dream our Founders envisioned. It gives people the right to control their personal information online, something the American people overwhelmingly want,” she says. “They’re tired of having their personal information abused for profit.” From loose permissions on sharing location data to exposed search histories, there are far too many holes in Americans’ digital privacy for Rodgers’ liking. Pointing to the especially sensitive matter of children’s data, she says that “as our kids scroll, companies collect nearly every data point imaginable to build profiles on them and keep them addicted. ...”


Picking an iPaaS in the Age of Application Overload

Companies face issues using proprietary integration solutions, as they end up with black-box solutions with limited flexibility. For example, the inability to natively embed outdated technology into modern stacks, such as cloud native supply chains with CI/CD pipelines, can slow down innovation and complicate the overall software delivery process. Companies should favor iPaaS technologies grounded in open source and open standards. Can you deploy it to your container orchestration cluster? Can you plug it into your existing GitOps procedures? Such solutions not only ensure better integration into proven QA-tested procedures but also offer greater freedom to migrate, adapt and debug as needs evolve. ... As organizations scale, so too must their integration solutions. Companies should avoid iPaaS solutions offering only superficial “cloud-washed” capabilities. They should prioritize cloud native solutions designed from the ground up for the cloud, and that leverage container orchestration tools like Kubernetes and Docker Swarm, which are essential for ensuring scalability and resilience.
Shifting left is a cultural and practice shift, but it also includes technical changes to how a shared testing environment is set up. ... The approach scales effectively across engineering teams, as each team or developer can work independently on their respective services or features, thereby reducing dependencies. While this is great advice, it can feel hard to implement in the current development environment: If the process of releasing code to a shared testing cluster takes too much time, it doesn’t seem feasible to test small incremental changes. ... The difference between finding bugs as a user and finding them as a developer is massive: When an operations or site reliability engineer (SRE) finds a problem, they need to find the engineer who released the code, describe the problem they’re seeing, and present some steps to replicate the issue. If, instead, the original developer finds the problem, they can cut out all those steps by looking at the output, finding the cause, and starting on a fix. This proactive approach to quality reduces the number of bugs that need to be filed and addressed later in the development cycle.



Quote for the day:

"The best and most beautiful things in the world cannot be seen or even touched- they must be felt with the heart." -- Helen Keller

Daily Tech Digest - November 29, 2022

Cloud-Native Goes Mainstream as CFOs Seek to Monitor Costs

There's interest from the CFO organization in third-party tools for cloud cost management and optimization that can give them a vendor-neutral tool, especially in multicloud environments, according to Forrester analyst Lee Sustar. "The cost management tools from cloud providers are generally fine for tactical decisions on spending but do not always provide the higher level views that the CFO office is looking for," he added. As organizations move to a cloud-native strategy, Sustar said the initiative will often come from the IT enterprise architects and the CTO organization, with backing from the office of the CIO. "Partners of various sorts are often needed in the shift to cloud-native, as they help generalize the lessons from the early adopters," he noted. "Today, organizations new to the cloud are focused not on lifting and shifting existing workloads alone, but modernizing on cloud-native tech. Multicloud container platform vendors offer a more integrated approach that can be tailored to different cloud providers," Sustar added.


Financial services increasingly targeted for API-based cyberattacks

APIs are a core part of how financial services firms are changing their operations in the modern era, Akamai said, given the growing desire for more and more app-based services among the consumer base. The pandemic merely accelerated a growing trend toward remote banking services, which led to a corresponding growth in the use of APIs. However, every new application, and every standardization of how various app functions talk to one another (which is what creates APIs), increases the potential attack surface. Only high-tech firms and e-commerce companies were more heavily targeted via API exploits than the financial services industry. “Once attackers launch web application attacks successfully, they could steal confidential data, and in more severe cases, gain initial access to a network and obtain more credentials that could allow them to move laterally,” the report said. “Aside from the implications of a breach, stolen information could be peddled in the underground or used for other attacks. This is highly concerning given the troves of data, such as personal identifiable information and account details, held by the financial services vertical.”


The future of cloud computing in 2023

Gartner research estimates that we exceeded one billion knowledge workers globally in 2019. These workers are defined as those who need to think creatively and deliver conclusions for strategic impact. These are the very people that cloud technology was designed to facilitate. Cloud integrations in many cases can be hugely advanced and mature from an operational standpoint. Businesses have integrated multi-cloud solutions, containerization and continuously learning AI/ML algorithms to deliver truly cutting-edge results, but those results are often not delivered at the scale or speed necessary to make split-second decisions needed to thrive in today’s operating environment. For cloud democratization to be successful, companies need to upskill their knowledge workers and equip them with the right tools to deliver value from cloud analytics. Low-code and no-code tools reduce the experiential hurdle needed to deliver value from in-cloud data, whilst simultaneously delivering on the original vision of cloud technology — giving people the power they need to have their voices heard.


What Makes BI and Data Warehouses Inseparable?

Every effective BI system has a potent DWH at its core. That is because a data warehouse is a platform used to centrally gather, store, and prepare data from many sources for later use in business intelligence and analytics. Consider it a single repository for all the data needed for BI analyses. Historical and current data are kept structured, ideal for sophisticated querying in a data analytics DWH. Once connected, it produces reports with forecasts, trends, and other visualizations that support practical insights using business intelligence tools. ETL (extract, transform, and load) tools, a DWH database, DWH access tools, and reporting layers are all parts of the business analytics data warehouse. These technologies are available to speed up the data science procedure and reduce or completely do away with the requirement for creating code to handle data pipelines. The ETL tools assist in data extraction from source systems, format conversion, and data loading into the DWH. Structured data for reporting is stored and managed by the database component.
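
The ETL flow described here can be made concrete with a short, hypothetical Python sketch (the file, column and table names are invented for illustration, and pandas plus SQLite stand in for whatever ETL tooling and warehouse database a team actually uses): extract records from a source export, transform them into the shape reporting needs, and load the result into a warehouse table.

```python
import sqlite3
import pandas as pd

# Extract: pull raw records from a source system (a CSV export here;
# the file assumes quantity, unit_price and order_date columns).
orders = pd.read_csv("orders_export.csv")

# Transform: convert formats and derive the fields the reports need.
orders["order_date"] = pd.to_datetime(orders["order_date"])
orders["revenue"] = orders["quantity"] * orders["unit_price"]
monthly = (orders
           .groupby(orders["order_date"].dt.to_period("M"))["revenue"]
           .sum()
           .reset_index())

# Load: write the structured result into the warehouse database,
# where BI and reporting tools can query it.
with sqlite3.connect("warehouse.db") as dwh:
    monthly.astype({"order_date": str}).to_sql(
        "monthly_revenue", dwh, if_exists="replace", index=False)
```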


Covering Data Breaches in an Ethical Way

Ransomware and extortion groups usually publicly release stolen data if a victim doesn't pay. In many cases, the victim organization hasn't publicly acknowledged it has been attacked. Should we write or tweet about that? ... These are victims of crime, and not every organization handles these situations well, but the media can make it worse. Are there exceptions to this rule? Sure. If an organization hasn't acknowledged an incident but numerous media outlets have published pieces, then the incident could be considered public enough. But many people tweet or write stories about victims as soon as their data appears on a leak site. I think that is unfair and plays into the attackers' hands, increasing pressure on victims. ... Covering cybercrime sensitively: using leaked personal details to contact people affected by a data breach is a touchy area. I only do this in very limited circumstances. I did it with one person in the Optus breach. The reason was that, at that point, there were doubts about whether the data had originated with Optus. The person also lived down the road from me, so I could talk to them in person.


EU Council adopts the NIS2 directive

NIS2 will set the baseline for cybersecurity risk management measures and reporting obligations across all sectors that are covered by the directive, such as energy, transport, health and digital infrastructure. The revised directive aims to harmonise cybersecurity requirements and implementation of cybersecurity measures in different member states. To achieve this, it sets out minimum rules for a regulatory framework and lays down mechanisms for effective cooperation among relevant authorities in each member state. It updates the list of sectors and activities subject to cybersecurity obligations and provides for remedies and sanctions to ensure enforcement. The directive will formally establish the European Cyber Crises Liaison Organisation Network, EU-CyCLONe, which will support the coordinated management of large-scale cybersecurity incidents and crises. While under the old NIS directive member states were responsible for determining which entities would meet the criteria to qualify as operators of essential services, the new NIS2 directive introduces a size-cap rule as a general rule for identification of regulated entities.


Cybersecurity: How to do More for Less

When assessing your existing security stack, several important questions need to be asked: Are you getting the most out of your tools? How are you measuring their efficiency and effectiveness? Are any tools dormant? And how much automation is being achieved? The same should be asked of your IT stack: is there any bloat and technical debt? Across your IT and security infrastructure, there are often unnecessary layers of complexity in processes, policies and tools that can lead to waste. For example, having too many tools leads to high maintenance and configuration overheads, draining both resources and money. Similarly, technologies that combine on-premises infrastructure and third-party cloud providers require complex management and processes. IT and cybersecurity teams, therefore, need to work together with a clear shared vision to find ways to drive efficiency without reducing security. This requires clarity over roles and responsibilities between security and IT teams for asset management and deployment of security tools. It sounds straightforward but often is not, due to historic approaches to tool rollout.


Being Agile - A Success Story

To better understand the Agile methodology and its concepts, it is crucial to understand the Waterfall methodology. Waterfall is another famous Software Development Life Cycle (SDLC) methodology. This methodology is a strict and linear approach to software development. It aims at a significant project outcome. On the other hand, Agile methodology is an iterative method that delivers results in short intervals. Agile relies on integrating a feedback loop to drive the next iteration of work. The diagram below describes other significant differences between these methodologies. In Waterfall, we define and fix the scope and estimate the resources and time to complete the task. In Agile, the time and resources are fixed (called an "iteration"), and the work is estimated for every iteration. Agile helps estimate and evaluate the work that brings value to the product and the stakeholders. It is always a topic of debate as to which methodology to use for a project. Some projects are better managed with Waterfall, while others are an excellent fit for Agile. 


User Interface Rules That Should Never Be Overlooked

The most important user interface design rule that should never be overlooked is the rule of clarity. Clarity is critical when it comes to user interfaces, says Zeeshan Arif, founder and CEO of Whizpool, a software and website development company. “When you're designing an interface, you need to make sure your users understand what they can do at all times,” Arif advises. This means making sure that buttons are correctly labeled and that there aren't any unexpected changes or surprises that might confuse users. “If a button says ‘delete’, then it should delete whatever it's supposed to delete -- and only that thing,” he says. “If you have a button that does something else, then either make it a different color or label it differently, but don't put in something that looks like a delete button but doesn't actually delete anything.” Don't perplex users by designing a user interface crammed with superfluous options and/or features. “If you have too many buttons on one page, and none of them are labeled well enough for someone who isn't familiar with them, [users will] probably just give up before they even get started using your product, service, app, or website,” Arif says.


6 non-negotiable skills for CIOs in 2023

CIOs need to think about both internal integrations and external opportunities. They need to have strong relationships and be able to pull the business leaders together. For example, I’m working with an entrepreneurial organization that runs different lines of businesses that are very strong, with heads of those businesses who are also very strong. One of their challenges, however, is that their clients can be customers of multiple businesses. Between the seams, the client experiences the organizational structure of the business, which is a problem – a client should never experience your organizational structure. The person best equipped to identify and close those seams and integration points is the CIO. ... In the past, most organizations operated with a business group that sat between technology and the clients. The movement around agile, however, has knocked those walls down and today allows IT to become client-obsessed – we’re cross-functional teams that are empowered and organized around business and client outcomes. As a CIO, you need to spend time with clients and have a strong internal mission, too. You have to develop great leaders and motivate and engage an entire organization.



Quote for the day:

"A leader has the vision and conviction that a dream can be achieved._ He inspires the power and energy to get it done." -- Ralph Nader

Daily Tech Digest - February 25, 2020

5G's impact: Advanced connectivity, but terrifying security concerns


Despite the enthusiasm, professionals are also concerned about some of the negative aspects of 5G, specifically security and cost. The top barriers to adopting 5G in the next three years included security concerns (35%) and upfront investment (31%), the report found. The relationship between 5G and security is complex. Overall, the majority of respondents (68%) do believe 5G will make their businesses more secure. However, security challenges are also inherent to the network infrastructure, according to the report. These concerns involve user privacy (41%), the number of connected devices (37%), service access (34%), and supply chain integrity (29%). On the connected devices front, some 74% of respondents said they are worried that having more connected devices will bring more avenues for data breaches. With that said, the same percentage of respondents understand that adopting 5G means they will need to redefine security policies and procedures. To prepare for both security and cost challenges associated with 5G, the report recommended users seek external help. The partners businesses will most likely work with include software and services companies (44%), cloud companies (43%), and equipment providers (31%). 


What if 5G fails? A preview of life after faith in technology


"If it gets to a point where it's a broad decoupling of the developed from the emerging economies," said Sec. Lew, "that's not good for anyone. The growth of emerging economies would not be very impressive if they didn't have very active, robust trading relationships with developed economies. And the costs in developed economies would go up considerably, which means that the impact on consumers would be quite dramatic." "We know, from the early days when there was CDMA and GSM," remarked Greg Guice, senior vice president at Washington, DC-based professional consultancy McGuireWoods, "that made it very difficult to sell equipment on a global basis. That not only hurt consumers, but it hurt the pace of technology." He continued: I think what the companies that are building the equipment, and seeking to deploy the equipment, are trying to figure out is, in a world where there may be fragmentation, how do we manage this? I don't see people Balkanizing into their own camps; I think everybody is trying to preserve, as best they can, international harmonization of a 5G platform. Those efforts are in earnest.


Greenpeace takes open-source approach to finish web transformation


“The vision is to help people take action on behalf of the planet,” said Laura Hilliger, a concept architect at Greenpeace who is a leading member of the Planet 4 project. “We want to provide a space that helps people understand how our ecological endeavours are successful, and to show that Greenpeace’s work is successful because of people working collectively.” She met Red Hat representatives after work was already underway on the project in May 2018, which culminated in consultants, technical architects and designers from the company coming in to do a “design sprint” with Greenpeace exactly a year later. This helped Red Hat better understand Planet 4 users and how they interact with the platform, as well as the challenges of integration and effectively visualising data. Hilliger said variations in the tech stacks deployed across Greenpeace’s 27 national and regional offices, on top of its 50-plus websites and platforms, had created a complex data landscape that made integrations difficult.


Evolution of the data fabric


Personally, the fabric concept also began to change my thinking about infrastructure design. For too long it was focussed on technology, infrastructure and location, which would then be delivered to a business, upon which they would place their data. However, the issue with this was the infrastructure could then limit how we used our data to solve business challenges. Data fabric changes that focus, building our strategy based on our data and how we need to use it, a focus on information and outcomes, not technology and location. Over time, as our data strategies evolved with more focus on data and outcomes, it became clear that a consistent storage layer, while a crucial part of a modern data platform design, does not in itself deliver all we need. A little while ago I wrote a series of articles about Building a Modern Data Platform which described how a platform is multi-layered, requiring not just consistent storage; it must also be intelligent enough to understand our data as it is written, provide insight, apply security, and do these things immediately across our enterprise.


Legal Tech May Face Explainability Hurdles Under New EU AI Proposals


Horrigan noted the transparency language in the European Commission’s proposal is similar to the transparency principles outlined in the EU’s General Data Protection Regulation (GDPR). While the European Commission is still drafting its AI regulations, legal tech companies have fallen under the scope of the GDPR since mid-2018. Legal tech companies have also fielded questions regarding predictive coding’s accuracy and transparency with technology-assisted review (TAR), Horrigan added. TAR has become increasingly accepted by courts after then-U.S. Magistrate Judge Andrew Peck of the Southern District of New York granted the first approval of TAR in 2012. In Peck’s order, he discussed how predictive coding’s transparency provides clarity regarding AI-powered software’s “black box.” “We’ve addressed the black box before with technology-assisted review and we will do it again with other forms of artificial intelligence. The black box issue can be overcome,” Horrigan said. However, Hudek disagreed. While Hudek said the proposed regulation doesn’t make him hesitant to develop new AI-powered features for his platform, it does make that work more challenging.


Thinking About ‘Ethics’ in the Ethics of AI

Ethics by Design is “the technical/algorithmic integration of reasoning capabilities as part of the behavior of [autonomous AI]”. This line of research is also known as ‘machine ethics’. The aspiration of machine ethics is to build artificial moral agents, which are artificial agents with ethical capacities and thus can make ethical decisions without human intervention. Machine ethics thus answers the value alignment problem by building autonomous AI that by itself aligns with human values. To illustrate this perspective with the examples of AVs and hiring algorithms: researchers and developers would strive to create AVs that can reason about the ethically right decision and act accordingly in scenarios of unavoidable harm. Similarly, the hiring algorithms are supposed to make non-discriminatory decisions without human intervention. Wendell Wallach and Colin Allen classified three types of approaches to machine ethics in their seminal book Moral Machines.


Cisco goes to the cloud with broad enterprise security service

Cisco describes the new SecureX service as offering an open, cloud-native system that will let customers detect and remediate threats across Cisco and third-party products from a single interface. IT security teams can then automate and orchestrate security management across enterprise cloud, network, applications and endpoints. “Until now, security has largely been piecemeal with companies introducing new point products into their environments to address every new threat category that arises,” wrote Gee Rittenhouse, senior vice president and general manager of Cisco’s Security Business Group, in a blog about SecureX. “As a result, security teams that are already stretched thin have found themselves managing massive security infrastructures and pivoting between dozens of products that don’t work together and generate thousands of often conflicting alerts. In the absence of automation and staff, half of all legitimate alerts are not remediated.” Cisco pointed to its own 2020 CISO Benchmark Report, also released this week, as more evidence of the need for better, more tightly integrated security systems.


Evolution of Infrastructure as a Service


Some would say that IaaS, SaaS, and PaaS are part of a family tree. SaaS is one of the more widely known as-a-service models where cloud vendors host the business applications and then deliver them to customers online. It enables customers to take advantage of the service without maintaining the infrastructure required to run software on-premises. In the SaaS model, customers pay for a specific number of licenses and the vendor manages the behind-the-scenes work. The PaaS model is more focused on application developers and providing them with a space to develop, run, and manage applications. PaaS models do not require developers to build additional networks, servers or storage as a starting point to developing their applications. ... IaaS is now enabling more disruption across all markets and industries as the same capabilities available to larger companies are now also available to the smallest startup in a garage. This includes advances in AI and Machine Learning (as a service), data analytics, serverless technologies, IoT and much more. This is also requiring large companies to be as agile as a startup.


AI Regulation: Has the Time Arrived?


Karen Silverman, a partner at international business law firm Latham & Watkins, noted that regulation risks include stifling beneficial innovation, the selection of business winners and losers without any basis, and making it more difficult for start-ups to achieve success. She added that ineffective, erratic, and uneven regulatory efforts or enforcement may also lead to unintended ethics issues. "There's some work [being done] on transparency and disclosure standards, but even that is complicated, and ... to get beyond broad principles, needs to be done on some more industry- or use-case specific basis," she said. "It’s probably easiest to start with regulations that take existing principles and read them onto new technologies, but this will leave the challenge of regulating the novel aspects of the tech, too." On the other hand, a well-designed regulatory scheme that zeroes in on bad actors and doesn't overregulate the technology would likely mark a positive change for AI and its supporters, Perry said.


Functional UI - a Model-Based Approach


User interfaces are reactive systems which are specified by the relation between the events received by the user interface application and the actions the application must undertake on the interfaced systems. Functional UI is a set of implementation techniques for user interface applications which emphasizes clear boundaries between the effectful and purely functional parts of an application. User interfaces' behavior can be modeled by state machines that, on receiving events, transition between the different behavior modes of the interface. A state machine model can be visualized intuitively and economically in a way that is appealing to diverse constituencies (product owner, testers, developers), and surfaces design bugs earlier in the development process. Having a model of the user interface makes it possible to auto-generate both the implementation and the tests for the user interface, leading to more resilient and reliable software. Property-based testing and metamorphic testing leverage the auto-generated test sequences to find bugs without having to define the complete and exact response of the user interface to a test sequence. Such testing techniques have found 100+ new bugs in two popular C compilers (GCC and LLVM).
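
As a rough illustration of the state-machine idea (the login-form states, events and actions below are our own, not taken from the article), the transition logic can be written as a pure lookup that is easy to visualize and test, while all effectful rendering stays outside it.

```python
# Minimal sketch of a UI modeled as a state machine: states, events, and
# a pure transition function kept separate from any effectful rendering.
TRANSITIONS = {
    ("idle",       "SUBMIT"):  ("submitting", "show_spinner"),
    ("submitting", "SUCCESS"): ("logged_in",  "show_dashboard"),
    ("submitting", "FAILURE"): ("error",      "show_error"),
    ("error",      "SUBMIT"):  ("submitting", "show_spinner"),
}

def transition(state, event):
    """Pure function: (state, event) -> (next_state, action).
    Unknown events leave the state unchanged and trigger no action."""
    return TRANSITIONS.get((state, event), (state, None))

# The effectful shell interprets the actions; the logic above stays testable
# and can drive auto-generated test sequences.
state = "idle"
for event in ["SUBMIT", "FAILURE", "SUBMIT", "SUCCESS"]:
    state, action = transition(state, event)
    print(event, "->", state, action)
```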




Quote for the day:


"There is no 'one' way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer


Daily Tech Digest - July 23, 2018

Most of AI’s Business Uses Will Be in Two Areas


The business areas that traditionally provide the most value to companies tend to be the areas where AI can have the biggest impact. In retail organizations, for example, marketing and sales has often provided significant value. Our research shows that using AI on customer data to personalize promotions can lead to a 1-2% increase in incremental sales for brick-and-mortar retailers alone. In advanced manufacturing, by contrast, operations often drive the most value. Here, AI can enable forecasting based on underlying causal drivers of demand rather than prior outcomes, improving forecasting accuracy by 10-20%. This translates into a potential 5% reduction in inventory costs and revenue increases of 2-3%. While applications of AI cover a full range of functional areas, it is in fact in these two cross-cutting ones—supply-chain management/manufacturing and marketing and sales—where we believe AI can have the biggest impact, at least for now, in several industries. Combined, we estimate that these use cases make up more than two-thirds of the entire AI opportunity.



How SD-WAN Will Make The Cloud Much Much Bigger

The need to be connected to the mother ship is what brings the Cloud into its meaningful existence because we live and work at the edges of the Cloud. SD-WAN is not just a market but a platform as well that will eventually evolve into user-defined WAN (UD-WAN). To clarify, the term applies to enterprise users and not consumers. And the purpose of SD-WAN is to connect and fully integrate the very edges of the enterprise – be it corporate headquarters, branch/remote offices or the mobile millions. In other words, us, the users. But if we look at the concept of the cloud it is pretty clear that it is referenced in an abstract form. After all what is this cloud thing? Some physical space in a non-descript windowless warehouse? Without its tentacles, the cloud is nothing more than a collection of computers, storage and cooling systems created by geeks and for what purpose? It is those very tentacles in the form of wide-area networks (WAN) that give the Cloud its purpose. And given the explosive adoption of cloud-based applications (Box, Dropbox, Salesforce, SAP, Slack, etc.) cloud computing is not a fad, it is here to stay. However, that is just the beginning.


The value of superior UX? Priceless, but awfully hard to measure

The problem, Cooper continues, is that managers and executives outside of the bubble remain skeptical about investing any more than they have to in UX -- to them, it's a dark art. So, they ask: "What is the ROI of UX?" Asking about ROI, of course, is a manager's way of expressing doubts. "They aren't seeking enlightenment," Cooper says. ... In UX design, he continues, "ROI is often about eliminating poor design." Some industry specialists have attempted to put a monetary value on superior UX design. A recent report from CareerFoundry estimates that UX design work delivers a 100-fold return on investment, without even counting the soft benefits. Every $1 investment in UX translates to returns of at least $100, the report's authors illustrate -- mainly through e-commerce and customer-facing interactions. Add to this the softer, but just as important, ancillary benefits: "fewer support calls, increased customer satisfaction, reduced development waste, and lower risk of developing the wrong idea."


Why techmatters – the challenge for everyone in the UK tech community


If there is a magic recipe for digital innovation, then the UK surely has all the ingredients. We have created and attracted some of the world’s best and most diverse digital talent. We have world-leading businesses, universities and powerful ecosystems that enable expertise to spill over from one part of the economy to another. In almost every sector, I can point to world leaders on the cutting edge of digital transformation. Above all, we have ambition and we have each other. What sets us apart from any other country is that in the UK technology community, we stand on the shoulders of each other. But to really thrive, three things are important. We must stay focused on making tech work for people and our economy. We must not underestimate our international competitors. And, perhaps most importantly, we must accept the enormous responsibility that comes with developing powerful technology. We do have great people in this sector – but we simply don’t have enough of them. And we don’t have the depth of skills and talent that the economy needs as a whole. This, surely, is our biggest challenge.


Why Artificial Intelligence Is Not a Silver Bullet for Cybersecurity

While AI is likely to work quite well over a strictly controlled network, the reality is much more colorful and much less controlled. AI's Four Horsemen of the Apocalypse are the proliferation of shadow IT, bring-your-own-device programs, software-as-a-service systems, and, as always, employees. Regardless of how much big data you have for your AI, you need to tame all four of these simultaneously — a difficult or near-impossible task. There will always be a situation where an employee catches up on Gmail-based company email from a personal laptop over an unsecured Wi-Fi network and boom! There goes your sensitive data without AI even getting the chance to know about it. In the end, your own application might be protected by AI that prevents you from misusing it, but how do you secure it for the end user who might be using a device that you weren't even aware of? Or, how do you introduce AI to a cloud-based system that offers only smartphone apps and no corporate access control, not to mention real-time logs? There's simply no way for a company to successfully employ machine learning in this type of situation.


Unsecured server exposes 157 GB of highly sensitive data from Tesla, Toyota and more

The unsecured trade secrets and corporate documents had been exposed via the file transfer protocol rsync. UpGuard wrote, “The rsync server was not restricted by IP or user, and the data set was downloadable to any rsync client that connected to the rsync port. The sheer amount of sensitive data and the number of affected businesses illustrate how third- and fourth-party supply chain cyber risk can affect even the largest companies. The automation and digitization of manufacturing has transformed the industry, but it has also created a new area of concern for industries, and one that must be taken seriously for organizations to thrive in a healthy digital ecosystem.” Not only could anyone connect to Level One’s rsync server, but it was also “publicly writable, meaning that someone could potentially have altered the documents there, for example replacing bank account numbers in direct deposit instructions, or embedding malware.” The exposed rsync server was discovered on July 1. Attempts to contact Level One started on July 5, but contact wasn’t established until July 9. The exposure was closed within a day, by July 10.


Organizations Need IT Experts Who Know Basic LAN/WAN Switching and Routing

Responsiveness, security, and reliability are the new hallmarks of networking. Automation, analytics, IoT, policy-based network management, programmability, and virtualization are enabling these changes. The technologies, and the ways they’re being applied, are new. So, IT and networking professionals need new skills to make them work for businesses. In order to appeal to hiring managers, boost their careers, and bring greater value to employers there are fundamental skills that IT and networking professionals need. At a very fundamental level, it’s critical that IT experts know the basics of LAN and WAN switching and routing. These skills will help network engineers configure, verify, troubleshoot, and secure today’s networks. In addition, the evolution of the network creates a growing need for IT professionals who can implement and manage software-centric networks. This involves using APIs, controllers, policies, and virtualization. These technologies and tools allow for greater automation, network intelligence, and agility.


The Engineer’s guide to the future


If AR is hyped, AI is basically the buzzword of the century. Lots of people aren’t really sure what it means, but they know it’s important and that their business needs it. The first thing to know is that modern day Artificial Intelligence doesn’t actually mean a computer being intelligent — it’s basically a catch-all term for computer programs that can “learn”, to improve their operational efficiency or their success. Even at that, lots of applications that say they use AI actually don’t. A chatbot that has a big decision tree in the background isn’t AI, it’s just a big decision tree. If you ask “What is Ragnarok?” and get back the answer “It is simultaneously a great action movie and the ruin of a good character” — it’s probably not artificial intelligence, just quite wise. However, there is plenty of amazing work being done with proper AI and Machine Learning, for a whole heap of use-cases. We don’t need a crystal ball to say that knowing about AI will be beneficial for a future engineering career. Similar to Apple and Google releasing tools to “democratise” Augmented Reality development, each year there are more tools available to enable developers to build AI solutions.


In the wake of GDPR, college IT security programs need to evolve

While U.S. universities who offer information security programs typically cover a range of compliance concepts related to U.S. regulations, such as the Health Insurance Portability and Accountability Act (HIPAA) or Sarbanes-Oxley (SOX), the GDPR is something of a game changer because it is not a regulation enacted by a U.S. agency, yet it requires compliance on the part of U.S. entities. The GDPR is only the first of several proposed global regulations governing data privacy. Before 2015, data exchanges between the U.S. and the EU were governed by the Safe Harbor program which allowed the personal data of EU citizens to be exchanged with U.S. providers as long as both sides of the transaction complied loosely with the EU Data Protection Directive. The directive wasn’t as tightly defined as the GDPR and lacked teeth in the form of significant fines or penalties. As a result, up to this point in time, U.S. businesses have not had to unduly concern themselves with regulations enacted outside U.S. borders. GDPR demands a change in that mindset.


Can businesses use blockchain to solve the problem of data management?

Since the nodes are distributed and operate peer-to-peer, the possibility of bottleneck formation is nonexistent. One of the most important features of blockchain systems, however, is immutability: once an entry is appended to the database, it cannot be removed. Using blockchain for databases seems like a logical step forward. There’s definitely an emerging movement seeking to lay the foundations for a decentralised architecture across industries. With blockchain, a marketplace akin to AirBnB or Uber can materialise for storing data – nodes on the network can be incentivised to replicate and retain information using a blockchain protocol’s inbuilt payment layer. This concept can be taken a step further with the use of sharding and swarming. Sharding offers a greater degree of privacy whereby, instead of sending a file to other nodes, you distribute fragments of said file. In this way, the owner can be sure that those in possession of their data cannot access it, as they will only hold a small (and unreadable) piece – much like torrenting.
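
A toy Python sketch of the sharding idea described here (purely illustrative; real systems add encryption, erasure coding and replication on top of a blockchain protocol's payment and incentive layer): the file is split into fragments so that no single node holds a readable copy, and content hashes let the owner verify each fragment on retrieval.

```python
import hashlib

def shard_file(data: bytes, num_shards: int):
    """Split a file into fragments so no single holder has a readable copy.
    This only shows the splitting idea, not a full storage protocol."""
    size = -(-len(data) // num_shards)            # ceiling division
    shards = [data[i * size:(i + 1) * size] for i in range(num_shards)]
    # Content hashes let the owner verify returned shards later.
    digests = [hashlib.sha256(s).hexdigest() for s in shards]
    return shards, digests

shards, digests = shard_file(b"confidential business records ...", 4)
print(len(shards), digests[0][:16])
```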



Quote for the day:


"Authentic leaders are not afraid to make mistakes, but they fix them faster than they make them." -- George Bernard Shaw


Daily Tech Digest - July 22, 2018

By reducing manual intervention, automated processes can minimise mistakes and human error – but there is still the chance that something can go wrong. Designers of automated processes need to ensure that the appropriate quality outcomes are being measured and assessed against a given specification. Importantly, this must happen throughout the entire process. Let’s think about the car production line again. The cost of finding out that something went wrong at the start of the production process after the car has been built is significant. Instead, process designers will want to identify errors quickly and allow the process to make the necessary changes to ensure a quality product is delivered. A significant quantity of data is generated through automated processes, but the quantity of data does not compensate for the quality of the data. In order to deliver a quality product at the end of an automated process, a quality data management process is critical. But what is bad data? And, if everything is being automated anyway, why should we care?


Python has brought computer programming to a vast new audience


Not all Pythonistas are so ambitious, though. Zach Sims, Codecademy’s boss, believes many visitors to his website are attempting to acquire skills that could help them in what are conventionally seen as “non-technical” jobs. Marketers, for instance, can use the language to build statistical models that measure the effectiveness of campaigns. College lecturers can check whether they are distributing grades properly. For professions that have long relied on trawling through spreadsheets, Python is especially valuable. Citigroup, an American bank, has introduced a crash course in Python for its trainee analysts. A jobs website, eFinancialCareers, reports a near-fourfold increase in listings mentioning Python between the first quarters of 2015 and 2018. The thirst for these skills is not without risk. Cesar Brea, a partner at Bain & Company, a consultancy, warns that the scariest thing in his trade is “someone who has learned a tool but doesn’t know what is going on under the hood”. Without proper oversight, a novice playing with AI libraries could reach dodgy conclusions.


Top 10 Data Science Use Cases in Insurance


Customers always want personalized services that match their needs and lifestyle. The insurance industry is no exception. Insurers face the challenge of ensuring digital communication with their customers to meet these demands. Highly personalized and relevant insurance experiences are delivered with the help of artificial intelligence and advanced analytics, which extract insights from vast amounts of demographic data, preferences, interactions, behavior, attitudes, lifestyle details, interests, hobbies, etc. Consumers tend to look for personalized offers, policies, loyalty programs, recommendations, and options. The platforms collect all the possible data to define the major customer requirements. After that, a hypothesis on what will or won't work is made. ... Modern technologies have brought the promotion of products and services to a qualitatively new level. Different customers tend to have specific expectations of the insurance business. Insurance marketing applies various techniques to increase the number of customers and to ensure targeted marketing strategies. In this regard, customer segmentation proves to be a key method.


The Evolution Of Data


Traditionally, a platform was used to address an enterprise process workflow — human resources (HR), finance, manufacturing, etc. They are what we categorize as enterprise resource planning (ERP), customer relationship management (CRM), human capital management (HCM), functional setup manager (FSM), information technology operations (ITOps), etc. The data generated by these workflows was then analyzed using analytics or business intelligence applications to make further modifications to workflow. These workflow applications were customized as the data warranted any changes in the workflow. ... The workflow actions will be passed on to the traditional applications or directly to the people or system that will perform the actions. These new systems of intelligence will emerge and will force existing workflow applications to change to be end-user targeted. We are already seeing a trend where AI platforms are slowly becoming a playground for new intelligent applications. More importantly, because open source intelligent platforms in this area are as rich as the enterprise platforms, we are also noticing new generations of applications.


6 trends that are changing the face of UX

A pattern library acts as a centralised hub for all components of the user interface. Effective pattern libraries provide pattern descriptions, annotations and contextual information. They also showcase the code and pattern variations, and have the ability to add real data into the pattern structure. Once a design system is up and running, it’s only the first step in the journey. It needs to be living. Nathan Curtis, a co-founder of UX firm EightShapes, says: “A design system isn’t a project. It’s a product, serving products.” Like any good product, a design system needs maintenance and improvements to succeed. Both Google and Salesforce have teams dedicated to improving their design systems. The goal is a workflow where changes to the design system update the documentation and the code. The benefits realised by a thoughtful, unified design system outweigh the effort involved in establishing one. There is a consistency across the entire user experience. Engineers and designers share a common language and systems are more sustainable. Designers can spend their time solving harder problems and improving the actual user experience.


What’s so special about 5G and IoT?

If we think about our current needs for IoT, what we care about are three things: price, coverage, and lower power consumption. But 5G is focused on increasing bandwidth, and while increased data transfer and speeds are nice, they are not entirely necessary for IoT products. The GSMA outlines that 5G could offer 1000x bandwidth per unit area. However, as they state in their own report, bandwidth per unit area is not dependent upon 5G, but on more devices connecting with higher bandwidths for longer durations. While it is great that 5G aims to improve this service, the rollout of LTE has already had a significant effect on bandwidth consumption. We should be excited about continued incremental improvements on Cat-M1 and NB-IoT as we get even lower cost and lower power solutions for our IoT applications. Unlike LTE, 5G lacks a solid definition, which means cellular providers could eventually label a slightly-faster-than-LTE connection as 5G. And truly, the only thing that is certain about 5G is we won’t know what it can and cannot do until it arrives.


Microsoft's Linux love-in continues as PowerShell Core comes to Ubuntu Snap Store

Evidence of that newfound affection has been evident throughout 2018: with Ubuntu 18.04 being made available in the Microsoft Store, Windows File Explorer gaining the ability to launch a Linux Shell and a new option to install Windows Subsystem for Linux (WSL) distros from the command line. That's without mentioning Microsoft's release of the Linux-based Azure Sphere operating system. Now Microsoft has released its command-line shell and scripting language PowerShell Core for the Ubuntu Snap Store, as part of PowerShell Core's release as a snap package. Snap packages are containerized applications that can be installed on many Linux distributions, which Joey Aiello, PM for PowerShell at Microsoft, says has several advantages. "Snap packages carry all of their own dependencies, so you don't need to worry about the specific versions of shared libraries installed on your machine," he said, adding updates to Snaps happen automatically, and are "safe to run" as they don't interact with other applications or system files without your permission.


Why Design Thinking Should Also Serve As A Leadership Philosophy

The key here, from a leadership standpoint, is simply to drop the ego. Sweep aside titles and preconceptions about where audience insight should come from. Instead of defaulting to traditional techniques for collecting customer insight, seek it out wherever you can. Find the people who are best equipped to provide an insider's look at your customers' preferences and dislikes, whether those people are sitting in a focus group or across from you on the subway, so you can be sure you'll be giving your customers exactly what they want. Adopting a human-centric mindset can help you turn even the most fragmented experiences into seamless interactions between customer and brand. It's an investment in the customer journey that can build long-term loyalty and trust. Often, dissecting the user experience also reveals new product markets, audience segments and customer service platforms that can lead to future growth. When you consider what's at the heart of your business problem and break down the barriers between your company and your customers, it quickly becomes clear that design thinking can alter your leadership approach for the better.


Managing Engineering Complexity: Are You Ready?

So, here is the complexity loop we are in: customers demanding more capabilities leads to more complexity in IoT systems, which constantly feed data into the development process, leading to new security and safety standards requirements, new use cases, and the need to adapt quickly to changes that companies cannot always predict. These pressures, in turn, create demand for even more complex IoT systems; with each change, new customer demands arise and the loop continues perpetually. Let’s zoom in for a second and see what that means for one of the most exciting industries today – automotive engineering, i.e., how we build a car. What characterizes the OEM leaders today is a desire for speed in product development and an ability to overcome the complexity of connecting requirements, design, development, validation, and deployment within their engineering process and throughout their supply chain. And how do they do that?


The Role of Randomization to Address Confounding Variables in Machine Learning

Machine learning practitioners are typically interested in the skill of a predictive model and less concerned with the statistical correctness or interpretability of the model. As such, confounding variables are an important topic when it comes to data selection and preparation, but less important than they may be when developing descriptive statistical models. Nevertheless, confounding variables are critically important in applied machine learning. The evaluation of a machine learning model is an experiment with independent and dependent variables. As such, it is subject to confounding variables. What may be surprising is that you already know this and that the gold-standard practices in applied machine learning address this. ... Randomization is a simple tool in experimental design that allows the confounding variables to have their effect across a sample. It shifts the experiment from looking at an individual case to a collection of observations, where statistical tools are used to interpret the finding.
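
To make the idea concrete, here is a minimal sketch (not from the original article) of how randomization is typically applied when evaluating a model. It assumes scikit-learn and NumPy are available and uses a small synthetic dataset; shuffling before splitting and cross-validating spreads any confounder tied to record order, such as collection time, across all folds rather than concentrating it in one of them.

```python
# Minimal sketch of randomization in model evaluation.
# Synthetic data; the feature construction and model choice are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score, train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 5))                                   # 500 observations, 5 features
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int) # noisy target

# Randomized hold-out split: shuffling spreads any ordering-related
# confounder (e.g., collection date) across train and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=True, random_state=0
)
model = LogisticRegression().fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))

# Shuffled k-fold cross-validation: each fold receives a random mix of
# observations, so confounders affect all folds roughly equally.
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(), X, y, cv=cv)
print("cross-validation accuracy:", scores.mean())
```

Fixing the random seed keeps the experiment reproducible while still giving each fold a random mix of observations, which is the gold-standard practice the passage refers to.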



Quote for the day:

"Leaders must be good listeners. It_s rule number one, and it_s the most powerful thing they can do to build trusted relationships." -- Lee Ellis

Daily Tech Digest - February 28, 2017

FinTech unleashed: Why banks and FinTech have a love-hate relationship

Banks, asset managers, wealth advisors and insurance companies once competed only in their silos. While they still do today, they also face competition from non-traditional market players with new skills, funding sources, and approaches. In a prolonged low-interest-rate environment and under a more complex regulatory regime, many have been driven to rely on cost containment as the key to success. Others are scrambling for top-line growth (both organically and through acquisition) in a search for new revenue opportunities. Getting back to technology, the nature of the FinTech narrative has been evolving over the past few years, and the pace of technology change continues to accelerate. Rapidly evolving advances in artificial intelligence across chatbots, robo-advisors, claims, underwriting, IoT and, soon, blockchain add another layer of potential to further shake up the traditional business model.


Ransomware Getting More Targeted, Expensive

“Actors engaging in this targeting strategy are also charging ransoms based on the number of host (or servers) infected,” the FBI warned. “Additionally, recent victims who have been infected with these types of ransomware variants have not been provided the decryption keys for all their files after paying the ransom, and some have been extorted for even more money after payment.” According to the FBI, this recent technique of targeting host servers and systems “could translate into victims paying more to get their decryption keys, a prolonged recovery time, and the possibility that victims will not obtain full decryption of their files.” ... “People behind these scams seem to be setting different rates for different countries,” Abrams said. “Victims in the U.S. generally pay more than people in, say, Spain.”


Digitization inches towards the mainstream

Most CIOs joke that their transformations are never truly complete as they embrace emerging technologies, including the internet of things, artificial intelligence and blockchain, but some sectors are further along than others. Media and entertainment (62 percent), along with retail (55 percent) and high-tech (54 percent), tend to be ahead in their digitization efforts compared with sectors such as consumer packaged goods (31 percent), automotive (32 percent) and financial services (39 percent). Industries hovering in the digital middle include healthcare (51 percent), telecom (44 percent) and professional services (42 percent). McKinsey also found that digitization levels vary by business operation. For example, 49 percent of survey respondents say customer-focused areas such as marketing and distribution are primary focuses of their digital strategies.


Stanford experts urge healthcare professionals to harness power of people’s mindsets

“It should be about designing a formal curriculum for medical school that weaves all of this throughout the training,” Leibowitz said. “So it’s not just mentioned in one or two classes or taught for one semester and then forgotten about.” The experts also called for a reform of standard randomized trials in the healthcare system. When examining the effects of a new drug, researchers should include natural conditions, which don’t use placebos, alongside conditions that include altered social context and mindset. This, Crum said, will help researchers understand how beliefs, labels and context can help magnify or reduce the effects of the drug and treatment. These reforms, however, would require additional rigorous research that builds more scientific evidence for the importance of the effects of social context and mindsets, they said.


IT orgs enlist startups to address container security concerns

Startups have begun to make a name for themselves with IT organizations, as their products address container security concerns. Network-based attacks and exploits on IT infrastructure aren't new, but container technology, popularized by Docker, demands a new way to address time-honored problems. For example, containers spin up and disappear far faster and more often than VMs, so container security policies must follow an ever-changing infrastructure. Containers also tend to rely on overlay networks, which can be difficult to visualize with traditional network monitoring tools. ... It's not uncommon for startups to pop up around new technologies, according to analysts, but there are pros and cons to trusting a startup's product as part of an IT infrastructure. A big pro for many large IT organizations is that they can play a part in shaping the roadmap of an early-stage vendor, and possibly an entire market space.
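
As a rough illustration of why security policy has to track such short-lived infrastructure, the sketch below listens to container lifecycle events so that a policy could follow containers as they appear and disappear. It assumes the Docker SDK for Python (the `docker` package) and a running local Docker daemon; the `apply_policy` and `remove_policy` helpers are hypothetical stand-ins for whatever a security product would actually do.

```python
# Hedged sketch: follow container lifecycle events so security policy can
# keep pace with short-lived containers. Requires the 'docker' Python SDK
# and a running Docker daemon; the policy helpers below are hypothetical.
import docker

client = docker.from_env()

def apply_policy(container_id: str, attributes: dict) -> None:
    # Placeholder for real enforcement, e.g., attaching network rules
    # or registering the container with a monitoring backend.
    print(f"apply policy to {container_id[:12]} ({attributes.get('image')})")

def remove_policy(container_id: str) -> None:
    print(f"remove policy from {container_id[:12]}")

# client.events(decode=True) yields daemon events as dicts as they occur.
for event in client.events(decode=True):
    if event.get("Type") != "container":
        continue
    actor = event.get("Actor", {})
    if event.get("Action") == "start":
        apply_policy(actor.get("ID", ""), actor.get("Attributes", {}))
    elif event.get("Action") in ("die", "destroy"):
        remove_policy(actor.get("ID", ""))
```

The point of the sketch is the event-driven shape of the problem: policy cannot be attached to a fixed inventory of machines, it has to follow an infrastructure that changes by the second.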


Artificial Intelligence: Removing The Human From Fintech

If and when AI becomes more prevalent in the fintech industry, the same will happen. This is the thing with technology: a new system can take years, or even decades, to create, but it can take even longer for customers to adopt and, more importantly, trust it. Alongside this, with films like Ex Machina showing society what could potentially happen, as Pesenti alluded to, the negativity surrounding AI could mean the service takes even longer to be adopted. On the other hand, the millennial generation seems to welcome and encourage new technology - cellphone apps are a perfect example of how quickly new systems can enter the marketplace, so it could be said that this is the area in which AI could potentially blossom.


Google Shifts On Email Encryption Tool Leaving Its Fate Unclear

The tool is designed to work as an extension to Google's Chrome browser that uses the OpenPGP standard to encrypt emails, ensuring that only the recipient can read them -- and not the email provider or a government. The main goal of Google's project was to make OpenPGP easier to use. It was announced amid growing scrutiny over U.S. surveillance efforts following disclosures from noted leaker Edward Snowden. However, the search giant hasn't made the extension officially available on its Chrome Web Store. Instead, the project's source code has only been made available on GitHub, a software collaboration site, making the extension harder to install, especially for non-technical users.
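
The underlying OpenPGP idea, encrypting to a recipient's public key so that neither the provider nor anyone intercepting the message can read it, can be sketched with the `python-gnupg` wrapper. This is only an illustration of the standard, not Google's extension; it assumes GnuPG is installed locally, and the key file path and recipient address are placeholders.

```python
# Hedged sketch of OpenPGP encryption: only the holder of the matching
# private key can decrypt the output. Assumes GnuPG is installed and the
# 'python-gnupg' package is available; the key path and recipient address
# below are illustrative placeholders, not values from the article.
import gnupg

gpg = gnupg.GPG()  # uses the default GnuPG home directory

# Import the recipient's public key (placeholder file path).
with open("recipient_public_key.asc") as f:
    gpg.import_keys(f.read())

encrypted = gpg.encrypt(
    "Message body the mail provider never needs to see.",
    "recipient@example.com",   # recipient key identifier (placeholder)
    always_trust=True,         # skip the web-of-trust check for this sketch
)

if encrypted.ok:
    print(str(encrypted))      # ASCII-armored ciphertext, safe to paste into an email
else:
    print("encryption failed:", encrypted.status)
```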


20 Cybersecurity Startups To Watch In 2017

In spite of a slowdown in overall funding activity from venture capital firms in 2016, the cybersecurity market continued to raise money at full steam. Last year saw the market break records in terms of funding deals, with Q3 the most active quarter for cybersecurity deals in the last five years, according to CBInsights. That influx of money is driving innovation in a number of areas. Particularly notable market segments targeted by these firms include security for data centers and public cloud infrastructure, security orchestration and incident response tools, and third-party risk assessment tools. The following 20 firms are primarily early- to middle-stage startups, with a few more mature startups that have courted growth equity to change course or expand into a particularly hot new market segment. We believe these firms are worth watching due to several factors.


Are You Over-Confident on Cyber Security Risks?

"Consumers vastly underestimate cybersecurity threats and don't know how to identify, respond or protect themselves from future attacks," said David Blumberg, founder and managing partner of Blumberg Capital. "Naiveté and arrogance are a really dangerous combination. The cybersecurity landscape is complex and ever-evolving. Bad actors are constantly finding new ways to bypass security measures to infiltrate confidential systems and steal information or sabotage infrastructure. Even experts can miscalculate how to mitigate risks and existing security solutions are no longer enough, especially in areas such as IoT or cloud security. At Blumberg Capital, we support companies at the forefront of innovation in cybersecurity. We partner with innovative startups creating new ways to minimize cybersecurity threats and protect personal, business and government information."


A Tale Of Two User Experiences

Although the products and software are of undoubted quality, what’s remarkable is the fit and finish of the process the user goes through. The selection cycle and the actual purchase steps are streamlined, taking into account how busy I am.  Then there’s the initial product experience, which is the box. Apple’s process dictates that I feel it and appreciate it before I open it. Really. But the fact that they want me to go through this tactile experience is an indication of how seriously they take the first impressions of their product, and the implied quality of every part of the product experience. In subsequent steps, they want me to touch the product and use the UI in low-risk interactions that provide the most non-threatening training experience. Even though migrating the old phone’s data and configuration had built-in complexity and potential for blind alleys, it didn’t feel like it.



Quote for the day:


"Encourage the small steps in order to see the big steps achieved." -- Gordon Tredgold