How to Write Your First Script in Selenium WebDriver 5

In this blog, you will get acquainted with Selenium automation testing and learn the basic steps for writing a script in Selenium WebDriver, with sample code to get you started.


This tutorial gives automation testers the basic knowledge needed to write a script in Selenium WebDriver, along with practical examples. It contains enough to get you started, from where you can independently progress to higher levels of expertise.


Before starting this tutorial, you need basic knowledge of Java or another object-oriented language. You should also be familiar with fundamental testing concepts, especially Selenium automation testing.

Getting Started With Selenium WebDriver Scripting

Assume that you want to write a script in Selenium WebDriver that can:

  • Fetch Fox News’ homepage
  • Verify its title
  • Print out the result of the comparison
  • Close the browser before ending the program

WebDriver Code

Below is the Selenium WebDriver example implementing the scenario above.

Note: Mozilla's geckodriver must be taken into account when driving Firefox with WebDriver. Selenium 3.0, Firefox, and geckodriver have known compatibility issues, and configuring them correctly can be tricky. If the script fails to launch the browser, try downloading an older version of Firefox, or run your Selenium script on Chrome instead. Only three lines of code need to change to switch the script between Firefox and Chrome.
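The original post's sample code was in Java; the sketch below shows the same four steps using Selenium's Python bindings instead (the logic is identical in Java). The URL and EXPECTED_TITLE are assumptions about the live page, and Chrome requires chromedriver on your PATH.

```python
# EXPECTED_TITLE is an assumption about the live page's <title>;
# real titles often carry extra text, so a substring check is used.
EXPECTED_TITLE = "Fox News"

def title_matches(actual, expected=EXPECTED_TITLE):
    # Pure comparison step, kept separate so it can be checked without a browser.
    return expected in actual

def run_check():
    from selenium import webdriver  # needs chromedriver on your PATH
    driver = webdriver.Chrome()
    try:
        driver.get("https://www.foxnews.com/")    # 1. fetch the homepage
        ok = title_matches(driver.title)          # 2. verify its title
        print("Title matched!" if ok
              else "Title mismatch: %r" % driver.title)  # 3. print the result
    finally:
        driver.quit()                             # 4. close the browser
```

Call run_check() to execute the scenario; to target Firefox instead, swap webdriver.Chrome() for webdriver.Firefox() (with geckodriver installed, per the note above).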

Build Your First Serverless Website with Alibaba Cloud

Serverless has been a prevalent buzzword among developers recently, but do you really know how to take full advantage of it? This tutorial will help you create a small, serverless personal webpage using Alibaba Cloud.

What Is Serverless?

Serverless is a new computing paradigm in which you build applications composed of microservices that run in response to events. Under this model, services scale automatically with usage, and you are only charged while they execute, making it the purest "pay-as-you-go" model yet. This reduces the overall maintenance cost of your apps, letting you focus on the logic and deploy faster.

The classic approach to publishing a website on your own is to have a web server, such as an Elastic Compute Service (ECS) instance, running non-stop: 24 hours a day, 7 days a week. Running the numbers, that is about 730 hours per month, roughly $5–$15 every month depending on the plan, plus all the security and software updates involved in managing a server. I'm sure you'd prefer to spend that time and money on something more enjoyable, like a day in the mountains.

What Will We Build in This Tutorial?

For the purpose of learning, we will keep things simple and clear by setting up a simple “About Me” page. For this, we will leverage Object Storage Service (OSS), API Gateway, Function Compute, and Table Store. To make that happen, the idea is to host a static HTML file in an OSS bucket with Public Read Permissions. This page will have a visit counter that will communicate via API Gateway with Function Compute, and keep the visits count on Table Store. Does it sound complex? Keep reading, you will find it easy.

Let’s Get Started

To start, create a folder for the project. Once inside it, create the file .env for the credentials and leave it there for now.

Install Fun

This tutorial is based on "Fun," an official Alibaba Cloud tool for orchestrating serverless application resources. We will use it to deploy our website backend by describing the resources in a template.yml file.
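As a reference point, a minimal template.yml might look like the sketch below. The service and function names are placeholders, and the exact schema should be checked against the current Fun documentation:

```yaml
ROSTemplateFormatVersion: '2015-09-01'
Transform: 'Aliyun::Serverless-2018-04-03'
Resources:
  about-me-service:            # placeholder service name
    Type: 'Aliyun::Serverless::Service'
    counter:                   # placeholder function name
      Type: 'Aliyun::Serverless::Function'
      Properties:
        Handler: index.handler
        Runtime: nodejs8
        CodeUri: './'
```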

To install Fun, run npm install @alicloud/fun -g in your terminal. To verify that the installation went OK, type fun -h and check that the Fun help is printed.

The Do’s and Don’ts of Running Data Quality Projects

Data quality projects are becoming collaborative and team-driven. As organizations strive to accomplish their digital transformation initiatives, data professionals are realizing they need to work as a team with business operations so everyone has the quality data they need to succeed. Chief Data Officers need to master some simple but useful do's and don'ts of running their data quality projects.

Data Quality Do’s

Set Your Expectations From the Start

Start by connecting the data quality issues with business outcomes. For example, when a marketing team realizes that 20% of their activities will never reach their target due to data quality issues, they’ll be more likely to get on board with the data quality project. Keep in mind, however, that this is an ongoing process and that perfect data might never exist. Set intermediate goals, realistic expectations and make sure you measure each success.

Build Your Interdisciplinary Team

Data has become a serious business in digital transformation, and as a result, a growing number of people within different lines of business have become data-savvy. All of these people individually complain that they spend 80% of their time crunching the data before they can turn it into something useful. So, what if everyone combined their talents and worked as a team? This is your opportunity to make data a team sport. Establish a shared service with a data platform and bring onboard the digital marketing experts who struggle to reconcile the data coming from external channels. Additionally, Data Protection Officers need to make sure that the data in your brand-new cloud data warehouse is properly anonymized.

Deliver Quick Wins

While it's key to stretch capabilities and set ambitious goals, it's also necessary to prove that your data quality project will provide business value quickly. Don't spend too much time on heavy planning. Instead, prove business impacts with immediate results. For example, what about organizing a "data clean-up day" with the sales and marketing team to apply quick fixes in your Salesforce or Marketo data? Once you have demonstrated how easy it is to get benefits, you gain credibility, and people will support your project, allowing you to move on to the bigger tasks.


Don’t Leave the Clock Running

More often than not, data quality is an afterthought. "Garbage in, garbage out" has become one of the most common mistakes hampering the benefit of any IT or digital transformation initiative. By the time you realize you need to fix it, it's too late: your data lake has become a data swamp, and taking control of your data has turned into a tricky, expensive initiative. There is no substitute for taking control of your data before you share and process it.

Don’t Overengineer Your Projects, Making Them Too Complex and Sophisticated

It is tempting to go straight to the holy grail – often referred to as the single version of the truth. But is it what you really need and is your company ready to deliver? Some data management disciplines, such as Master Data Management, can bring your data quality standards to the highest level, but this requires a lot of effort, strong sponsorship, and authoritative approaches to governance. There are other initiatives, such as self-service data quality or data cataloging, that allow you to instill trust in data. Consider them as alternatives or as natural milestones that will help you step up in your maturity curve.

Don’t Sell a Data Quality Project

Data quality is a discipline. It requires methodology, tools, and processes. But this doesn't entice the lines of business to join in. Few people in a company care about a 360° view of customers, for example, unless they understand how it will boost marketing campaign efficiency, customer conversion rates, or the time to resolve a customer claim. To succeed, your project must be widely known within your organization and linked to the concrete benefits it brings to different activities. As you make the project more specific to certain activities, other teams may realize their own work wasn't in scope. Guess what? They may well ask you to extend your project and solve their data quality pains, which is a good reason to ask for more budget.


Data can drive insights, and it can drive the business, so we had better be able to trust it. That's why data quality has become so important. The cost of doing nothing is rising. But it has become much easier to fix issues by informing people in the company of the value of data and training them to become data-savvy. At the same time, technologies have become easier for anyone to use via self-service, smarter in their ability to capture data quality issues and automate their remediation, and more pervasive in their ability to bring data quality controls wherever they need to be. Are you ready for the challenge?

Predictive Analytics for Applications

Seven out of 10 application teams plan to add predictive analytics to their products in the next 12 months. But building predictive algorithms and embedding insights in an application presents a number of technical and practical challenges.

In this on-demand webinar, we’ll explore best practices for building predictive applications and how to overcome the most common development challenges.

You’ll see:

  • What is predictive analytics and what are the most popular uses for SaaS applications
  • Common technical challenges for building predictive applications
  • Best practices for implementing predictive applications
  • A live demo of building a predictive model and embedding it in an application

Mathematica Geographic Functions and Spherical Triangles

This post will look at the triangle behind North Carolina’s Research Triangle using Mathematica’s geographic functions.

Spherical Triangles

A spherical triangle is a triangle drawn on the surface of a sphere. It has three vertices, given by points on the sphere, and three sides. The sides of the triangle are portions of great circles running between two vertices. A great circle is a circle of maximum radius, a circle with the same center as the sphere.

An interesting aspect of spherical geometry is that both the sides and angles of a spherical triangle are angles. Because the sides of a spherical triangle are arcs, they have angular measure, the angle formed by connecting each vertex to the center of the sphere. The arc length of a side is its angular measure times the radius of the sphere.

Denote the three vertices by A, B, and C. Denote the side opposite A by a, and so on. Denote the angles at A, B, and C by α, β, and γ, respectively.

Research Triangle

Research Triangle is a (spherical!) triangle whose vertices are Duke University, North Carolina State University, and the University of North Carolina at Chapel Hill.

(That was the origin of the name, though it’s now applied more loosely to the general area around these three universities.)

We’ll take as our vertices

  • A = UNC Chapel Hill (35.9046 N, 79.0468 W)
  • B = Duke University in Durham (36.0011 N, 78.9389 W)
  • C = NCSU in Raleigh (35.7872 N, 78.6705 W)
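Given these vertices, the sides of the triangle are easy to compute outside Mathematica as well. The sketch below uses plain Python with the haversine formula, assuming a spherical Earth of radius 6371 km, so the numbers are approximations:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; a spherical approximation

def central_angle(lat1, lon1, lat2, lon2):
    """Angular measure (radians) of the great-circle arc between two
    points, via the haversine formula."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    h = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * asin(sqrt(h))

def arc_length_km(lat1, lon1, lat2, lon2):
    """Side length = angular measure times the sphere's radius."""
    return EARTH_RADIUS_KM * central_angle(lat1, lon1, lat2, lon2)

# Vertices (west longitude is negative)
A = (35.9046, -79.0468)  # UNC Chapel Hill
B = (36.0011, -78.9389)  # Duke
C = (35.7872, -78.6705)  # NCSU

c = arc_length_km(*A, *B)  # side opposite C: UNC-Duke
a = arc_length_km(*B, *C)  # side opposite A: Duke-NCSU
b = arc_length_km(*A, *C)  # side opposite B: UNC-NCSU
print(f"UNC-Duke  {c:.1f} km")
print(f"Duke-NCSU {a:.1f} km")
print(f"UNC-NCSU  {b:.1f} km")
```

Note the naming convention from the previous section: each side carries the lowercase letter of the vertex opposite it.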

Bots Are Here to Stay

Conversational bots are taking the tech world by storm. Chatbots enable humans to converse with computers in their native language through a computer interface. The explosion of the app ecosystem, advancements in Artificial Intelligence (AI), cognitive technologies, fascination with conversational UI, and the wider reach of automation are all driving the chatbot trend.

AI-powered messaging solutions or conversational bots are indeed a first stepping stone to enable enterprises to make faster, more informed decisions, become more efficient, and craft more relevant and personalized experiences for both customers and employees.

Given the fact that we are already interacting with intelligent virtual assistants such as Google Assistant, Apple’s Siri and Amazon’s Alexa, etc. in our daily lives — it’s important to understand what a conversational bot is, its various types, why conversational bots are needed, and what the future holds.

What Is a Conversational Bot?

It's a computer program that works automatically and is skilled at communicating through various digital channels, including intelligent virtual agents, company websites, messenger platforms, and social media.

Users can interact with these bots using text or voice to complete tasks, access information, or execute transactions. These bots are powered by technologies like Artificial Intelligence, Natural Language Processing and Machine Learning.

What Do the Numbers Say?

C-suite executives around the world believe that AI bots will play a critical role in the enterprise architecture of the future and will make a huge impact on the company’s operations. Statistics too seem to swing in the favor of bots.

According to one study, over 40% of large businesses have already implemented AI chatbots in some form or will have done so by the end of 2019.

A Gartner report also predicted that by 2020, 85% of customers will interact with businesses without any customer service agents. And Oracle reported that at least 80% of businesses have already implemented or are planning to implement AI as a customer service solution by 2020.

These numbers reflect a clear rise in the use of chatbots, as more mainstream businesses recognize their potential.

Why Conversational Bots?

Bots are great for repetitive jobs that involve simple queries and tasks. Chatbots facilitate intelligent dialogue between people as well as systems, so getting tasks done is as simple and fast as sending a text.  With conversational AI, businesses can:

  • Lower the customer service cost, enhance customer satisfaction and loyalty.
  • Increase sales across digital commerce channels with personalized human-like bots, available 24×7.
  • Enhance employee satisfaction as well as productivity by automating routine and high-frequency customer interactions.

What Are the Different Types of Bots You Can Deploy?

The conversational bots commonly deployed by organizations include:

Informational Bots

These bots can be used to resolve customer as well as employee queries. They provide customer and context-specific results that can be accessed through voice or text to reduce the effort required to get accurate results. These bots can be used to boost employee and customer engagement by pushing tailored product knowledge.

Transactional Bots

These bots serve as powerful interfaces for mobile apps via which customers can order food, book tickets, or manage bank accounts. These bots can also be trained to provide customer service. Retailers and remittance firms are also amongst the early adopters of these bots.

Enterprise Productivity

These bots can be customized as per the company’s needs and can be used to streamline enterprise work activities, connect enterprise data resources, and improve efficiencies. These bots can be used to schedule meetings, improve decision making, and foster greater collaboration.

Device Control

These bots support conversational interfaces which allow connected devices like wearables, consumer electronics, and vehicles to interact with each other, thus enriching the user experience. For instance, devices with virtual assistants can work with smart home devices like lights, thermostats, and refrigerators, proving to be a boon for home automation. Several automobile manufacturers are also introducing Alexa-based capabilities in their vehicles, enabling owners to check fuel or charge levels or start the vehicle with a few spoken commands.


The future of bots is bright, with benefits too significant to ignore. Fueled by advances in deep learning and big data collection, NLP and computer vision algorithms are getting better at deciphering the world around us and understanding human communication.

Indeed, organizations across industries are discovering the potential of conversational bots to streamline and automate activities, boost employee and customer engagement, and improve productivity.

What do you think about chatbots? We would love to know your views, experiences, and thoughts. Comment and let us know!

Hybrid Computer-Human Intelligence

My friends at GRAKN.AI recently published an interesting article lamenting that “machines should be able to outperform humans in many more tasks than they currently can, or at least that they should be able to make truly smart predictions.”  

The article makes the point that AI has cracked one of the key attributes of human intelligence — learning — but still has some way to go with logical reasoning over a representation of knowledge.

How do we help artificial intelligence to reason? It is so innate to us that we don’t even know we are doing it.

Take a simple example:

  • If grass is not an animal.
  • If vegetarians only eat things that are not animals.
  • If sheep only eat grass.

It is possible to infer the following:

  • Then sheep are vegetarians.

The ‘if’ statements can be seen as a set of premises. If all the premises are met, we infer through reasoning the new fact that sheep are vegetarians.
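A toy version of this inference is easy to sketch in Python. This is a minimal illustration of rule-based reasoning, not a real reasoner, and the fact encoding is invented for this example:

```python
# Facts mirroring the premises above: grass is not an animal,
# and sheep only eat grass.
facts = {
    ("animal", "grass"): False,
    ("eats-only", "sheep"): "grass",
}

def is_vegetarian(creature):
    """Rule: a vegetarian only eats things that are not animals.
    Returns True/False when inferable, None when the diet is unknown."""
    diet = facts.get(("eats-only", creature))
    if diet is None:
        return None  # no premise about the diet: nothing can be inferred
    return facts.get(("animal", diet)) is False

print(is_vegetarian("sheep"))  # the new, inferred fact
```

Running this prints True: all the premises are met, so the new fact "sheep are vegetarians" is derived, just as in the prose version.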

Reasoning works on existing data to build new information, adding value in the process. It is fundamental to propelling AI to the next level. Reasoning relies on context, which is how items relate to each other in the real world. To use reasoning for a given data point, we need to know what type of data it is and how it relates to other data points. This forms the basis of knowledge representation and plays a key role in the creation of intelligent systems, enabling them to make sense of complexity.

The GRAKN.AI piece describes the area of graph learning, a new research area where some of the most promising models are Graph Convolutional Networks (GCN). In this article, I want to take a step back and look at another form of computer reasoning found in hybrid intelligence, where AI and humans collaborate.

In hybrid intelligence, machines learn to make decisions about how to perform tasks alongside humans. You can find out more in a paper from 2016, which reviews systems focusing on reasoning methods for computers to optimize how and when they “access” human intelligence to gain help.

We are all familiar with the idea of a semi-autonomous car: a self-driving system that has a human driver onboard to take over in emergencies. Furthermore, we know this isn’t always a successful collaboration, as a fatal accident has illustrated. However, there is a belief that a successful hybrid system would find a way to offload certain computational tasks to humans where necessary, using reasoning capabilities to make effective decisions about when to ask human intelligence to step in.

In the business world, Cindicator is a startup combining human analysts with machine learning models to make investment decisions. As their white paper describes, Cindicator takes a number of diverse financial analysts and a set of machine-learning models and combines them to manage financial investments.

In scientific research, crowdsourcing is a good example of where hybrid intelligence can shine. To date, crowdsourcing typically involves a group of people working collectively on tasks such as image labeling, with their computers mostly involved passively by providing a platform within which they collaborate. However, for efficiency, it is possible for AI to “triage” tasks and make decisions about when to ask for a contribution from humans.

One such example is CrowdSynth, a large-scale crowdsourcing system for citizen science in the Galaxy Zoo project. In Galaxy Zoo, volunteers provide votes about the correct classifications of millions of galaxies recorded in an automated sky survey. (Crowdsourcing gives astronomers a way to reach a large group of workers around the world and collect millions of classifications, under the assumption that the consensus of many workers provides the correct classification of a galaxy from a choice of six possible classes: elliptical galaxy, clockwise spiral galaxy, anticlockwise spiral galaxy, another spiral galaxy, stars, and mergers.)

CrowdSynth is a model that combines Machine Learning and decision-theoretic optimization techniques to pull together the complementary strengths of humans and machines. Each Galaxy Zoo task uses automated computer vision and then combines this with supervised learning to infer its accuracy when compared to the accuracy of human Galaxy Zoo workers. It trades off the value of acquiring assessment from a human worker with the time and financial cost of involving a human, so it optimizes reliance on human intelligence to edge cases where it is unsure of its analysis. CrowdSynth was found to achieve the maximum accuracy using just 47% of the original set of human workers involved in the analysis. When working under a fixed budget, the gains from using CrowdSynth allow scientists to use their human “resource” more intelligently and efficiently.
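The core trade-off CrowdSynth manages can be caricatured in a few lines of Python: consult a human only when the expected accuracy gain outweighs the cost of the query. This is a toy sketch, not the paper's model, and the accuracy and cost numbers are illustrative placeholders:

```python
def should_ask_human(machine_confidence, human_accuracy=0.9, cost=0.2):
    """Decide whether to route a task to a human worker.
    All numbers are illustrative placeholders, not values from CrowdSynth."""
    # Expected accuracy gained by asking a human instead of trusting the machine.
    expected_gain = max(0.0, human_accuracy - machine_confidence)
    return expected_gain > cost

# The machine is unsure about this galaxy: hand it to a volunteer.
print(should_ask_human(0.55))   # gain 0.35 exceeds the cost 0.2

# The machine is confident: don't spend human effort.
print(should_ask_human(0.95))   # gain 0.0, below the cost
```

The real system replaces these constants with learned models of worker accuracy and decision-theoretic values of information, but the shape of the decision is the same.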

An AI system grappling with the decision of accessing human help needs to have an understanding of the capabilities of its helper and the costs and constraints associated with asking for help.  It can even be tweaked to make effective decisions about the best worker to hire (who’s best at spotting spiral galaxies?) and the best task to assign to workers as they become available.

Perhaps it will be a while before machine intelligence outperforms us, but the hybrid intelligence model provides a way for us to work together. Here's to collaboration!

Using Deep Speech in Streaming Big Data Flows

Deep Speech With Apache NiFi 1.8

Tools: Python 3.6, PyAudio, TensorFlow, Deep Speech, Shell, Apache NiFi

Why: Speech-to-Text

Use Case: Voice control and recognition.

Series: Holiday Use Case: Turn on Holiday Lights and Music on command.

Cool Factor: Ever want to run a query on Live Ingested Voice Commands?

Other Options: Voice Controlled with AIY Voice and NiFi

We are using Python 3.6 to write some code around PyAudio, TensorFlow, and Deep Speech to capture audio, store it in a wave file, and then process it with Deep Speech to extract text. This example runs on OS X without a GPU, on TensorFlow v1.11.

Mozilla's GitHub repo for their Deep Speech implementation has good getting-started information, which I used to integrate our flow with Apache NiFi.

Installation as per Deep Speech

pip3 install deepspeech
wget -O - | tar xvfz -

This pre-trained model is available for English. For other languages, you will need to train your own; a beefy HDP 3.1 cluster can be used for training.
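A minimal capture step might look like the sketch below. The pure helper is separated from the PyAudio calls so it can be checked without a microphone; file names and durations are placeholders:

```python
import wave

RATE = 16000   # Deep Speech models expect 16 kHz, 16-bit mono audio
CHUNK = 1024   # frames read per buffer

def chunks_for(seconds, rate=RATE, chunk=CHUNK):
    """How many buffers to read to cover `seconds` of audio (pure helper)."""
    return int(rate / chunk * seconds)

def record_wav(path, seconds=5):
    """Capture microphone audio into a 16 kHz mono wave file with PyAudio."""
    import pyaudio  # imported lazily so the helper above stays importable
    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                     input=True, frames_per_buffer=CHUNK)
    frames = [stream.read(CHUNK) for _ in range(chunks_for(seconds))]
    width = pa.get_sample_size(pyaudio.paInt16)
    stream.stop_stream()
    stream.close()
    pa.terminate()
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(width)
        wf.setframerate(RATE)
        wf.writeframes(b"".join(frames))
```

With the wave file on disk, the deepspeech command-line tool installed by the pip package (or its Python Model API, per the repo's getting-started docs) can be pointed at it to extract text, and the resulting transcript handed to Apache NiFi for live queries.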

5 Free eBooks to Help You Learn Machine Learning in 2019

Today, Machine Learning is one of the most important trends in every area of software engineering. No longer limited to researchers and analysts, it’s a vital part of everything from cybersecurity to web development. 

To help you get started with Machine Learning, we’ve put together this list of 5 free Machine Learning eBooks from Packt. You can download as many of them as you like — all you’ll need to do is register when you download your first title.

1. Learning Python

Okay, full disclosure — Learning Python isn’t specifically a book about Machine Learning. But there’s an important reason it’s the first free eBook on this list: Python is the go-to language if you want to develop Machine Learning models.

This book will help you get up and running with Python if you’re new to the language. You’ll find that it’s actually an incredibly intuitive programming language that is adaptable and flexible for a range of purposes.

Covering the fundamentals of the language, the book will give you a solid foundation in Python before taking you through some of the core areas where Python can be used. Yes, this includes data science and Machine Learning, but it also features guidance on how to use Python in web and application development projects.

Download Learning Python for free.

2. Python Machine Learning

Python Machine Learning is one of the bestselling books on Machine Learning of the last decade. There are a number of reasons for this: Python, as we've already seen, has quickly become the definitive Machine Learning language, while author Sebastian Raschka works at the cutting edge of Machine Learning and AI research and is able to translate the subject into something practical and accessible.

Taking you through the data pipeline step by step, and demonstrating how to use the leading machine and Deep Learning libraries, such as scikit-learn and TensorFlow, Python Machine Learning is a vital addition to anyone’s Machine Learning and AI learning plan.

Download Python Machine Learning for free.

3. Python Deep Learning

Deep Learning is the cutting edge of Machine Learning. Put simply, it’s Machine Learning with increased complexity and sophistication, which can then power different forms of Artificial Intelligence.

Python Deep Learning will build on existing Python and Machine Learning knowledge to build more detailed Deep Learning models that can be applied to various areas, including image recognition, and games.

Download Python Deep Learning for free.

4. Artificial Intelligence with Python

The hype around Artificial Intelligence has reached fever pitch, crossing over into the public domain and impacting everything — including politics.

Any of these free eBooks will help you get beneath the hype and explore how to actually make Deep Learning and Artificial Intelligence work, but Artificial Intelligence with Python goes straight into the topic. Featuring more advanced concepts that will put your existing knowledge and skills to the test, it is a book that aims to show you how to implement Artificial Intelligence systems as practically as possible.

That means you’ll learn not only the programming concepts and techniques that make Artificial Intelligence, you’ll also be able to put that learning into practice and build your own speech and text recognition systems.

Download Artificial Intelligence with Python for free.

5. Advanced Python Machine Learning

If you're looking for another guide that will challenge and push you, Advanced Python Machine Learning will guide you through some of the most cutting-edge techniques in the field. This will not only help you develop even better Python Machine Learning solutions, but it will also help you understand the language in more detail. In turn, this will give you an even better command of one of the fastest-growing languages on the planet.

Yes, Open Offices Can Work (Just Not the Way You’d Think)

An open-plan office seems like a no-brainer for a company that makes software designed to increase transparency and collaboration between teams. But they come with some serious trade-offs. When it came time to design Atlassian’s new office space in San Francisco, we got brutally honest with ourselves and had to face facts: the standard approach just doesn’t work.

If you work in an open-plan office that includes members of the C-suite as well as engineers, analysts, HR, accountants, and marketers (just to name a few), then you already know the ideal work environment varies not just by team, but by individual. No wonder open offices are such a polarizing topic! Nonetheless, open workspaces are here to stay. So can they actually, y’know…work?

Yes. They can – with a few hacks. We looked at the established norms of office design, including perimeter offices for executives, cubicle culture, massive open layouts, and conference rooms. We conducted internal and external research. We asked if open offices are even worth it. Then we turned it up, turned it up, turned it upside-down. The result? An office that is open and true to our company values without forcing openness on anyone.

This is what the future of open office spaces looks like.

Democratizing the Corner Office

Even in our highly connected and mobile world, most people still like to have a desk to call home base. But who gets the prime real estate by the windows? While many companies would automatically reserve that space for senior executives, we did the opposite. Instead of pushing staff seating to the core of the building, we brought it out to the window line and interspersed soft seating areas and sound-proof “phone booth” pods amongst groups of sit-stand desks.