What are the must-know software skills for a career in data analytics after an MBA?


SQL, Excel, and Tableau-like tools are good enough to start with. Then add something like R eventually. And then there are tools specific to your industry, for example Google Analytics for the tech industry.

Other than that, you should know what to do with these tools. You need to know the following concepts and continuously build on them as industry use-cases and needs evolve (see the small example after the list):

  1. Spreadsheet modeling
  2. Forecasting
  3. Customer Segmentation
  4. Root Cause Analysis
  5. Data Visualization and Dashboarding
  6. Customer Lifetime Value
  7. A/B Testing
  8. Web Analytics
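As one small example from that list, here is a minimal A/B testing sketch in Python (the conversion counts are made up) that checks whether two variants differ significantly:

```python
# Minimal A/B test sketch: did variant B convert better than variant A?
# The counts below are made-up illustration numbers.
from scipy.stats import chi2_contingency

# Rows: variant A, variant B; columns: converted, did not convert.
observed = [
    [120, 1880],  # variant A: 120 conversions out of 2,000 visitors
    [150, 1850],  # variant B: 150 conversions out of 2,000 visitors
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"p-value: {p_value:.3f}")
if p_value < 0.05:
    print("The difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected; collect more data or keep testing.")
```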

VIEW QUESTION ON QUORA

Any advice for moving into data science from business intelligence?


This was asked on Reddit: Any advice for moving into data science from business intelligence?

Here’s my answer:

I come from a “Business Intelligence” background and currently work as a Sr. Data Scientist. I’ve found that you need two things to transition into data science:

Data Culture: A company where the data culture is such that managers/executives ask big questions that need a data science approach to solve them. If your end-consumers are still asking a bunch of “what” questions, then your company might NOT be ready for data science. But if your CEO comes to you and says “hey, I got the customer list with the info I asked for, but can you help me understand which of these customers might churn next quarter?” — then you have a data science problem at hand. So, try to find companies that have this culture.

Skills: You also need to upgrade your skills to be able to solve data science problems. BI is focused heavily on technology and automation, so you may need to unlearn a few things. For example: automation is not always important, since you might work on problems where a model only needs to make predictions a couple of times. Trying to automate wouldn’t be optimal in that case. Also, BI relies heavily on tools, but in data science you’ll need deeper domain knowledge & a problem-solving approach along with technical skills.

Also, I personally moved from BI (as a consultant) -> Analytics (as an Analytics Manager) -> Data Science (Sr Data Scientist), and this path has been super helpful for me. I recommend transitioning into Analytics first and then eventually breaking into data science.

Hope that helps!

VIEW THREAD ON REDDIT

Looker vs Tableau: How would you compare them in terms of price & capabilities?


Someone asked this on Quora, so here’s my response. This is a great question — one that I figured out when I led Analytics at Kiva.org last year, so I am happy to add my perspective on Looker vs Tableau:


Let’s talk about capabilities first and then price.

Capabilities — Looker vs Tableau

Even though both of these tools are classified under Business Intelligence, they have some pretty clear product differentiation, so in this section I will walk through the three main components of Business Intelligence platforms and then map them back to the core strengths of each product.

Business Intelligence platforms typically have three main components:

  1. Data Collection, Storage & Access
  2. Data Modeling
  3. Data Visualization

#1) Data Collection, Storage & Access: Neither of these tools does data collection & storage. You will need infrastructure to collect data and store it — typically in databases — and you can access that data using SQL. You will need to connect to these data sources from either tool. Note that on the surface it might look like Tableau supports more data sources than Looker, but there might be a workaround to get your data into one of the sources Looker supports and take it from there, so I am not awarding extra points to Tableau for this. Also, I am personally a big proponent of using analytic databases like Redshift, Vertica, BigQuery & Azure DW for analytics applications, which both Looker & Tableau support — so I am calling it a tie here!
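To make the “access data with SQL” part concrete, here is a minimal Python sketch of pulling data out of a Redshift/Postgres-style analytic database; the connection string, table, and columns are hypothetical:

```python
# Minimal sketch: query an analytic database (e.g. Redshift, which speaks the
# Postgres protocol) and load the result into a DataFrame for analysis.
# The connection string, table, and columns are hypothetical; a Postgres
# driver such as psycopg2 must be installed for this URL to work.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://analyst:secret@warehouse.example.com:5439/analytics")

query = """
    SELECT customer_id, order_date, order_total
    FROM fact_orders
    WHERE order_date >= '2017-01-01'
"""

orders = pd.read_sql(query, engine)
print(orders.head())
```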

#2) Data Modeling: This is Looker’s core strength by a wide margin! Why? Because of LookML, their data modeling layer, and I am super impressed with it after using it for a while now! So let’s chat about what a data modeling layer means and why you should care.

Data modeling (in this context) means creating data models that take your raw data as input and clean, combine, curate & convert it so it’s ready for data analysis.

Why is this important? Not everyone can clean, curate, combine & convert raw data into analysis-friendly data assets. That’s what data analysts are trained in and specialize in. Maybe in the future we will have tools that do this, or maybe we will see plug-and-play (aka turnkey) solutions for a few key analysis needs, but for now you need data analysts who can create these data models.
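For illustration, here is a minimal pandas sketch of the kind of transformation a data model encodes: it takes raw orders and customers and produces an analysis-ready table. The file, table, and column names are made up.

```python
# Illustrative "data model": turn raw tables into an analysis-friendly asset.
# File, table, and column names here are hypothetical.
import pandas as pd

raw_orders = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])
raw_customers = pd.read_csv("raw_customers.csv")

# Clean: drop cancelled orders and obviously bad rows.
orders = raw_orders[(raw_orders["status"] != "cancelled") & (raw_orders["order_total"] > 0)]

# Combine: join in customer attributes (e.g. segment).
orders = orders.merge(raw_customers, on="customer_id", how="left")

# Curate & convert: one row per customer per month, ready for analysis.
orders["order_month"] = orders["order_date"].dt.to_period("M")
model = (
    orders.groupby(["customer_id", "segment", "order_month"], as_index=False)
          .agg(orders=("order_id", "count"), revenue=("order_total", "sum"))
)

model.to_csv("customer_monthly_revenue.csv", index=False)
```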

Now there are two ways to create data models:

You can create them on-the-fly (ad-hoc) OR you can publish all of these data models on a platform (like Looker).

There are all sorts of issues with doing it on-the-fly — it works for small teams (fewer than 20–30 people), but beyond that you need some process in place. For instance: you can’t automate data models that you need often, so that’s wasted time; you can’t share these models easily with others; and it creates a single point of failure: if the analyst is sick or on vacation, no one gets “insights” from data — the world stops spinning. Yada yada yada… So self-service becomes valuable once you have more than a few business users who want to consume data.

So what does a self-service platform bring to the table? It helps data analysts build these wonderful, analysis-friendly data models and publish them so everyone who cares in the org can access them. Consumers can focus on the analysis part and not worry about the not-so-fun part of making data ready for analysis. It also brings all sorts of other benefits: standardized metric definitions, trusted data sources, better collaboration among analysts, a speedier model-delivery process, getting out of Excel hell and what not!

Think of it this way: if all your key data models are available on your self-service platform, your data analysts can focus on 1) advanced stuff = more $$$ and 2) building more data models (so eventually they can do even more advanced stuff = more $$$!)


This is where Looker fits in. Looker is great at this data modeling thing — its platform is amazing for anyone looking to solve this problem. You can also do data visualization on top of it and build dashboards.

Alright, moving on:

#3) Data Visualization: This is Tableau’s forte! No one does data viz better than Tableau, at least right now. There are vendors investing significant resources in this, and some are close, but Tableau is still the leader in this space.

Having said that, let’s map this back to how it helps business users & analysts:

Business users and self-service environments:

Tableau is not great at the data modeling thing. Yes, you can do basic clean/combine/curate/convert work, but it doesn’t hold up for intermediate to advanced needs. So if you already have a self-service data modeling layer in place that Tableau can connect to and you are looking for a data visualization layer, go for Tableau! You’ll be able to create amazing visuals, dashboards and stories that will WOW your business users. But to make sure this scales, you need to seriously think about either 1) how to overcome the limitations of Tableau’s data modeling layer, or 2) using some other tool to build the data modeling layer and connecting Tableau to it.

Pro Tip: I highly recommend trying out trials of these products and seeing what works best!

Analysts:

Tableau shines at data discovery! While this certainly helps business users, it’s best leveraged by analysts: whenever they are working on ad-hoc data analysis projects (one-time, strategic in nature), they can be much more effective at discovering the underlying trends and patterns in their data by visualizing it in Tableau.

So with that context, you might be wondering: what tool did I champion & implement at Kiva?

It is public knowledge that Kiva is a Looker customer (its logo is on their website), so I can share this.

After evaluating 30+ tools (including Tableau), I ended up championing Looker at Kiva and eventually leading the initial implementation sprints, because the goals & vision we had for Kiva’s data & analytics platform aligned better with a data modeling layer that met Kiva’s needs. So figure out your goals and vision first, and then choose tools within that framework.

Pro Tip #1: It’s insanely hard to figure out your goals and vision for analytics in an org. To figure this out, you might want to chat with organizations in the same industry at a similar size & stage and see what they use. Ask them what they use and whether it worked for them. Ask them about their return on investment. This is a great way to get external feedback, but you still need to figure out your internal needs and prioritize them.

Pro Tip #2: Both of these tools have amazing reviews! You will see them highly ranked in analyst reports too — this is great, but it’s more important than ever to clearly define what your organization needs, map that back to the core strengths of these products (or any other tool for that matter), and go from there!

[I am happy to help you evaluate the right tool for your needs; feel free to contact me: Let’s Connect! – Insight Extractor – Blog]

Pricing — Looker vs Tableau

I can’t talk about Looker’s pricing because it’s not public (I apologize!). You’ll need to contact them to get a quote.

However, you can anchor that with Tableau’s pricing which is public: Buy Tableau | Tableau Webstore

Your analysts and power users will need Tableau Desktop/Professional, which cost $1K and $2K respectively (a one-time purchase), and then the price varies depending on your deployment model (cloud or self-hosted):

[Image: Looker vs Tableau pricing table]

*Note that Tableau Online is a subscription model, so you can definitely start small, say 5 business users in a department, and take it from there. If you grow, you can later look at other tools like Looker. (If you are growing rapidly, account for the non-trivial time needed to migrate from one platform to another; it might be worth picking the right tool from the get-go.)

Pro Tip: I encourage you to think about building an ROI model too. You know, use some analytics for your analytics projects 😉 — I apologize, couldn’t resist! Anyhow, the point is that instead of just thinking about the “cost”, think about the value-add and anchor your investment figure to that. There’s a reason some analytics tools are priced at, let’s say, $1,000 while others are priced at $100,000 — they have different value propositions, and if you know how to extract value from the tool and can project it, then you can get a better ROI!
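To illustrate, here is a rough back-of-the-envelope ROI sketch; every number in it is a made-up placeholder, not a real quote from either vendor:

```python
# Hypothetical back-of-the-envelope ROI model for an analytics tool.
# Every figure is a made-up placeholder; plug in your own estimates.
annual_license_cost = 20_000                 # what the vendor quotes you per year
implementation_cost = 15_000                 # one-time setup, consulting, training

analyst_hours_saved_per_week = 10            # e.g. fewer manual/ad-hoc data pulls
analyst_hourly_cost = 75
revenue_lift_from_better_decisions = 50_000  # hardest to estimate; be conservative

annual_value = (analyst_hours_saved_per_week * 52 * analyst_hourly_cost
                + revenue_lift_from_better_decisions)
first_year_cost = annual_license_cost + implementation_cost

roi = (annual_value - first_year_cost) / first_year_cost
print(f"Estimated first-year ROI: {roi:.0%}")
```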

Hope that helps! If I can be of any further help, email me or comment here! Let’s Connect! – Insight Extractor – Blog

VIEW QUESTION ON QUORA

How do I pursue a career in data warehousing?


Someone asked this on Quora, and here’s my reply:

In the data world there are two broad sets of jobs available:

  1. Engineering-oriented: Data engineers, Data Warehousing specialists, Big Data engineers, Business Intelligence engineers — all of these roles are focused on building the data pipeline, using code/tools to get data into some centralized location
  2. Business-oriented: Data Analysts, Data Scientists — all of these roles involve using data (from those centralized sources) to help business leaders make better decisions.*

*Smaller companies (or startups) tend to have roles where small teams (or just one person) do it all, so the distinction is not that apparent.

So, it seems like you are interested in engineering-oriented roles — roles focused on building data pipelines. Since you are starting out, I would suggest broadening your scope to learn about other tools as well. While data warehousing is still relevant and will be in some form or another for the next few years, the industry (especially tech companies) has been slowly moving toward Big Data technologies, and you need to be able to adapt to these changes. So learn about data warehousing, maybe get a job/internship as an ETL/BI engineer, but keep an eye on other data engineering tools like the Hadoop ecosystem, Spark, Python, etc.

VIEW QUESTION ON QUORA

What are the reasons why developing a data dictionary is so important?



Let me first define “Data Dictionary” — it’s a document that lists data fields/metrics and their standardized definitions to be used across the org.

The key here is: Standardized.

Imagine this:

Imagine that a management team meeting is going on and you have the CEO, VP of Sales, VP of Marketing, CFO and COO, among others, in the room.

Meeting agenda: why they didn’t hit the $100M profit goal in the first quarter. Each of them starts with the reports they have access to.

VP of Sales says they missed it by $5M

CFO says that they missed it by $9M

COO says that they missed it by $7M

VP of Marketing has three different versions on her report and she is confused!

No one talks about WHY they missed the goal; instead they spend the next hour reconciling the numbers!

That was a hypothetical scenario, but these things happen all the time! Of course, it could be any team meeting and the metric could be something else, or it could just be that someone working on something on their own ends up spending a lot of time digging through metric definitions and trying to make sense of it all. This is where a data dictionary helps! Let’s take this a step further:

What’s one of the most important characteristics of good data analysis/science?

It needs to be Actionable.

It needs to help business decision makers take action based on the insights they found or that were shared with them. And before they make that decision, decision makers need data they can TRUST!

For data to be trusted, it needs to be understood. It needs to have a definition that everyone agrees upon.

This is what a data dictionary is for. It lists data fields/metrics and their standardized definitions so that everyone in the org understands what each field/metric means and doesn’t have to spend time aligning on meaning. They can focus on analyzing and extracting insights that change the business and the world!
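To make it concrete, here is a minimal sketch of what a couple of data dictionary entries might look like (the metrics, definitions, and sources below are invented for illustration):

```python
# Hypothetical data dictionary entries; in practice this often lives in a wiki,
# a shared document, or a BI tool's metadata layer rather than in code.
data_dictionary = [
    {
        "metric": "profit",
        "definition": "Net revenue minus cost of goods sold and operating expenses, per fiscal quarter.",
        "source": "finance.quarterly_pnl",
        "owner": "Finance",
    },
    {
        "metric": "active_customer",
        "definition": "A customer with at least one completed order in the trailing 90 days.",
        "source": "analytics.customer_activity",
        "owner": "Analytics",
    },
]

for entry in data_dictionary:
    print(f"{entry['metric']}: {entry['definition']} (owner: {entry['owner']})")
```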

VIEW QUESTION ON QUORA

What is the title these days for a person that assures data quality?


Question:

What is the title these days for a person that assures data quality?
(I need to hire a person to make sure my data is as good as it can be. They need to inspect the data for issues, create logic for how it can be found and fixed, and finally, shepherd the project through application development for a robust solution to stop it from occurring in the first place.)

Answer:

Quality of the data shouldn’t be the responsibility of just one person — ideally, you want all members of the team (and the broader business community) to care about and own some part of it. But I like the idea of one person owning the “coordination” of how this gets done. It might not be a full-time gig in a small org, but I can see it as a full-time role in bigger orgs and enterprises. Some titles:

  1. Data coordinator
  2. Data quality analyst (or just data analyst)
  3. Data steward
  4. Master data management analyst
  5. Data quality engineer (or just data engineer)
  6. Project manager (data quality)
  7. Manager, data quality and master data management

Read the original question on Quora

Building data-driven companies — the 3 P’s framework

[Image: data-driven companies need People, Platform, and Process]

Data Driven Companies — 3 P’s framework

People:

To build a data-driven organization, you need decision makers to use data over anything else. So you need to help build a culture where data-driven decision-making thrives — usually this is most efficient if you have executive buy-in. Example: a CEO who is a stats junkie! Of course, not every company will have this. You might find yourself in an organization where the CEO is known for making huge bets purely on “gut” — in cases like this, an organization could have some of the best platforms and processes, but unfortunately they won’t do much good.

Now, just having people who make data-driven decisions is not enough — you (as a data professional) need to deliver “data” to them. To do that you need 1) Processes and 2) a Platform. So let’s talk about them:

Platform:

A platform, in this context, is the data and analytics platform the organization uses to get the data it needs, when it needs it. If the organization is small (e.g. fewer than 15 people or so), the platform could be Excel plus engineers/analysts writing ad-hoc queries, but as you grow (= the team expands) you need a better platform to serve the organization’s data needs. Some tools are better than others, and you will usually wind up using multiple vendors in your analytics stack — but remember that just having a great analytics platform is not enough. You need the “people” and the “processes” to go with it. So, with that, let’s talk about process:

Process:

Process is everything between Platform and People. Let me expand on this. Here are a few areas where having a defined process is key to building data-driven organizations:

  1. How do you prioritize analytics requests? It’s great to have a process where you/your team work on the projects that most closely align with the strategic objectives of the company
  2. What does the analytics org-structure look like? Do you have analysts embedded in each team, a centralized team, or a hybrid approach?
  3. What is the process to justify investment in analytics?
  4. Which is the “right” metric definition? (There needs to be a process that keeps metric definitions standardized across the organization)
  5. What is the process to clean data? (Maintaining data integrity is key. You could put this in the “Platform” bucket as well)
  6. How do users get “help”? (Is there a ticketing system they should use? Is it just another “IT” ticket? Who responds to tickets? What’s the SLA on analytics tickets? etc.)
  7. Who owns “analytics”? There needs to be someone on the team who owns analytics (an analytics manager or VP of analytics, say), and he/she should report to someone on the management team (CIO, CFO, COO, Chief of Staff, CEO) who is held responsible as well.

The list goes on… but I hope you get the point. Having well-defined processes in an organization is important — usually this stuff gets less attention, and orgs/teams tend to focus just on the “platform”, which might not be the best thing to do.

Having shared the 3 P’s, let me share a few tips on

How to go about implementing the framework:

Three tips:

  1. Identify the “P” that has the best ROI
  2. It’s an iterative process!
  3. Refine as needed

On #1: To help you identify the “P” with the best ROI, your first step could be to create a matrix that helps you evaluate where your organization falls. I have shown an example below:

[Image: example matrix for the 3 P’s framework]

If you want to build analytics from scratch, you would love working at early-stage startups (bottom-right), but if you like the advanced stuff (data science), the top-right corner is great! For orgs in the top-left, where you have the platform and processes but lack data-driven people, it would be wise to crank up your efforts to drive adoption (since you already have the right platform and processes, additional investment in those would yield little to no ROI).

On #2. Understand that it’s an iterative process. You are never done optimizing any of these P’s! It’s a journey and not a destination.

[Image: continuous improvement cycle across People, Process, and Platform]

On #3: Just like with other frameworks, you’ll need to refine and adjust this based on your needs. You may have noticed that I focused on an org-wide framework, but you could be heading up an analytics function for a single department, in which case not everything here will apply. “People”, “Process” and “Platform” would still apply at a high level, but it might just be that you don’t have “control” over the platform. So refine/adjust this as needed.

I hope this framework is a useful tool for thinking about building data-driven companies!

Best,
Paras Doshi

PS: If you like articles like this, don’t forget to sign up for the newsletter!