Row Level Security

The basic concept of Row Level Security is that the Username and Groups of a Tableau Server user are available in a variety of ways that can filter each query that Tableau makes, displaying only the data that the user should see. Tableau does not handle filtering data for an individual user using Permissions. Instead, this is implemented as Row Level Security either at the Data Source or Workbook level.

Tableau Desktop has a “Create User Filter” option in the Server menu, but it is completely manual and hard-coded. For dynamic row level security that ties into a database or a security service, you’ll need to connect the Tableau user information to entitlement data that already exists somewhere.

All secure techniques for achieving Row Level Security in Tableau Server depend on each user having a distinct username in Tableau Server and that the user be logged in as that username. Syncing usernames (and groups) can be achieved via Active Directory, LDAP (on Linux) or from any source using the Tableau Server REST API.
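As a sketch of the REST API route, the following Python builds the JSON request bodies for signing in and adding users to a site. The endpoint paths and payload shapes follow the Tableau REST API conventions, but the API version and site role defaults here are illustrative assumptions, not a complete sync implementation:

```python
# Sketch: building Tableau Server REST API request payloads for user sync.
# Endpoint paths and JSON shapes follow Tableau REST API conventions; the
# server URL, API version, and default siteRole are placeholder assumptions.

def signin_payload(username, password, site_content_url=""):
    """JSON body for POST /api/{version}/auth/signin."""
    return {
        "credentials": {
            "name": username,
            "password": password,
            "site": {"contentUrl": site_content_url},
        }
    }

def add_user_payload(username, site_role="Viewer"):
    """JSON body for POST /api/{version}/sites/{site-id}/users."""
    return {"user": {"name": username, "siteRole": site_role}}

def sync_missing_users(existing, desired, site_role="Viewer"):
    """Payloads for every desired username not already on the site."""
    return [add_user_payload(u, site_role) for u in sorted(set(desired) - set(existing))]
```

An actual sync job would POST these bodies with an HTTP client, carrying the auth token returned by the sign-in call in the `X-Tableau-Auth` header.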

Currently (as of Tableau 2018.2), the techniques implementing Row Level Security for Live Connections and Extracts differ, owing mostly to the fact that Extracts are a single, denormalized table.

Live Connection Row Level Security

There are several methods for implementing Row Level Security using a Live Connection. The vast majority of them are based on the concept of using the User Functions in Tableau to match against a column in a database, so that the resulting rows limit the data that can be seen. The standard best practice is outlined in the following article:

How to set up your Database for Row Level Security in Tableau
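The core of that best practice is a join from the fact data to an entitlements table, filtered so the entitlement username equals USERNAME(). The logic can be simulated in a few lines of Python (table and column names here are hypothetical):

```python
# Simulation of the standard RLS pattern: join data to an entitlements table,
# then keep only rows whose entitlement username matches the logged-in user.
# In Tableau this is a join plus a data source filter: [username] = USERNAME().

sales = [
    {"region": "East", "amount": 100},
    {"region": "West", "amount": 200},
]

# Entitlements table: which Tableau username may see which region (sample data).
entitlements = [
    {"username": "alice", "region": "East"},
    {"username": "bob", "region": "East"},
    {"username": "bob", "region": "West"},
]

def visible_rows(data, entitlements, current_user):
    allowed = {e["region"] for e in entitlements if e["username"] == current_user}
    return [row for row in data if row["region"] in allowed]
```

Because the filter runs per query, each user sees only their entitled slice of the same published data source.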

If you need more complex logic, or want to call to a function or stored procedure that handles the security entitlements, it is possible to use the RAWSQL functions in Tableau to specify the actions you want to perform:

Using Pass-Through Functions (RAWSQL) for Row-Level Security
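As an illustration of the pass-through approach, a Tableau calculated field can hand the security check off to a database function. RAWSQLBOOL and USERNAME() are Tableau functions; the database function name `dbo.fn_has_access` is hypothetical:

```
// Calculated field used as a data source filter (keep only TRUE)
RAWSQLBOOL("dbo.fn_has_access(%1, %2) = 1", USERNAME(), [Region])
```

%1 and %2 are substituted with the USERNAME() and [Region] values when Tableau generates the query, so the entitlement logic lives entirely in the database.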

If you want to pass the security entitlements into a workbook at load time, with no entitlements stored in the database, you must take care to build a Parameter + Data Source Filter combination that is secure.

Securely Passing Parameters into a Tableau Viz at Load Time
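The underlying concern is that a plain Tableau Parameter can be altered by the end user, so a value passed at load time must be tamper-evident. One common way to achieve that (a sketch under its own assumptions, not the exact recipe from the linked post) is to pass an HMAC-signed value and have the database side verify the signature before honoring it:

```python
import hmac
import hashlib

# Assumption: a secret shared only between the embedding application and the
# database-side verification function, never exposed to the browser.
SECRET = b"shared-secret-known-only-to-app-and-db"

def sign_entitlement(value):
    """Produce 'value:signature', suitable for passing as one Tableau Parameter."""
    sig = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return f"{value}:{sig}"

def verify_entitlement(token):
    """Return the value if the signature checks out, else None (the DB-side check)."""
    value, _, sig = token.rpartition(":")
    expected = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return value if hmac.compare_digest(sig, expected) else None
```

A user who edits the parameter in the URL changes the value but cannot produce a matching signature, so the verification step rejects the tampered token.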

When using Stored Procedures (in the RDBMSs that are supported by Tableau), you are faced with a dilemma: Tableau can link Tableau Parameters (but not the User Functions) to Stored Procedure parameter values, but Tableau Parameters can be changed by an end user via the URL or JavaScript and thus are not secure.

You have two options: pass the Tableau Parameter securely using the technique linked above, or set the security context in the database session using Initial SQL.

Extract Row Level Security

2018.3 and Later Versions

Starting in Tableau 2018.3, the Hyper extract engine has been updated with an option to bring in Multiple Tables, specifically with the intention of making Row Level Security easier to implement. Using Multiple Tables can vastly reduce the time it takes to generate an extract, by completely avoiding the “row duplication blow-up”.

Read more about how it works and how to implement it here:

Multiple Table (Normalized) Hyper Extracts

Prior to 2018.3 or when using “Single Table” extracts in 2018.3+

Tableau Extracts take whatever table relationships are defined on the Live Connection screen and combine them into one query, the results of which are stored as a single table in the Extract file. This process is technically called “denormalization”, and it often multiplies the number of rows, which is called, very technically, “blowing up”.
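To see why the row counts multiply, consider joining a small fact table to an entitlements table where several users are entitled to each row (the tables here are illustrative):

```python
# Demonstration of the "blow-up": denormalizing a join of data rows to an
# entitlements table duplicates each data row once per entitled user.

orders = [{"order_id": 1}, {"order_id": 2}, {"order_id": 3}]
entitled_users = ["alice", "bob", "carol", "dave"]  # everyone may see every order

denormalized = [
    {**order, "username": user} for order in orders for user in entitled_users
]
# 3 orders x 4 users = 12 rows in the single-table extract.
```

With thousands of users entitled to millions of rows, this multiplication is what makes naive single-table extracts impractical.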

To implement Row Level Security without your data “blowing up”, you should use the CONTAINS() method from Part Two of the blog posts below:
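The CONTAINS() approach keeps the extract at one row per record by storing all entitled usernames in a single delimited string column, then filtering with something like CONTAINS([Security Users], USERNAME()). A Python simulation of the idea (column name and delimiter are illustrative):

```python
# Simulation of the CONTAINS() extract technique: each row carries one
# delimited string of entitled usernames, so no rows are duplicated.
# Delimiters around every name prevent "bob" from matching "bobby".

rows = [
    {"region": "East", "security_users": "|alice|bob|"},
    {"region": "West", "security_users": "|bobby|"},
]

def visible_rows(rows, current_user):
    # Equivalent to the filter CONTAINS([security_users], "|" + USERNAME() + "|")
    needle = f"|{current_user}|"
    return [r for r in rows if needle in r["security_users"]]
```

The delimiter trick matters: without it, a username that is a prefix of another (“bob” vs. “bobby”) would match rows it should not.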

One other element to take into account: if you have multiple clients, you may want to make a separate extract for each of them, so that each Extract only contains the appropriate data for that client. Instructions for this type of template Data Source creation and maintenance are contained in:

The Tenets of Tableau Templates on Multi-tenants
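The per-client approach boils down to partitioning the source data by client and generating one extract per partition. A minimal sketch of the partitioning step (the client column name is hypothetical; in practice each partition would feed a templated data source publish, not just a grouping):

```python
from collections import defaultdict

# Sketch: split source rows by client so each per-client extract contains
# only that client's data.

def partition_by_client(rows, client_key="client_id"):
    partitions = defaultdict(list)
    for row in rows:
        partitions[row[client_key]].append(row)
    return dict(partitions)
```

Each resulting partition then becomes its own Extract, and Tableau Server permissions restrict which client can reach which published data source.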

Token Authentication / Authorization

Many organizations have moved their authentication and authorization to another service that generates auth tokens, which carry entitlement information along with the user’s identity.

It is always recommended that you still log the user into Tableau using an SSO method (if you are embedding) or a secure method like Kerberos if they are on your local domain.

If your database is set up to use auth tokens, there are two mechanisms by which you could get the tokens into the database:

  • Pass the token in as a Tableau Parameter, then have it interpreted by a function using RAWSQL, or have that Tableau Parameter linked to a Stored Procedure parameter (the auth token will need to be unguessable and unalterable)
  • If possible, the auth token could be requested by the database based on the username, using a Stored Procedure or other function using Initial SQL.
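The Initial SQL route in the second bullet can look like the following. The stored procedure name is hypothetical; [TableauServerUser] is one of the parameters Tableau substitutes into Initial SQL with the logged-in user’s username:

```sql
-- Initial SQL run at connection time: the database looks up or requests the
-- auth token for this user and sets it on the session context.
EXEC dbo.set_user_context @tableau_user = [TableauServerUser]
```

Because Initial SQL runs before any of Tableau’s queries, every subsequent query in the session inherits the security context the procedure established.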

Web Services / RESTful data

While Tableau provides the Web Data Connector framework for accessing web services / REST APIs, it is built around a single-user authentication framework — in essence, whoever authenticated at publish time to Tableau Server is the only user a WDC can authenticate as. There is no API for changing those credentials or publishing variations, so it doesn’t really scale.

If you need to have the equivalent of a “live” connection to data from a web service, the “Live” Web Services Connections in Tableau post describes an architecture that will achieve the effect.

If the RESTful web service calls are generated from simple Stored Procedures in a database, you can instead connect directly to those Stored Procedures, using some of the techniques from earlier in this article to pass in the authorization parameters.

You can also extract larger subsets of data, rather than per-user slices, and then use the Extract Row Level Security techniques from above to achieve the filtering. Depending on the size of your data, that may be the easiest way to get the best performance. If doing this, I would recommend the Extract API over the Web Data Connector, as you can offload extract generation onto another machine and build your variations directly into your generation program.