If you are here for the first time (or come back frequently), there’s a new Row Level Security page (up in the top toolbar) which gives an overview of your options in Tableau and links to the individual blog posts that dive into detail. Recommended reading for everyone, and it will be kept up to date over time with any changes or additions to functionality.
Since so much time has passed and so many better examples and code have accumulated, I went back and did a major revision of The Tenets of Tableau Templates on Multi-Tenants, which I highly advise everyone to read. It’s the most thorough explanation out there of how to correctly handle SaaS / Multi-Tenancy or Dev->Test->Prod promotion. And no, you do not need Interworks PowerTools to do this process, although they do have some nice features.
If you were ever wondering why there is both a REST API and a Document API produced by Tableau, or why we at this blog put out tableau_tools implementing both of those functionalities (and more!), this use case will illustrate it clearly.
The desired action: Specify a workbook on one Tableau Server site to be downloaded and published to a different Tableau Server site (we’ll call this “replicating over”).
Why it is complicated: Best practice with Tableau Workbooks is to Publish their Data Sources separately, to aid in managing the metadata and to provide for unbreakable Row Level Security, among other great reasons. This means we need to download any Published Data Sources that the Workbook is connected to, and publish them over to the new site as well. Simple enough, right?
After a lot of research and testing, the following steps are required to accomplish this correctly:
- Download all of the workbooks you are interested in using the REST API
- Make sure to do this one Project at a time, because Workbooks can have the same name if they are in different Projects
- Open up each of the workbook files to see which Published Data Sources it connects to (use tableau_tools.tableau_documents)
- Scan through all of the datasource elements in the Workbook XML.
- Check to see if each datasource is a published data source
- If a published data source is found, find the contentUrl referenced within
- Query all Data Sources using the REST API. Search for any Data Source whose contentUrl attribute matches one of those found in the workbooks
- Download the matching data sources using the REST API
- Publish the data sources across to the new Site
- You will need to provide the credentials for any data source at publish time, since there is no way to securely retrieve them from the originating site
- Once published, retrieve the details from the new Data Source on the new site, including the new contentUrl property
- Reopen the workbook file, then change the Site and Data Source contentUrls to match the newly published Data Sources on the destination site
- Publish the workbook using the REST API
Luckily, all of this is possible using tableau_tools, and there is a sample script available now showing how to do it.
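The XML-scanning portion of these steps (finding the contentUrl of each published data source referenced in a workbook) can be sketched with the standard library alone. The XML shape below is a simplified, illustrative assumption of how a .twb file references a published source, not the complete Tableau file format, so treat it as a sketch rather than the tableau_tools implementation:

```python
import xml.etree.ElementTree as ET

def published_datasource_content_urls(twb_xml: str):
    """Scan a workbook's XML for published data sources and return their contentUrls.

    Assumption: a published data source's <connection> element contains a
    <repository-location> child whose 'id' attribute holds the contentUrl
    on the originating site.
    """
    root = ET.fromstring(twb_xml)
    content_urls = []
    for datasource in root.iter('datasource'):
        for repo_loc in datasource.iter('repository-location'):
            content_url = repo_loc.get('id')
            if content_url:
                content_urls.append(content_url)
    return content_urls

# Minimal, hypothetical workbook XML with one published and one local data source
sample_twb = """
<workbook>
  <datasources>
    <datasource caption='Sales (Published)' name='sqlproxy.0abc'>
      <connection class='sqlproxy' dbname='Sales'>
        <repository-location id='SalesDatasource' path='/t/SiteA/datasources' site='SiteA'/>
      </connection>
    </datasource>
    <datasource caption='Local Extract' name='federated.1def'>
      <connection class='federated'/>
    </datasource>
  </datasources>
</workbook>
"""

print(published_datasource_content_urls(sample_twb))  # ['SalesDatasource']
```

The returned list is what you would then match against the contentUrl attributes from the REST API's data source query on the originating site.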
Later versions of Microsoft Analysis Services (MSAS) allow you to configure user- and role-based data security within the cube itself. However, this functionality only works when that particular user is logged in directly to the cube. In Tableau, this can be accomplished via Kerberos.
What about when you are using MSAS cubes in an external facing solution, with users who are not in the local domain? Cube connections in Tableau don’t have the equivalent of a Data Source Filter the way relational database connections do, and there is no way to pass the USERNAME() function into a Calculated Member the way you can in a relational calculated field.
In this case, the manual “User Filter” functionality can achieve a reasonable solution.
With the release of tableau_tools 4.0.0 and Tableau Server 10.5, most of the pieces are in place in the library and in the product itself to allow for an efficient path for publishing unique extracts for different customers all from a single original data source (or workbook) template.
The basic steps of the technique are:
- Create a template live connection to a database table, Custom SQL or a Stored Procedure in Tableau Desktop. This does not need to be the final table/custom SQL or Stored Proc; you can use a test or QA data source and switch it programmatically to the final source
- Optional: Set up the appropriate filtering for a single customer, user, or whatever the main filtering field will be. You can instead add this later programmatically.
- Save that file (TDS or TWB)
- Use the tableau_tools.tableau_documents sub-module to programmatically add any additional filters or modify the filters / parameters you set
- Use tableau_tools to alter the actual table / SP / Custom SQL to the final version for that customer
- Add an extract to that data source in tableau_tools. This will use the Extract API / SDK to generate an empty extract with the bare minimum of requirements to allow it to publish and refresh
- Save the new file. It will be saved as a TWBX or TDSX, based on the input file type
- Publish the file to Tableau Server
- Send an Extract Refresh command to Tableau Server using the REST API (using the tableau_tools.tableau_rest_api sub-module).
- The extract will refresh based on the information in the TDS and be filled with data just for the specified customer, user, or whatever you filtered on
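The filter-modification step above can be sketched with plain XML manipulation. The filter structure shown (a categorical <filter> wrapping a <groupfilter> with a 'member' attribute) is a simplified assumption of how a single-value filter appears in a .tds file; tableau_tools wraps this kind of work in a friendlier API:

```python
import xml.etree.ElementTree as ET

def set_customer_filter(tds_xml: str, field: str, new_member: str) -> str:
    """Point a categorical filter in a TDS at a different member value.

    Assumption: the filter is stored as <filter column='[Field]'> containing a
    <groupfilter> whose 'member' attribute holds the quoted member value.
    """
    root = ET.fromstring(tds_xml)
    for f in root.iter('filter'):
        if f.get('column') == field:
            for gf in f.iter('groupfilter'):
                gf.set('member', '"{}"'.format(new_member))
    return ET.tostring(root, encoding='unicode')

# Hypothetical fragment of a TDS template with a filter on [Customer]
sample_tds = """
<datasource>
  <connection class='sqlserver' dbname='Sales'/>
  <filter class='categorical' column='[Customer]'>
    <groupfilter function='member' level='[Customer]' member='&quot;TestCustomer&quot;'/>
  </filter>
</datasource>
"""

rewritten = set_customer_filter(sample_tds, '[Customer]', 'AcmeCorp')
print(ET.fromstring(rewritten).find('.//groupfilter').get('member'))  # "AcmeCorp"
```

You would run this once per customer, save each result as its own TDS, add the empty extract, and publish, so that every customer's published data source filters to only their rows.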
I haven’t been announcing the minor point releases of tableau_tools lately, but 4.4.0 is out with a lot of good new stuff:
- Updated to work with the Extract API 2.0, so you can add the necessary Hyper extracts to 10.5 data sources and workbooks
- Fully updated and documented mechanism for altering the main table of existing data sources. Change the table name or Custom SQL or…
- Stored Procedure Parameters can be accessed and set
- Tableau Parameters can now be added, removed or modified
As always, the preferred install is from PyPI using pip install tableau_tools --upgrade, or you can see the source in the Releases on GitHub. See the full documentation in the README.
In this post, I’ll be describing a set of steps to follow to isolate the causes of performance issues on Tableau Server.
Here are the basic steps:
- Test the workbook in Tableau Desktop. Does it perform well? If yes:
- Test the workbook in Tableau Desktop on the Tableau Server machine. Does it perform the same as it did on the previous machine? If yes:
- Publish the workbook to Tableau Server, and find a time when there is low-to-no usage on the Tableau Server. Go to the published workbook. Did it perform relatively the same as the test in Step 2 (within 1-3 seconds)? If yes:
- Test the workbook during a time of high usage on the Tableau Server (either natural or do load testing using TabJolt).
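Whichever step you are on, recording timings the same way each run makes the comparisons between steps meaningful. A minimal sketch of a repeatable timing helper, where load_view is a placeholder for however you trigger the view (opening it in Desktop, or requesting it from the Server):

```python
import time

def time_action(action, repeats=3):
    """Run an action several times and return a list of timings in seconds.

    Repeating matters on Tableau Server: the first load often pays caching
    and connection costs that subsequent loads do not.
    """
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        action()
        timings.append(time.perf_counter() - start)
    return timings

# Placeholder standing in for loading the view (e.g., an HTTP request to the server)
def load_view():
    time.sleep(0.01)

timings = time_action(load_view)
print(len(timings), all(t >= 0.01 for t in timings))  # 3 True
```

Comparing the later (cached) timings at each step against the 1-3 second tolerance mentioned in Step 3 helps separate genuine Server overhead from one-time warm-up costs.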