

AWS GLUE FOR LOADING DATA FROM A FILE TO THE DATABASE (EXTRACT, TRANSFORM, LOAD)
I have spent a rather large part of my time coding scripts for importing data from a file into the database.

It is a common feature of an application to ask the user to upload a file with data. My approach up to now was to avoid the file upload at all costs and provide a form instead. This is fine for a low volume of data, but once there is more than a couple of rows an upload makes sense. The upload comes with lots of problems of its own. Users will upload data that may not conform to what you have been expecting. For example, you may have asked for csv and the user instead uploaded tab delimited (or excel!). Or you have been expecting the data to be in a particular order but the user switched the columns' order around. The list is endless, and that makes the problem a hard one to solve. Then there is transforming the data into the format you want: dates should be turned into a date object of some sort, numbers should be converted to numbers, and so on. There may also be logic that needs to be applied so that the data is converted into something that is useful for the application. For example, a product SKU may need to be supplemented by an id that is internal to your database, or an email may need to be converted to lowercase or to a user id. And if this is a web application, it needs to remain responsive while all of this happens.
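
To make the transformation step concrete, here is a minimal sketch in Python. The column names and the sku_to_id lookup are hypothetical, and it assumes the upload is some delimited text format:

    import csv
    import io
    from datetime import datetime

    # Hypothetical lookup table: product SKU -> internal database id.
    sku_to_id = {"ABC-123": 42}

    def parse_upload(raw_text):
        # Sniff the delimiter, so a tab delimited file does not break a csv import.
        dialect = csv.Sniffer().sniff(raw_text[:1024])
        # DictReader keys each row by header name, so reordered columns still work.
        reader = csv.DictReader(io.StringIO(raw_text), dialect=dialect)
        rows = []
        for row in reader:
            rows.append({
                "product_id": sku_to_id[row["sku"]],    # supplement the SKU with an internal id
                "email": row["email"].strip().lower(),  # emails converted to lowercase
                "quantity": int(row["quantity"]),       # numbers converted to numbers
                "ordered_at": datetime.strptime(row["ordered_at"], "%Y-%m-%d"),  # dates into date objects
            })
        return rows

Even a sketch like this glosses over error handling and reporting bad rows back to the user, which is where most of the effort tends to go.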

You provide the Glue crawler with a sample file (I will walk you through it) and it figures out for itself how the data is structured. The data could be csv, json, xml, or custom (grok - see github and logz.io). You can also manually make changes to the detected schema. This is done just once (unless your data keeps changing shape, in which case you need to run the crawler again to update the schema).
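
For those who prefer code to the console, the same crawler can be created and started with boto3. Every name below (crawler, role, database, bucket) is a made-up placeholder:

    import boto3

    glue = boto3.client("glue")

    # The IAM role must allow Glue to read the S3 path being crawled.
    glue.create_crawler(
        Name="uploads-crawler",
        Role="AWSGlueServiceRole-uploads",
        DatabaseName="uploads_catalog",  # Glue Data Catalog database that receives the detected schema
        Targets={"S3Targets": [{"Path": "s3://my-bucket/uploads/"}]},
        # For a custom format you would first create a grok classifier and
        # reference it here with Classifiers=["my-grok-classifier"].
    )

    # Run it once now; rerun only when the shape of the data changes.
    glue.start_crawler(Name="uploads-crawler")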
