Dynamic lookup cache not updating
For a dynamic lookup on a target, we have the option of associating any lookup port with an input port, an output port, or a Sequence-ID.
When we associate a Sequence-ID, the Integration Service generates a unique integer value for each row inserted into the lookup cache. Note that this is applicable only to ports with the Bigint, Integer, or Small Integer data type.
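The Sequence-ID behavior can be sketched in Python as follows. This is purely illustrative (the class and method names are hypothetical); the Integration Service implements this internally:

```python
# Illustrative sketch: a dynamic lookup cache that assigns a new
# Sequence-ID to each inserted row and returns the existing ID on a hit.
class DynamicLookupCache:
    def __init__(self, start=1):
        self._cache = {}       # lookup key -> Sequence-ID
        self._next_id = start  # next integer to hand out

    def lookup_or_insert(self, key):
        """Return (sequence_id, inserted_flag) for the given key."""
        if key in self._cache:
            return self._cache[key], False  # existing row: no new ID
        seq_id = self._next_id              # new row: generate a unique ID
        self._next_id += 1
        self._cache[key] = seq_id
        return seq_id, True

cache = DynamicLookupCache()
print(cache.lookup_or_insert("CUST-001"))  # (1, True)  -> inserted
print(cache.lookup_or_insert("CUST-002"))  # (2, True)  -> inserted
print(cache.lookup_or_insert("CUST-001"))  # (1, False) -> cache hit, same ID
```

The key point the sketch captures is that an ID is generated only on insert; rows already in the cache keep the ID they were assigned.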
Suppose we have a session, s_New_Customer, which loads the Customer dimension table. Before that session in the workflow, we add a dummy session, s_Dummy, which sets the mapping variable $$MAX_CUST_SK to the value of MAX(SK) in the Customer dimension table.
In this article we will concentrate on different approaches to generating surrogate keys for different types of ETL processes.
When you have a single dimension table loading in parallel from different application data sources, special care should be taken to ensure that no keys are duplicated. This is the simplest and most preferred way to generate a surrogate key (SK).
Here the O_COUNTER port supplies the SKs that populate CUSTOMER_KEY.
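The counter approach above can be sketched in Python. The sketch assumes the dummy session has already set $$MAX_CUST_SK to the current MAX(SK); the function and variable names are illustrative stand-ins for the expression transformation's O_COUNTER port:

```python
def make_sk_generator(max_cust_sk):
    """Yield surrogate keys starting after the current MAX(SK),
    mimicking an O_COUNTER expression seeded by $$MAX_CUST_SK."""
    counter = max_cust_sk
    while True:
        counter += 1
        yield counter

# Suppose s_Dummy found MAX(SK) = 1000 in the Customer dimension.
o_counter = make_sk_generator(1000)

# Each new customer row gets the next key, continuing from 1001
# with no gaps or duplicates across the load.
new_rows = ["CUST-101", "CUST-102", "CUST-103"]
loaded = {cust: next(o_counter) for cust in new_rows}
print(loaded)  # {'CUST-101': 1001, 'CUST-102': 1002, 'CUST-103': 1003}
```

Seeding the counter from MAX(SK) is what makes the load restartable: a rerun picks up after the highest key already committed instead of colliding with it.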
@acchen97 can you update the description of this ticket (or close it and open a new one) to reflect some of the recent work in this area?
In one case I am using the translate filter to look up certain values, and if nothing matches, a ruby filter executes a Go program that queries an HTTP API and appends the result to the translate dictionary.
The issue is that if there are, for example, 100 incoming messages with the same value missing from the dictionary, the HTTP API will be hit 100 times. If there were some way to trigger a reload of the dictionary when the file changes, that would be extremely valuable.
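One workaround, pending native support, is to memoize the external lookup so that N identical misses cause only one backend call. A minimal Python sketch of the idea (`fetch_from_api` is a hypothetical stand-in for the Go program that queries the HTTP API):

```python
# Sketch: memoize an external lookup so repeated misses on the same
# value hit the backend only once. fetch_from_api is a placeholder
# for the real Go program / HTTP API call.
api_calls = 0

def fetch_from_api(value):
    global api_calls
    api_calls += 1
    return f"enriched:{value}"  # placeholder enrichment result

dictionary = {"known": "enriched:known"}  # the translate dictionary

def lookup(value):
    if value not in dictionary:                    # miss: call the API once...
        dictionary[value] = fetch_from_api(value)  # ...then cache the result
    return dictionary[value]

# 100 messages with the same unknown value -> a single API call.
results = [lookup("unknown") for _ in range(100)]
print(api_calls)  # 1
```

The same pattern could live inside the ruby filter itself: keep a hash in the filter's state, consult it before shelling out, and write the fetched value back so subsequent events are served from memory.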
Logstash should have more dynamic ways to look up and enrich events, especially with external user-defined datasets.
Currently, the main avenue for lookup enrichment is the translate filter, which is primarily a basic key/value lookup and only supports YAML.