Hello Team,
We are extracting data from the Facebook API using the tap-facebook extractor. We were able to extract and load the data, but when we compare the spend against the Facebook data (we have a Python file that pulls the data directly via the Facebook API), it doesn't match what we see in the UI.
We are missing a lot of data and would love some assistance. If anyone has faced this before, or can suggest how to resolve it, that would be great.
Context:
Up until mid-December, our queries showed ~10,000 rows loaded per day; now we're down to ~250 rows per day.
Going off the Meltano status, all data should have been loaded, but we're not seeing that when we query the data.
We have tried batch sizes of 50, 100, 1000, 3000, 4000, 6000, 8000, and 10000 via the max_batch_rows setting.
The pipeline is currently running on an hourly schedule.
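For reference, this is roughly where the setting lives in meltano.yml (the loader name below is a placeholder, not necessarily our actual target plugin):

```yaml
# Illustrative meltano.yml fragment -- loader name is a placeholder.
plugins:
  loaders:
    - name: target-example
      config:
        max_batch_rows: 10000
```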
Total rows:
- Facebook UI: 5,073,620
- Meltano: 382,000
We are using _sdc_batched_at to count the daily number of rows loaded.
Could this be causing an issue, since _sdc_batched_at gets updated on each sync?
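To illustrate the concern (a minimal sketch with hypothetical keys and dates, not our actual schema): if the loader upserts re-extracted rows and overwrites `_sdc_batched_at` with the latest batch time, then counting "daily rows" by `_sdc_batched_at` would shift rows out of earlier days, making those days look under-loaded even though the rows are still present.

```python
from datetime import date

# Hypothetical rows keyed by (ad_id, date_start). On each sync,
# re-extracted rows are upserted and _sdc_batched_at is overwritten
# with the latest batch time.
rows = {
    ("ad_1", date(2023, 12, 1)): {"_sdc_batched_at": date(2023, 12, 1)},
    ("ad_2", date(2023, 12, 1)): {"_sdc_batched_at": date(2023, 12, 1)},
}

# A later sync re-delivers ad_2's row for 2023-12-01, so its
# _sdc_batched_at moves to the new batch date.
rows[("ad_2", date(2023, 12, 1))]["_sdc_batched_at"] = date(2023, 12, 15)

# Counting "rows for Dec 1" by _sdc_batched_at now undercounts ...
by_batched = sum(
    1 for r in rows.values() if r["_sdc_batched_at"] == date(2023, 12, 1)
)
# ... while counting by the record's own date_start stays stable.
by_date_start = sum(1 for (_ad, d) in rows if d == date(2023, 12, 1))

print(by_batched, by_date_start)  # 1 2
```

If this matches the loader's behavior, counting by the record's own date field (e.g. date_start) rather than `_sdc_batched_at` should give a truer daily total.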