Practice Free C_DS_43 Exam Online Questions
You want to execute two dataflows in parallel in SAP Data Services.
How can you achieve this?
- A . Create a workflow containing two dataflows and connect them with a line.
- B . Create a workflow containing two dataflows without connecting them with a line.
- C . Create a workflow containing two dataflows and deselect the Execute only once property of the workflow.
- D . Create a workflow containing two dataflows and set a degree of parallelism to 2.
What are advantages of using the Validation transform in SAP Data Services? There are 3 correct answers to this question.
- A . You can see which rules were violated in one output.
- B . You can set different failed paths for each rule.
- C . You can have multiple rules on a single column.
- D . You can produce statistics.
- E . You can call a recovery dataflow.
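The Validation transform itself is configured in the Data Services Designer GUI, but its behavior can be illustrated in plain code. The sketch below (ordinary Python, not SAP Data Services syntax; the rule names and the `DI_ERRORCOLUMNS` field are used illustratively) shows the core idea: several rules evaluated per row, a pass/fail split, and per-rule violation statistics collected in one place.

```python
# Conceptual sketch of a Validation transform: multiple rules per row,
# rows routed to a pass or fail output, and violation statistics.
from collections import Counter

# Hypothetical rules: each maps a rule name to a predicate on a row dict.
RULES = {
    "country_not_null": lambda r: r.get("country") is not None,
    "amount_positive":  lambda r: r.get("amount", 0) > 0,
    "amount_below_cap": lambda r: r.get("amount", 0) < 1_000_000,
}

def validate(rows):
    """Split rows into pass/fail lists and count violations per rule."""
    passed, failed, stats = [], [], Counter()
    for row in rows:
        violated = [name for name, rule in RULES.items() if not rule(row)]
        if violated:
            # Record which rules failed, alongside the row itself.
            failed.append({**row, "DI_ERRORCOLUMNS": violated})
            stats.update(violated)
        else:
            passed.append(row)
    return passed, failed, stats
```

Note that a single column (`amount`) carries two independent rules, and one fail output still records every rule a row violated, mirroring answers A, C, and D.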
You need to import metadata and extract data from an SAP ERP system using SAP Data Services.
Which type of datastore must you use?
- A . Database datastore
- B . Application datastore
- C . Web Services datastore
- D . Adapter datastore
An SAP Data Services job was executed in the past.
Where can you see the order in which the dataflows were executed? There are 2 correct answers to this question.
- A . In the operational dashboard
- B . In the Impact and Lineage Analysis report
- C . In the job trace log.
- D . In the job server log
You have a workflow containing two dataflows. The second dataflow should only run if the first one finished successfully.
How would you achieve this in SAP Data Services?
- A . Use a conditional for the second dataflow
- B . Embed the first dataflow in a try-catch
- C . Add a script between the dataflows using the error_number() function
- D . Connect the two dataflows with a line
An SAP Data Services dataflow adds the changed data (insert and update) into a target table every day.
How do you design the dataflow to ensure that a partially executed dataflow recovers automatically the next time it is executed? There are 2 correct answers to this question.
- A . Enable the Delete data before loading option in the target table loader.
- B . Add a lookup function in the WHERE clause to filter out existing rows.
- C . Set the Auto correct load option in the target table loader.
- D . Use the Table Comparison transform before the table loader.
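Why do these options make a partial load recoverable? Because they turn the load into an idempotent upsert: a row that was already written is simply overwritten with the same values on the next run, instead of being duplicated. The following sketch (plain Python, not SAP Data Services code; the in-memory dict stands in for the target table) illustrates the effect of the Auto correct load option.

```python
# Conceptual sketch of "Auto correct load": an upsert keyed on the
# target table's key column. Re-running a partially completed load
# produces the same final state, so recovery is automatic.
def auto_correct_load(target, incoming, key="id"):
    """Upsert `incoming` rows into `target` (a dict keyed by `key`)."""
    for row in incoming:
        target[row[key]] = row   # existing key -> update, new key -> insert
    return target
```

Applying the same batch twice leaves the target unchanged after the first application, which is exactly the property a recovering dataflow needs.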
The performance of a dataflow is slow in SAP Data Services.
How can you see which part of the operations is pushed down to the source database? Note: There are 2 correct answers to this question.
- A . By opening the Auto Documentation page in the Data Services Management Console.
- B . By enabling the corresponding trace options in the job execution dialog.
- C . By opening the dataflow and using the View Optimized SQL feature.
- D . By starting the job in debug mode.
In which situation is it appropriate to use time-based CDC to capture changes in source data with SAP Data Services?
- A . When there are large tables with few changes
- B . When you need to capture physical deletes from the source
- C . When almost all of the rows have changes
- D . When you need to capture intermediate changes
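The trade-off behind this question can be made concrete. Time-based CDC extracts only rows whose last-modified timestamp is newer than the previous run, which is cheap for large, rarely changing tables, but it never sees physically deleted rows or intermediate versions of a row. A loose illustration in plain Python (the `last_modified` column name is an assumption for the example):

```python
# Conceptual sketch of time-based CDC: select only rows changed since
# the previous extraction timestamp. Deleted rows and intermediate
# changes between two runs are invisible to this approach.
def time_based_cdc(rows, last_run_ts):
    """Return rows modified after the previous extraction timestamp."""
    return [r for r in rows if r["last_modified"] > last_run_ts]
```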
You want to use an SAP Data Services transform to split your source vendor data into three branches, based on the country code.
Which transform do you use?
- A . Map_Operation transform
- B . Validation transform
- C . Case transform
- D . Country ID transform
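The Case transform evaluates an expression per row and routes the row to exactly one output branch. A minimal sketch of that routing logic in plain Python (the country codes and branch names are hypothetical, chosen only to show a three-way split with a default branch):

```python
# Conceptual sketch of a Case transform: route each vendor row to one
# of three branches based on its country code.
def case_split(rows):
    """Split vendor rows into three branches by country code."""
    branches = {"US": [], "DE": [], "default": []}
    for row in rows:
        # Unmatched country codes fall through to the default branch.
        branches.get(row.get("country"), branches["default"]).append(row)
    return branches
```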
What requirement must you meet when mapping an output column on the Mapping tab of the SAP Data Services Query transform?
- A . Primary keys in the input schema must be mapped to only one column in the output schema
- B . Each column in the output schema must be mapped to one or more columns in the input schema
- C . All columns of the input schema must be mapped to the output schema
- D . Every column of the output schema must have a mapping
