
Saturday, July 30, 2016

Oracle Apps HRMS Interview Questions


1. Important Tables in HRMS
per_all_people_f
per_person_types
per_person_type_usages
per_addresses
per_contact_relationships
per_periods_of_service
per_all_assignments_f
per_pay_groups
per_jobs
per_job_definitions
per_grades
per_grade_definitions
hr_all_positions
hr_all_position_definitions
hr_locations_all
pay_all_payrolls_f
pay_element_entry_values_f
pay_element_entries_f
pay_element_links_f
pay_element_types_f
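
As an illustration of how a few of these tables relate, here is a minimal sketch joining people to their current primary assignments (the column choices are illustrative only):

SELECT papf.person_id,
       papf.full_name,
       paaf.assignment_number,
       paaf.job_id,
       paaf.payroll_id
  FROM per_all_people_f      papf,
       per_all_assignments_f paaf
 WHERE paaf.person_id    = papf.person_id
   AND paaf.primary_flag = 'Y'
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN paaf.effective_start_date AND paaf.effective_end_date;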

2. Key Flexfields (KFFs) in HRMS
Job KFF,
Grade KFF,
People Group KFF,
Position KFF,
Cost Allocation KFF,
Competence KFF

3. What are Date Track Tables?
Date-tracked tables keep a history of changes: every update is saved as a new record with an effective start and end date, so the information can be retrieved as it stood at any point in time.
These tables are suffixed with _F.
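
For example, a point-in-time query against a date-tracked table restricts on the effective date columns (a minimal sketch; :p_effective_date is a placeholder bind variable):

SELECT papf.person_id,
       papf.full_name,
       papf.effective_start_date,
       papf.effective_end_date
  FROM per_all_people_f papf
 WHERE :p_effective_date BETWEEN papf.effective_start_date
                             AND papf.effective_end_date;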

4. What are secured Views?
Views whose names do not include _ALL are the secured views; each one is defined on top of the corresponding _ALL table. For example:
per_all_people_f       -> per_people_f
per_all_assignments_f  -> per_assignments_f
pay_all_payrolls_f     -> pay_payrolls_f

What are secured Tables?
The following Oracle HRMS tables are secured:
• HR_ALL_ORGANIZATION_UNITS
• PER_ALL_POSITIONS
• HR_ALL_POSITIONS_F
• PER_ALL_VACANCIES
• PER_ALL_PEOPLE_F
• PER_ALL_ASSIGNMENTS_F
• PAY_ALL_PAYROLLS_F
Some of these tables (namely PER_ALL_PEOPLE_F, PER_ALL_ASSIGNMENTS_F,
HR_ALL_POSITIONS_F and PAY_ALL_PAYROLLS_F) are also date-tracked.
Each of the secured tables listed above has one or more secured views
defined on top of it.

5. The difference between secured views and non-secured views:
1.      Secured views return only the rows visible to the current security profile and session effective date.
2.      Unsecured (_ALL) tables and views return all rows, with no security applied.
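
A minimal sketch of the difference, assuming an APPS session initialized for an HRMS responsibility (the user and responsibility IDs below are placeholders; date-tracked secured views may additionally require a session effective date to be set):

BEGIN
  -- Establish the applications context for a specific user and HRMS responsibility
  fnd_global.apps_initialize(user_id => 1234, resp_id => 50123, resp_appl_id => 800);  -- 800 = Human Resources
END;
/
SELECT COUNT(*) FROM per_people_f;      -- secured view: only rows visible to the security profile
SELECT COUNT(*) FROM per_all_people_f;  -- unsecured table: all rows, no security applied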

6. APIs in HRMS
APIs are used in HRMS to insert data into the base tables. Because HRMS is a secured system, users are not expected to write directly into the base tables; when we build inbound interfaces or use Web ADI, the system calls the APIs to store the data. The APIs are published by Oracle with a number of parameters of type IN, IN OUT and OUT, some of which are mandatory and must be supplied for the call to complete. Values typically passed on every call include the object version number, the effective date and p_validate. An example call is sketched after the list below.

hr_employee_api      ex: hr_employee_api.create_employee
hr_person_api
hr_organization_api  ex: hr_organization_api.create_organization
hr_applicant_api
hr_assignment_api
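
A minimal sketch of calling hr_employee_api.create_employee; the literal values (business group, names, dates) are sample data and only a subset of the API's parameters is shown:

DECLARE
  l_employee_number   per_all_people_f.employee_number%TYPE;
  l_person_id         per_all_people_f.person_id%TYPE;
  l_assignment_id     per_all_assignments_f.assignment_id%TYPE;
  l_per_ovn           NUMBER;
  l_asg_ovn           NUMBER;
  l_per_eff_start     DATE;
  l_per_eff_end       DATE;
  l_full_name         per_all_people_f.full_name%TYPE;
  l_per_comment_id    NUMBER;
  l_asg_sequence      NUMBER;
  l_assignment_number per_all_assignments_f.assignment_number%TYPE;
  l_name_warning      BOOLEAN;
  l_assign_warning    BOOLEAN;
  l_orig_hire_warning BOOLEAN;
BEGIN
  hr_employee_api.create_employee
  (p_validate                  => TRUE,                                    -- validate only; set FALSE to create the record
   p_hire_date                 => TO_DATE('01-JAN-2016','DD-MON-YYYY'),
   p_business_group_id         => 0,                                       -- sample business group
   p_last_name                 => 'TEST',
   p_first_name                => 'EMPLOYEE',
   p_sex                       => 'M',
   p_date_of_birth             => TO_DATE('01-JAN-1980','DD-MON-YYYY'),
   p_employee_number           => l_employee_number,
   p_person_id                 => l_person_id,
   p_assignment_id             => l_assignment_id,
   p_per_object_version_number => l_per_ovn,
   p_asg_object_version_number => l_asg_ovn,
   p_per_effective_start_date  => l_per_eff_start,
   p_per_effective_end_date    => l_per_eff_end,
   p_full_name                 => l_full_name,
   p_per_comment_id            => l_per_comment_id,
   p_assignment_sequence       => l_asg_sequence,
   p_assignment_number         => l_assignment_number,
   p_name_combination_warning  => l_name_warning,
   p_assign_payroll_warning    => l_assign_warning,
   p_orig_hire_warning         => l_orig_hire_warning);
  DBMS_OUTPUT.PUT_LINE('Person Id: ' || l_person_id);
END;
/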

7. What are the reports which you have developed in HR?
HR Payroll Costing:
This report displays information about employees and the assignments given to them, along with payroll details including hours paid, salary, NI (National Insurance), pension, car allowance and other allowances.

Special Information Types (SIT)
SITs are created from the Personal Analysis flexfield, a type of key flexfield, and are visible across the business group.
SITs are primarily attached to people-related data; they can also be used with jobs, assignments and locations.
Each row on the SIT form has a start date and an end date.
SITs can occasionally cause performance issues, though usually minor ones.
SITs have limited security restrictions, which makes them unsuitable for storing sensitive data.
The business rule of thumb when deciding whether to use an SIT is dictated by the nature of the data to be stored: generally, individual (personal) data that is not very sensitive can be stored in an SIT.
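
SIT data entered through the Special Information form is held in the Personal Analysis flexfield tables. A minimal sketch of reading it back (which segment columns you need depends on your flexfield structure; verify the column names against your instance):

SELECT ppa.person_id,
       ppa.date_from,
       ppa.date_to,
       pac.segment1,
       pac.segment2
  FROM per_person_analyses   ppa,
       per_analysis_criteria pac
 WHERE pac.analysis_criteria_id = ppa.analysis_criteria_id
   AND ppa.person_id            = :p_person_id;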

Extra Information Types (EIT)
EITs are a type of descriptive flexfield (DFF) and can be restricted to specific responsibilities.
EITs can be attached to People, Assignment, Job, Position, Location and Organization (via a classification).
EIT forms do not have start and end dates. If you want to hold dated information, you must use two of the EIT flexfield attributes to store it.
EITs can be used at the person, assignment, job and location levels; related data (such as bonus or incentive information) may be created as an EIT.
EITs do not have any notable performance issues.
EITs can be tied to a responsibility so that only the appropriate population can access them. Because EITs are descriptive flexfields, they can be migrated across instances using a tool such as Kintana, which is not possible with key flexfields, to which SITs belong.
The business rule of thumb when deciding whether to use an EIT is dictated by the nature of the data to be stored: generally, compensation data that may be sensitive (such as employee incentive bonus plans or stock option data) can be stored in an EIT.
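
Person-level EIT data is stored in PER_PEOPLE_EXTRA_INFO (assignments, jobs, positions, locations and organizations each have their own *_EXTRA_INFO table). A minimal sketch, where the information type 'XX_BONUS_INFO' is a hypothetical custom EIT:

SELECT pei.person_id,
       pei.information_type,
       pei.pei_information1,
       pei.pei_information2
  FROM per_people_extra_info pei
 WHERE pei.person_id        = :p_person_id
   AND pei.information_type = 'XX_BONUS_INFO';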

What is Data Pump? Components of Data Pump in Oracle HRMS?
Oracle delivers seed data to enable Data Pump API calls to use features such as passing in user values instead of system identifiers. This support is not available for all of the APIs that are delivered with Oracle HRMS.

I have used it for calling APIs through Data Pump even where the supporting seed data is not present.

When purging data from the Data Pump tables, take extra care that you do not delete information on User Keys that you might need for future loading of external data.

The steps for using Data Pump are:
Running the Meta-Mapper: This process generates the specific PL/SQL procedures and views for each of the supported API modules you want to use.

Use the Meta-Mapper to generate a set of views that you can use to examine or update data in the batch tables. For example you might want to correct data or change the order in which data is loaded.
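
A hedged sketch of running the Meta-Mapper from SQL*Plus (the procedure names follow the Data Pump Meta-Mapper documentation; the exact invocation can vary by release):

-- Generate the Data Pump interface package and views for a single API module
EXECUTE hr_pump_meta_mapper.generate('HR_EMPLOYEE_API', 'CREATE_EMPLOYEE');

-- Or generate for every supported API module (the GENERATEALL command listed later in this post)
EXECUTE hr_pump_meta_mapper.generateall;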

Loading Data Into the Batch Tables
Use these two tables to hold the header and lines information from your external data.

HR_PUMP_BATCH_HEADERS

HR_PUMP_BATCH_LINES
Note: The Meta-Mapper creates views based on the batch lines table called HRDPV_<API Procedure Name>, for example, HRDPV_CREATE_EMPLOYEE.
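
A hedged sketch of loading a small batch, assuming the Meta-Mapper has generated the HRDPP_CREATE_EMPLOYEE package; the hr_pump_utils.add_batch_header call and the generated insert_batch_lines parameter names are assumptions to verify against your instance:

DECLARE
  l_batch_id NUMBER;
BEGIN
  -- Create the batch header row in HR_PUMP_BATCH_HEADERS
  l_batch_id := hr_pump_utils.add_batch_header
                  (p_batch_name          => 'XX_EMP_LOAD_01',          -- sample batch name
                   p_business_group_name => 'Setup Business Group');   -- sample business group

  -- Insert one line into HR_PUMP_BATCH_LINES for the CREATE_EMPLOYEE API
  hrdpp_create_employee.insert_batch_lines
    (p_batch_id      => l_batch_id,
     p_user_sequence => 1,
     p_hire_date     => TO_DATE('01-JAN-2016','DD-MON-YYYY'),
     p_last_name     => 'TEST',
     p_sex           => 'M');

  COMMIT;
END;
/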

Running the Data Pump Process
We need to use the Submit Reports and Processes form to start the Data Pump Engine process. It takes these parameters:

BATCH NAME
The batch_name is one of the batches inserted via the create_batch_header procedure.

VALIDATE FLAG
The default value for this flag is No, which commits all valid lines to the database.
If the validate flag is set to Yes, the process runs in validation mode: the APIs are called, but their results are rolled back. Use this mode to check and correct data before committing changes to the database.

ACTION PARAMETER GROUP
The action parameter group specifies the set of action parameter values to configure this Data Pump run.
The default value for this parameter is set from the HR: Data Pump Action Parameter Group profile option.

Finding and Fixing Errors
Logging Options
We need to enable logging options for Data Pump by inserting an appropriate value for the PUMP_DEBUG_LEVEL parameter into the action parameter tables (PAY_ACTION_PARAMETER_VALUES for a named action parameter group, or PAY_ACTION_PARAMETERS for the default group).
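
A minimal sketch, assuming logging is enabled for the default action parameter group (valid PUMP_DEBUG_LEVEL values are listed in the Data Pump documentation; '<level>' is a placeholder):

-- Enable Data Pump logging in the default action parameter group.
-- Remove or update this row to switch logging off again after debugging.
INSERT INTO pay_action_parameters (parameter_name, parameter_value)
VALUES ('PUMP_DEBUG_LEVEL', '<level>');
COMMIT;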

Note: Turning logging on always affects the overall performance of the data pump process. You should only use logging to help track down problems when they occur. Remember also to switch logging off after you have solved your problem.

How to View Logging Output
When you enable logging options, output is produced for every thread that may be running. Use the PYUPIP command to view this output.

To use this command you will need to know the ID for the concurrent process you are logging. Online you can use the View My Requests window to find the Concurrent Request IDs. Alternatively, you can query from the HR_PUMP_REQUESTS table. One row is inserted for each process that is running. For example:

select * from hr_pump_requests;
Typical output would be:

BATCH_ID    REQUEST_ID      PROCESS_TYPE
----------- --------------- -----------------
8437        98533           MASTER
8437        98534           SLAVE

This tells us that there are two processes running, and the request_id values are 98533 and 98534.

Use PYUPIP to trace the output in a separate command line window. For example:

PYUPIP <user/password>@database REQID98533
PYUPIP <user/password>@database REQID98534
Note: If you are running multiple threads, you should trace all of them. If you do not trace all threads, processing comes to a halt when the database trace pipe fills up. It may be advisable to run a single thread only when tracing.

Purging Data
To run the Data Pump Purge process:
Enter the name of the batch that you want to purge. If you do not supply a batch name, confirm that you want to purge all batches.
Specify whether you want to preserve the user keys so that you can use them for future batches.
If you want to purge the unprocessed batch lines, enter Yes. However, these lines are probably unprocessed because Data Pump encountered the maximum number of errors and could not process any of the later batch lines. You can run these unprocessed lines again when you have fixed the data. So, if you want to run these batch lines again, enter No.

If you want to purge the failed batch lines, enter Yes. However, if you intend to process these batch lines again when you have corrected the reason for failure, enter No to preserve these batch lines for future use.
If you want to purge the completed batch lines, enter Yes.
If you want to delete the batch header, enter Yes.

Confirm that the action parameter group is correct.

How To Purge
In all cases you should start with the following actions:
TRUNCATE TABLE HR_PUMP_REQUESTS;
TRUNCATE TABLE HR_PUMP_RANGES;

Simple Purge Of All Rows
If you want to purge all rows regardless of status then use the following:
TRUNCATE TABLE HR_PUMP_BATCH_EXCEPTIONS;
TRUNCATE TABLE HR_PUMP_BATCH_LINE_USER_KEYS;
TRUNCATE TABLE HR_PUMP_BATCH_LINES;
TRUNCATE TABLE HR_PUMP_BATCH_HEADERS;

What are Data Pump Interface tables? Data Pump User key tables?
The interface tables are HR_PUMP_BATCH_HEADERS and HR_PUMP_BATCH_LINES, which hold the header and line data for each batch; user keys are stored in HR_PUMP_BATCH_LINE_USER_KEYS.

APIs Supported by the GENERATEALL Command

Package Name                   Business Process
HR_APPLICANT_API               CREATE_APPLICANT, CREATE_GB_APPLICANT, CREATE_US_APPLICANT
HR_ASSIGNMENT_API              ACTIVATE_EMP_ASG, ACTUAL_TERMINATION_EMP_ASG, CREATE_SECONDARY_EMP_ASG, CREATE_GB_SECONDARY_EMP_ASG, CREATE_US_SECONDARY_EMP_ASG, SUSPEND_EMP_ASG, UPDATE_EMP_ASG, UPDATE_EMP_ASG_CRITERIA, UPDATE_GB_EMP_ASG, UPDATE_US_EMP_ASG
HR_CONTACT_API                 CREATE_PERSON
HR_CONTACT_REL_API             CREATE_CONTACT
HR_EMPLOYEE_API                CREATE_EMPLOYEE, CREATE_GB_EMPLOYEE, CREATE_US_EMPLOYEE
HR_EX_EMPLOYEE_API             ACTUAL_TERMINATION_EMP, FINAL_PROCESS_EMP
HR_JOB_API                     CREATE_JOB
HR_JOB_REQUIREMENT_API         CREATE_JOB_REQUIREMENT
HR_PERSONAL_PAY_METHOD_API     CREATE_GB_PERSONAL_PAY_METHOD, CREATE_PERSONAL_PAY_METHOD, CREATE_US_PERSONAL_PAY_METHOD, DELETE_PERSONAL_PAY_METHOD, UPDATE_PERSONAL_PAY_METHOD, UPDATE_GB_PERSONAL_PAY_METHOD, UPDATE_US_PERSONAL_PAY_METHOD
HR_PERSON_ADDRESS_API          CREATE_GB_PERSON_ADDRESS, CREATE_PERSON_ADDRESS, CREATE_US_PERSON_ADDRESS, UPDATE_PERSON_ADDRESS, UPDATE_GB_PERSON_ADDRESS, UPDATE_US_PERSON_ADDRESS
HR_PERSON_API                  UPDATE_PERSON, UPDATE_GB_PERSON, UPDATE_US_PERSON
HR_POSITION_API                CREATE_POSITION, UPDATE_POSITION
HR_POSITION_REQUIREMENT_API    CREATE_POSITION_REQUIREMENT
HR_SIT_API                     CREATE_SIT
HR_VALID_GRADE_API             CREATE_VALID_GRADE
PY_ELEMENT_ENTRY_API           CREATE_ELEMENT_ENTRY, DELETE_ELEMENT_ENTRY, UPDATE_ELEMENT_ENTRY

When is the Data Pump used in Oracle HRMS?
Data Pump is used to load external data into Oracle HRMS in batch: the data is placed in the batch interface tables and the Data Pump Engine then calls the supported APIs to validate and insert it.

What are the Oracle HRMS Data Pump tables? List some.
HR_PUMP_BATCH_HEADERS, HR_PUMP_BATCH_LINES, HR_PUMP_BATCH_LINE_USER_KEYS, HR_PUMP_BATCH_EXCEPTIONS, HR_PUMP_REQUESTS and HR_PUMP_RANGES.

What are User Hooks in Oracle HRMS?
APIs in Oracle HRMS support the addition of custom business logic. This feature is called 'API User Hooks'.
User hooks offer similar functionality to the CUSTOM library, but at the database level rather than in the Forms front end.

Implementing API User Hooks
All the extra logic that you want to associate with APIs should be implemented as separate server-side package procedures in PL/SQL. The analysis and design of your business rules model is specific to your implementation.

After you have written your server-side package and loaded it into the database, you need to associate the package with one or more specific user hooks. There are three special APIs to insert, update and delete this information. To create the links between the delivered APIs and your extra logic, you execute the supplied pre-processor program. It looks at the data you have defined and the package procedure you want to call, and builds the logic that executes your PL/SQL from the specific user hooks. This step is provided to optimize the overall performance of API execution with user hooks: effectively, each API knows what extra logic to perform without needing to check explicitly.

To implement API user hooks:
1. Identify the APIs and user hooks where you want to attach your extra logic (see Available User Hooks below).
2. Identify the data values available at the user hooks you intend to use.
3. Implement your extra logic in a PL/SQL server-side package procedure.
4. Register your extra PL/SQL packages with the appropriate API user hooks by calling the hr_api_hook_call_api.create_api_hook_call API, defining the mapping between the user hook and the server-side package procedure.
5. Execute the user hook pre-processor program. This validates the parameters to your PL/SQL server-side package procedure and dynamically generates another package body directly into the database; the generated code contains the PL/SQL that calls your custom package procedures from the API user hooks.

In summary, four steps are required to implement API User Hooks:
1. Determine the API to which you need to attach extra logic.
2. Write the PL/SQL procedure that implements your logic (a sketch follows this list).
3. Register your procedure with one or more specific user hooks.
4. Run the pre-processor program that hooks your PL/SQL procedure to the hook(s).
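
As an illustration of step 2, here is a hypothetical custom hook package (all names are illustrative; the procedure's parameters must match the data values available at the chosen hook, and the error message must exist in Message Dictionary):

CREATE OR REPLACE PACKAGE xx_hr_hook_validations_pkg AS
  -- Intended to be called from an after-process hook of HR_EMPLOYEE_API.CREATE_EMPLOYEE
  PROCEDURE validate_new_employee
    (p_person_id IN NUMBER,
     p_hire_date IN DATE);
END xx_hr_hook_validations_pkg;
/
CREATE OR REPLACE PACKAGE BODY xx_hr_hook_validations_pkg AS
  PROCEDURE validate_new_employee
    (p_person_id IN NUMBER,
     p_hire_date IN DATE) IS
  BEGIN
    -- Example business process rule: reject future-dated hires
    IF p_hire_date > TRUNC(SYSDATE) THEN
      hr_utility.set_message(800, 'XX_FUTURE_HIRE_NOT_ALLOWED');  -- illustrative message name
      hr_utility.raise_error;
    END IF;
  END validate_new_employee;
END xx_hr_hook_validations_pkg;
/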

What are the different types of User Hooks?
Consider your validation rules in two categories:
• Data Item Rules
Rules associated with a specific field in a form or column in a table. For example, grade assigned must always be valid for the Job assigned.
• Business Process Rules
Rules associated with a specific transaction or process. For example, when you create a secondary assignment you must include a special descriptive segment value.

How to find available User Hooks in Oracle HRMS?
Oracle HRMS has several types of hooks, but only Business Process hooks, which correspond to the Business Process APIs, are supported for customer use. They are used when logic must extend beyond the standard business process. A list can be obtained by running the following query:

select module_name
from hr_api_modules
where api_module_type='BP';

The $PER_TOP/admin/sql/hrahkpar.sql script provides a listing of each API, its hooks, and the corresponding procedure parameters.

Sample Hook Implementation

Get Hook Information
SELECT ahk.api_hook_id,
       ahk.api_module_id,
       ahk.hook_package,
       ahk.hook_procedure
  FROM hr_api_hooks   ahk,
       hr_api_modules ahm
 WHERE ahm.module_name     = 'CREATE_PERSON_ABSENCE'   -- Module Name
   AND ahm.api_module_type = 'BP'
   AND ahk.api_hook_type   = 'AP'
   AND ahk.api_module_id   = ahm.api_module_id;


How to register the User Hooks in Oracle HRMS?
Hook Registration
DECLARE
  l_api_hook_call_id      NUMBER;
  l_object_version_number NUMBER;
  l_sequence              NUMBER;
BEGIN
  SELECT hr_api_hooks_s.NEXTVAL
    INTO l_sequence
    FROM dual;

  Hr_Api_Hook_Call_Api.create_api_hook_call
  (p_validate              => FALSE,
   p_effective_date        => TO_DATE('01-JAN-1952','DD-MON-YYYY'),
   p_api_hook_id           => <HOOK ID>,                            -- from the query above
   p_api_hook_call_type    => 'PP',                                 -- package procedure
   p_sequence              => l_sequence,
   p_enabled_flag          => 'Y',
   p_call_package          => 'LMHR_HOOK_VALIDATIONS_PKG',          -- Custom Package
   p_call_procedure        => UPPER('Validate_Objectives_Weight'),  -- Custom procedure
   p_api_hook_call_id      => l_api_hook_call_id,
   p_object_version_number => l_object_version_number);

  DBMS_OUTPUT.PUT_LINE('l_api_hook_call_id ' || l_api_hook_call_id);
END;


Run the pre-processor script
        Run the pre-processor script hrahkone.sql ($PER_TOP/admin/sql/hrahkone.sql), passing the API module as a parameter. It compiles the API hook package; after running it, the generated hook package body should contain the call to your custom procedure.

Delete custom code from an API user hook

      Get the hook call ID and object version number:

SELECT api_hook_call_id, object_version_number
  FROM hr_api_hook_calls
 WHERE call_package LIKE 'LMHR%'
   AND call_procedure = UPPER('Validate_Objectives_Weight');

Hook Deletion
BEGIN
  Hr_Api_Hook_Call_Api.delete_api_hook_call
  (p_validate              => FALSE,
   p_api_hook_call_id      => <Hook Call ID>,
   p_object_version_number => <Object Version Number>);   -- values from the query above
  DBMS_OUTPUT.PUT_LINE('Deleted successfully');
END;

The hook call maintenance APIs are:
HR_API_HOOK_CALL_API.CREATE_API_HOOK_CALL
HR_API_HOOK_CALL_API.UPDATE_API_HOOK_CALL
HR_API_HOOK_CALL_API.DELETE_API_HOOK_CALL

Where is people data stored in Oracle HRMS?
It is stored in the table PER_ALL_PEOPLE_F.

But there is a table named PER_PEOPLE_F too?
PER_PEOPLE_F is a view on top of PER_ALL_PEOPLE_F that filters the records returned from PER_ALL_PEOPLE_F.

The filtering happens in the WHERE clause of the view, based on the security profile of the responsibility being used by the user.
