A day in the life of an Oracle Applications Consultant

Wednesday, June 16, 2004

Download Oracle AIM (Applications Implementation Methodology) Software and White Paper

Oracle AIM (Applications Implementation Methodology) is Oracle’s project management methodology. This post contains links to where you can download Oracle AIM software and an Oracle AIM White Paper.

Oracle AIM consists of a project management methodology together with the documentation templates that support the tasks you perform within it. This combination of a methodology and documentation templates makes AIM a powerful tool for helping implementation participants run and manage projects successfully. The methodology can be used for other software implementations, but the true value of AIM will only be realised when it is used in conjunction with the Oracle-specific document templates. The documentation templates are included in the AIM software, and the methodology is clearly outlined in the AIM software documentation.

The Oracle AIM Front End Displaying Project Phases and Processes

The Oracle AIM GUI which allows you to navigate to the documentation templates

Download Oracle AIM Software (23.8MB). This software provides you with the methodology as well as the documentation templates needed to run your Oracle Applications projects. Sadly this product doesn’t seem to have been updated for 11i. Two tips about installing and using the tool: firstly, it doesn’t seem to work on IE6, so try to install it on IE5.5 or lower. Secondly, make sure that your macro security is set to Low in both Word and Excel, as AIM needs to run certain startup macros that install additional menus. The software contains all the documentation you will need to learn how to use the product, and you can also download my white paper, which will give you a short overview of how AIM works.

Tuesday, June 15, 2004

Installing, Configuring and Using Oracle ADI (Applications Desktop Integrator)

Recently I wrote about configuring the ADI client to connect to the Oracle Database. I have subsequently come across an excellent site which has a page on how to Install, Configure and Use Oracle ADI (Applications Desktop Integrator). The page outlines how to: -

  • Install ADI
  • Start ADI
  • Define a Database Connection
  • Use ADI

    It can be found on the University of Virginia site here.

    Monday, June 14, 2004

    Understanding constraints (ceilings) in Oracle Public Sector Budgeting (PSB)

    The objective of this post is to explain in more detail certain concepts relating to constraints. In particular I will be looking at thresholds and severity levels and explaining how they work.

    According to the Oracle Public Sector Budgeting documentation, “Constraints are used to notify users regarding specific conditions for account ranges, elements, or position sets. For example, users can be notified if the total expense for a range of accounts exceeds a particular dollar amount.

  • Account constraints are used to prevent budget amount violations for line items.
  • Element constraints are used to prevent modification of element rates for a selected group of positions.
  • Position constraints are used to prevent element cost violations for selected positions or positions that are assigned to invalid element options”.
Constraints (known as ceilings in most of the government organisations I have implemented in) are used to place limits on budget estimates. In Oracle Public Sector Budgeting, estimates are prepared in budget worksheets, and once prepared they are checked against the constraints or ceilings put in place by the relevant authorities.

    The screen shot below shows the constraint setup screen in Oracle PSB. 

    Setting up Ceilings/Constraints in Oracle Public Sector Budgeting

    One of the concepts I struggled to understand was severity levels and thresholds and how these worked together. I felt that the Oracle documentation was a bit weak in this area and did not clearly define how these two settings worked together. In the next couple of paragraphs I will attempt to clarify how these “parameters” work.

    Essentially two types of ceilings exist, namely: -

    Hard Ceilings (can also be referred to as absolute)

This occurs if the threshold is less than or equal to the severity level. In the screen shot shown, lines two (“General Fund”) and three (“Finance FTE”) would be classified as hard ceilings.

If a ceiling is hard, when a budget worksheet preparer submits a worksheet for review, a constraint violation will be produced and the user will be required to amend the worksheet to resolve the violation and then re-submit it. Essentially, the worksheet will remain stuck with the preparer until the constraint violation is rectified.

    Soft Ceilings (can also be referred to as advisory)

This occurs if the threshold is greater than the severity level. In the screen shot shown, line one (“Budget Dept”) would be classified as a soft ceiling.

In this instance, when a budget worksheet preparer submits a worksheet for review, a constraint violation will be produced; however, the authoriser will still be able to work on the worksheet and post it to the General Ledger. Essentially, a warning message will be produced that there is a constraint violation, but this will not stop the worksheet going through all the remaining processes needed to post it to the General Ledger.

    Note: If the severity level is left blank then it is assumed to be less than the threshold level.
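The threshold/severity rules above can be sketched in a few lines of Python. This is a simplified illustration of the logic, not the actual PSB validation engine, and all function and parameter names are my own:

```python
def ceiling_type(threshold, severity_level):
    """Classify a constraint as a hard or soft ceiling.

    Hard (absolute): threshold <= severity level; a violation keeps the
    worksheet with the preparer until it is fixed.
    Soft (advisory): threshold > severity level; a warning is raised but
    the worksheet can still be posted to the General Ledger.
    A blank severity level is assumed to be less than the threshold (soft).
    """
    if severity_level is None:
        return "soft"
    return "hard" if threshold <= severity_level else "soft"


def on_submit(amount, ceiling, threshold, severity_level):
    """Return (violation_raised, submission_blocked) for a worksheet amount."""
    if amount <= ceiling:
        return (False, False)           # within the ceiling: no violation
    kind = ceiling_type(threshold, severity_level)
    return (True, kind == "hard")       # hard ceilings block, soft ones warn
```

So a worksheet that breaches a hard ceiling raises a violation and is blocked, while the same breach against a soft ceiling raises only a warning.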

    Sunday, June 13, 2004

    Legacy Management Part B

All projects start with some level of legacy management. Whether the project is software development or an ERP implementation, it involves legacy systems and their future role as the base for business processes. The business process that is more or less embedded in legacy systems is also important vis-à-vis the proposed automation. Before we begin our projects we must develop a sustainable approach to legacy management.

An article from the Gartner Group, “Building an IT Strategy When There is no Clear Business Strategy” (Gartner, John Mahoney), states the following rules of the game:

    a. Best Practices

    1) Connect all IT proposals to business goals and metrics or to the necessary base infrastructure, if necessary, by establishing new measures. Use these connections to prioritize and measure IT proposals in business terms. Keep the model simple and flexible.
2) Communicate frequently and clearly to ensure that all important issues are considered and to avoid misunderstandings.
    3) Limit strategy creation to a predetermined period, and review and update the strategy later if necessary as circumstances change. It is better to have an initial direction and a sketch map so that the journey can start rather than wait until the whole journey is mapped in detail.
    4) Ensure IT planners are represented in business-strategy planning teams from the start and throughout the process.
    5) Engage top leaders in the enterprise to resolve strategic ambiguities and to manage uncertainties and risks.

    b. Common Errors

    1) Attempting to create inappropriate detail and false certainty.
    2) Building or expressing IT strategy in terms of technology drivers rather than business objectives and benefits.
    3) Failing to connect IT strategy to a business value model.
    4) Creating the impression for enterprise business leaders that the IT organization is trying to take over business planning.
    5) Spending excessive and uncontrolled time on consultation and consensus building.”
6) Worst of all, subjecting professional and technical decision making to non-technical staff.

Bottom line: Back to Basics! Business planning and analysis are key messages that need to be embraced by all levels of management, and involvement is mandatory. Implementation of any chosen technology will only be successful if the people using it are committed and trained, and management has provided a clear message on its use. There is no magic, just plain old common sense and experience. An effective plan must:

1. Be a rapid process.
2. Provide brief and clear output.
3. Integrate its context into larger, more detailed strategic plans.

    A. First Step: Gain Understanding of Organizational Operations.

Our first step should be one of gaining understanding. We need to understand all of our current assets, wherever they may be: human, technical or otherwise.

    1. Understanding our human capital means we need to understand where the expertise and skills are and how they relate to our inventory of software and hardware.
    2. Who in our organization has the business expertise relating to: vital legacy applications, installed packages, new services, and Web applications?
    3. Who in our organization has the technical expertise for these assets?

Understanding our technical software assets means:

1. Making and verifying an inventory of all existing applications.
2. Recording the technology requirements and skill sets required for each.
3. Recording the business functions that each application supports.

Having this matrix of resources, applications, skill sets and human capital, we will be able to see where the most important business functions are supported and how. We need to catalogue what we have, where it is, what it does, what technology is in use, what database management systems it relies on, what data it contains, what data it owns, how it interacts with other assets, and what the timing of events is. This would generate an X-ray of our IT activities and assets, giving us a deeper look into the overlaps and missing pieces and thus helping us understand how to go forward.
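As a hypothetical illustration, the asset matrix described above could be sketched as a simple catalogue. Every application name, person and field here is illustrative, not taken from any real inventory tool:

```python
from dataclasses import dataclass, field

@dataclass
class Application:
    """One row of the asset matrix: an application and the functions,
    technology and people behind it."""
    name: str
    technology: str                                  # e.g. COBOL, Oracle Forms, J2EE
    business_functions: list = field(default_factory=list)
    business_experts: list = field(default_factory=list)
    technical_experts: list = field(default_factory=list)

def unsupported(inventory):
    """Flag applications with no technical expert: the missing pieces."""
    return [app.name for app in inventory if not app.technical_experts]

# A two-row sample inventory with made-up names.
inventory = [
    Application("GL-Legacy", "COBOL", ["General Ledger"], ["J. Smith"], []),
    Application("Payroll", "Oracle Forms", ["HR", "Payroll"], ["A. Jones"], ["B. Lee"]),
]
```

Queries over such a catalogue (applications without technical experts, functions supported by only one system, and so on) are exactly the overlaps and missing pieces the X-ray is meant to expose.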

    B. Second Step: Planning of Proposed Operations.

Planning change is different from standard project planning in the sense that it necessitates an inside-out approach. This will be a new kind of planning, tactical and strategic, for an ever-evolving portfolio: not a static or linear model, but a perpetual, adaptive and evolving one. Legacy management is not a one-time issue to be resolved, but the development of a strategy of interoperability; a service-oriented strategy to match the new model with the service-oriented component architectures that we should adopt. Our project management techniques will need to plan for collaboration across many different knowledge domains, business and technical. We will need to develop and adapt to a multi-path, multi-disciplinary integration methodology.

    Skills management and detailed training plans will be very important, as our assets will lie in many different technologies. Strong architectural skills will be of the essence. New tools may be required to support the detailed cataloging of the business models and the technical components that support them.

    C. Third Step: Produce Architecture for Organizational Operations.

Perhaps the most important step will be the production of the architecture that supports our new outlook, along with plans for how to map our current assets to it. Using a new integration methodology and business modelling tools, we need to develop our new business model based on this new thinking, regardless of what currently exists. This leads us to go for a thorough upgrade of applications, rather than adopting isolated instances of what is new: an enterprise-wide scope.

Every asset we currently own is a possible component of our architecture. Whether it is new, legacy or a purchased application component does not matter; what matters is that it has a usable component that satisfies a business need. Our understanding of these current assets will be vital as we go forward. Our previously developed inventory can be used, and we will need to dig into each application to find the real services that can be reused.

Once we have this new architecture and can see how our current assets map to it, we need to implement. We will use a variety of techniques and tools to isolate and wrap functionality from legacy applications, preferably using non-invasive techniques. We will most likely employ AIM and middleware tools which already have process workflow designers built into them. We will make use of wrappers and connectors to purchased solutions, and we will continue to develop new services with our choice of component- and service-oriented development tools. All this should be technically and professionally wrapped up in any good RFP.

Our ultimate target is an ERP with a loosely coupled, workflow-based, service-oriented portfolio of software assets that is flexible enough to manage rapid change over time and to support new business practices as rapidly as they occur; and that target is achievable.

    D. Legacy Component Services and Future.

Yet this does not mean that the thousands of dollars invested in legacy applications must be wasted. Duplications will have to go, but those legacy assets can still be utilised to deliver new services, meeting our time-to-market goals and possibly at a lower cost than totally new development.

    E. Back to the Basics.

Organisations today are looking to new technologies to improve their competitiveness. That is why IT must not only support, but also shape, every company’s ability to manage customer relationships and business partnerships, to introduce innovative products, services and distribution channels, and to move away from the 1980s’ “isolated islands of technology”. With proper business analysis, organisations can progress cautiously, minimising risk and providing users with functionality they not only want, but also need! Through the business analysis process, the IT infrastructure is enhanced with a clear set of “technology stakes” presented to the users, and constant review and updates to business processes manage change in the business.

At the core of these changes is the internet computing model, a gradual shift away from client/server. These technologies provide IT with the ability to quickly and accurately model business requirements for end users, and provide the basic building blocks for new functionality. These basic functions are available today in many tested technology environments, from J2EE tools and ‘Self Service Modules’ to their rival, .NET. So the question is not what tools can I use, but what do I want to accomplish with these new tools? “Back to the Basics!”

    Saturday, June 12, 2004

    Configuring the ADI client to connect to the database

    This post outlines how to configure the ADI client software to connect to the database. It’s aimed at non-technical people and assumes that you have already configured the TNSNAMES.ora file.

    Assumed TNSNAMES.ORA file settings

    Typical TNSNAMES.ORA lines
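For reference, a typical TNSNAMES.ORA entry looks like the sketch below. The alias DEV, the host name and the port are placeholders I have made up, not values from my environment:

```text
DEV =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = myserver.mydomain.com)(PORT = 1521))
    (CONNECT_DATA = (SID = DEV))
  )
```

The alias on the first line (DEV here) is what client tools such as ADI use as the connect string.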

    ADI database connection settings

    To get to the settings displayed below, from the ADI toolbar click on the signon button. In the Signon screen, on the bottom left hand side of the screen you should then click on the Define Basis button and configure your connection.

    An example of how to configure your ADI client database connection

The name and description you give to the database connection are user-defined. The GWYUID and FNDNAM settings are consistent for most databases that you connect to, and lastly, the connect string should be the same as the SID contained in the TNSNAMES.ORA file.
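As an illustration, the signon settings for a hypothetical instance called DEV might look like this. GWYUID of APPLSYSPUB/PUB and FNDNAM of APPS are the usual defaults in a standard Oracle Applications environment, but confirm the values with your DBA:

```text
Name:           DEV
Description:    Development instance
GWYUID:         APPLSYSPUB/PUB
FNDNAM:         APPS
Connect String: DEV
```

Only the name, description and connect string normally change from one database to the next.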
