I think people were having trouble understanding your question because of the table structure, which is so bad it seems designed to give you a headache. As Balazs Papp indicates, very little can be implemented here that will scale or that won't look like something hacked together.
However, there are solutions that can be done in PL/SQL. A pipelined table function will end up looking like a view to callers. It will not scale to large numbers of users, but it won't look like a hack.
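As a rough sketch of the pipelined-function approach (the type names, function name, and the `some_table` source are all placeholders, not from your schema):

```sql
-- Hypothetical row and collection types for the combined address data
CREATE OR REPLACE TYPE address_row AS OBJECT (
  name_of_user VARCHAR2(250),
  surname      VARCHAR2(100),
  town         VARCHAR2(100)
);
/
CREATE OR REPLACE TYPE address_tab AS TABLE OF address_row;
/
CREATE OR REPLACE FUNCTION all_addresses
  RETURN address_tab PIPELINED
IS
BEGIN
  -- Repeat one loop per source table; SOME_TABLE is a placeholder
  FOR r IN (SELECT name_of_user, surname, town FROM some_table) LOOP
    PIPE ROW (address_row(r.name_of_user, r.surname, r.town));
  END LOOP;
  RETURN;
END;
/
```

Callers then query it as if it were a view: `SELECT * FROM TABLE(all_addresses);`.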
Another solution trades speed of execution for freshness of data. A lot of data is composed of ten percent active records and ninety percent archive data that is unlikely to change. If your data only has to be current once a day, or once every few hours, you could implement the pseudo code below as a packaged procedure and schedule a job to refresh the table.
-- pseudo code, not intended to compile --
CREATE TABLE ADDRESS_BOOK (
  ID           NUMBER(9)     NOT NULL,
  NAME_OF_USER VARCHAR2(250) NOT NULL,
  SURNAME      VARCHAR2(100) NOT NULL,
  TOWN         VARCHAR2(100) NOT NULL
);
- create a sequence to fill the ID column
- create a primary key on ID
- create a trigger to automatically insert the ID from the sequence when the value inserted is null
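The three bullets above could look roughly like this (the sequence, constraint, and trigger names are my own; on Oracle 12c or later an identity column would do the same job):

```sql
-- Sequence to feed the ID column
CREATE SEQUENCE address_book_seq;

-- Primary key on ID
ALTER TABLE address_book
  ADD CONSTRAINT address_book_pk PRIMARY KEY (id);

-- Fill ID from the sequence whenever a NULL is inserted
CREATE OR REPLACE TRIGGER address_book_id_trg
  BEFORE INSERT ON address_book
  FOR EACH ROW
  WHEN (NEW.id IS NULL)
BEGIN
  :NEW.id := address_book_seq.NEXTVAL;
END;
/
```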
--create your package
CREATE OR REPLACE PACKAGE PKG_ADDRESS_BOOK AS
  PROCEDURE REFRESH;
END PKG_ADDRESS_BOOK;
/

CREATE OR REPLACE PACKAGE BODY PKG_ADDRESS_BOOK IS
  PROCEDURE REFRESH IS
    -- every table of yours that has a TOWN column
    CURSOR the_tables IS
      SELECT table_name
        FROM user_tab_cols
       WHERE column_name = 'TOWN';
  BEGIN
    -- clear out old data
    DELETE FROM ADDRESS_BOOK;
    FOR items IN the_tables LOOP
      EXECUTE IMMEDIATE
        'INSERT INTO ADDRESS_BOOK (ID, NAME_OF_USER, SURNAME, TOWN) '
        || 'SELECT null, ''' || items.table_name || ''', surname, town '
        || 'FROM ' || items.table_name;
    END LOOP;
  END REFRESH;
END PKG_ADDRESS_BOOK;
/
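To run the refresh on a schedule, a DBMS_SCHEDULER job along these lines would work (the job name and daily frequency are just examples):

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'REFRESH_ADDRESS_BOOK',        -- hypothetical name
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'PKG_ADDRESS_BOOK.REFRESH',
    repeat_interval => 'FREQ=DAILY',                   -- pick your freshness window
    enabled         => TRUE);
END;
/
```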
This is all predicated on the idea that the data is slowly changing. Optimizations would include adding a method to apply only the data that has actually changed.
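One way to sketch that incremental optimization is a MERGE per source table instead of the delete-and-reload. This assumes each source table carries a last-modified timestamp and that name plus surname identifies a row, both of which are assumptions, not facts from your schema:

```sql
-- Sketch only: SOME_SOURCE_TABLE and LAST_MODIFIED are placeholders
MERGE INTO address_book ab
USING (SELECT name_of_user, surname, town
         FROM some_source_table
        WHERE last_modified > SYSDATE - 1) src
   ON (ab.name_of_user = src.name_of_user
       AND ab.surname = src.surname)
 WHEN MATCHED THEN
   UPDATE SET ab.town = src.town
 WHEN NOT MATCHED THEN
   INSERT (name_of_user, surname, town)
   VALUES (src.name_of_user, src.surname, src.town);
```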