Population fails for property of type d:datetime on Oracle & PostgreSQL


Steps to reproduce:
1. Import the attached model (go to Admin Tools -> Model Manager -> Import Model)
2. Activate the model (Actions -> Activate)
3. Create a new file and add model1:aspect1 (Manage aspects -> add model1:aspect1)
4. Create the attached query set
5. Click the refresh button to start the population

Error on PostgreSQL:

Error on Oracle:




Bruno Bossola
April 1, 2021, 11:02 AM

Note: in order to successfully test this scenario with the provided files you will also need to:

  • create another node and assign it the “dublincore” aspect

  • update ALL of its properties, adding some dummy values

This is required because the query set requests the “dc:coverage” attribute: unless at least one node has a value for that property, the QName will not appear in the alf_qname table and the population will always fail.
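As a minimal sketch of why this precondition matters (this uses a simplified stand-in for the alf_qname table; the real schema has more columns and QName registration happens inside Alfresco, not via a direct insert):

```python
import sqlite3

# Simplified model of lazy QName registration. The real alf_qname
# schema differs; this only illustrates the failure mode above.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE alf_qname (id INTEGER PRIMARY KEY, local_name TEXT)")

def qname_id(local_name):
    """Return the QName id the population join relies on, or None."""
    row = db.execute(
        "SELECT id FROM alf_qname WHERE local_name = ?", (local_name,)
    ).fetchone()
    return row[0] if row else None

# Before any node stores a dc:coverage value, the QName is unknown,
# so the population cannot resolve it:
print(qname_id("coverage"))  # prints "None"

# Setting a dummy value on some node registers the QName as a side
# effect (simulated here by a direct insert):
db.execute("INSERT INTO alf_qname (local_name) VALUES ('coverage')")
print(qname_id("coverage"))  # prints "1"
```

Once the row exists, the same lookup succeeds and the population can proceed.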

Bruno Bossola
April 1, 2021, 8:54 AM

New PR:

Bruno Bossola
March 26, 2021, 9:28 AM

The decision was to move the data type to VARCHAR in the denorm. This will resolve the population problems and will also guarantee that the same logic is used when querying in accelerated and non-accelerated modes.

Bruno Bossola
March 23, 2021, 2:49 PM

OK, so we have a situation regarding Date/Timestamp/Datetime that I believe should be solved differently from the approach I have taken so far. At the moment Alfresco stores Date & co. as strings, using the "string_value" field of the node properties table. The denormalised table, however, creates fields of type sql.timestamp (this may vary for some databases), and this causes an issue during population. I believe this can also cause an issue during any query, as the search algorithm will behave fundamentally differently.

I believe we should simply change the table creation SQL, defining "string" (VARCHAR) fields for dates & co., as is done for node properties. This will sort out the population issue we see now (population on Oracle still fails, as "stringified" dates cannot be interpreted back into "timestamp" values) and potentially also any query issues we may have but do not yet know about.
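A minimal sketch of why VARCHAR columns keep query semantics intact (the sample values are made up, and this assumes the stored text is fixed-width ISO-8601 in a single time zone, which is how Alfresco serialises datetimes into string_value): lexicographic comparison on such strings agrees with chronological comparison, so sorting and range predicates behave the same on VARCHAR as on TIMESTAMP.

```python
from datetime import datetime

# Hypothetical ISO-8601 values, in the shape Alfresco keeps datetimes
# in the "string_value" column of the node properties table.
stored = [
    "2021-03-26T09:28:00.000Z",
    "2020-12-31T23:59:59.999Z",
    "2021-01-01T00:00:00.000Z",
]

# Lexicographic sort of the raw strings...
lexical = sorted(stored)

def parse(s):
    # fromisoformat (Python < 3.11) needs an explicit offset, not "Z"
    return datetime.fromisoformat(s.replace("Z", "+00:00"))

# ...matches chronological sort of the parsed values, because
# fixed-width ISO-8601 text in one zone (UTC) is order-preserving.
chronological = sorted(stored, key=parse)

print(lexical == chronological)  # prints "True"
```

The same property makes a `WHERE col >= '2021-01-01T00:00:00.000Z'` predicate on a VARCHAR column select exactly the rows a TIMESTAMP comparison would.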

However, not having any historical memory in this regard, I'd like to discuss this with the team.





Adina Ababei





Bug Priority

Category 3

Delivery Team

Team 5

Release Train



Affects versions