Stands for the standardized set of Create/Read/Update/Delete operations commonly found in administrative interfaces.
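The four CRUD operations can be sketched with a minimal in-memory store. This is an illustrative example only; the `Store` class and its method names are hypothetical and do not come from the platform itself.

```python
from itertools import count

class Store:
    """Minimal in-memory store illustrating the four CRUD operations."""

    def __init__(self):
        self._items = {}        # id -> record
        self._ids = count(1)    # auto-incrementing ids

    def create(self, data):
        """Create: insert a new record and return its id."""
        item_id = next(self._ids)
        self._items[item_id] = dict(data)
        return item_id

    def read(self, item_id):
        """Read: fetch a record by id, or None if it does not exist."""
        return self._items.get(item_id)

    def update(self, item_id, **changes):
        """Update: merge field changes into an existing record."""
        if item_id in self._items:
            self._items[item_id].update(changes)
            return True
        return False

    def delete(self, item_id):
        """Delete: remove a record; return True if it existed."""
        return self._items.pop(item_id, None) is not None
```

For example, `store.create({"name": "Alice"})` returns an id that `read`, `update`, and `delete` then accept.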
A Web crawler, sometimes called a spider or spiderbot and often shortened to crawler, is an Internet bot that systematically browses the World Wide Web and that is typically operated by search engines for the purpose of Web indexing (web spidering).
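The "systematically browses" part of that definition is usually a breadth-first walk over the link graph, visiting each URL at most once. Below is a minimal sketch over a hypothetical in-memory set of pages (the `PAGES` map and `crawl` function are illustrative assumptions, not part of any real crawler).

```python
from collections import deque

# Hypothetical in-memory "web": page URL -> outgoing links on that page.
PAGES = {
    "https://example.com/":  ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": ["https://example.com/"],
}

def crawl(start, fetch_links):
    """Breadth-first crawl: visit each reachable URL exactly once."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)  # a real crawler would fetch and index the page here
        for link in fetch_links(url):
            if link not in seen:   # never revisit a URL
                seen.add(link)
                queue.append(link)
    return order
```

A real search-engine crawler adds politeness delays, `robots.txt` handling, and persistent frontier storage on top of this core loop.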
The platform actively monitors all changes made to data by its users. To quickly access the detailed audit log of changes for a specific data entry, use the "Audit Logs" button.
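Change monitoring of this kind typically records who changed which field, from what value to what value, and when. A minimal sketch, assuming a simple per-field audit entry format (the `apply_change` helper and entry fields are hypothetical, not the platform's actual schema):

```python
from datetime import datetime, timezone

audit_log = []  # in a real system this would be a database table

def apply_change(record, user, **changes):
    """Apply field changes to a record, appending one audit entry per field."""
    for field, new_value in changes.items():
        audit_log.append({
            "user": user,
            "field": field,
            "old": record.get(field),
            "new": new_value,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        record[field] = new_value
```

An "Audit Logs" view for one entry then amounts to filtering this log by that record's identity.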