the gas system provided on miners’ perceived trust, safety,
and decision-making, ultimately providing insights that
may enhance the effectiveness of these systems in high-risk
environments.
LITERATURE REVIEW
Theory of Trust in Machines
The Theory of Trust in Machines examines how trust devel-
ops in automated systems, drawing on frameworks like the
Theory of Planned Behavior (TPB) [8] and the Technology
Acceptance Model (TAM) [9]. This theory emphasizes that
factors such as machine reliability, performance, and trans-
parency are essential for predicting user trust [10]. Both
cognitive trust, based on rational assessment, and affective
trust, rooted in emotional connection, play key roles in
human-machine interactions.
User trust in machines is shaped by system design, user
experience, and context. Research shows that incorporat-
ing human-like features enhances user trust [11], and that
transparency, explainability, and feedback mechanisms
further build trust by making machine decisions more
understandable [12]. Environmental and social factors like
organizational culture, social norms, and individual differ-
ences also significantly influence machine trust.
Trustworthy AI
AI systems operate with a level of autonomy that enables
them to make decisions and predictions independently, set-
ting them apart from traditional technologies where users
maintain direct control. This autonomy, particularly in
complex “black box” machine learning algorithms, can cre-
ate uncertainties for users, making trust in AI essential [13].
To support this, frameworks such as FATE (fairness, accountability, transparency, and explainability) have been developed to enhance trust by promoting fairness and
clarity in AI operations. Research shows that these princi-
ples significantly improve user trust and overall experience
with AI systems [14]. Additionally, giving AI human-like
qualities can, in some contexts, reinforce trust, though its
effectiveness varies by application [15].
Trust and Use of AI Technology
Trust in AI technology is complex and influenced by factors
like perceived usefulness, ease of use, and credibility, which
help establish initial trust [16, 17]. However, perceived
risks such as job displacement and bias can undermine it
[18, 19]. Transparency and explainability are essential, as
users often want insight into AI decision-making processes
[12, 20]. For effective human-AI collaboration, clear roles
and responsibilities are also critical [21].
Research highlights the value of user-centered design
that addresses risk concerns and promotes AI literacy to
build trust and encourage adoption. Designers should
focus on transparency, explainability, and accountability to
mitigate perceived risks [22]. Trust in AI also varies by con-
text, with factors like personality, experience, and cultural
background shaping AI acceptance [23]. Future research
should examine how trust evolves over time, the influence
of contextual factors, and the role of diversity and inclusion
to better understand trust dynamics in AI technology.
Multidimensional Approach to Trust
A multidimensional approach to trust recognizes its com-
plexity, encompassing several key dimensions. Research
highlights cognitive trust (based on perceived competence
and reliability), affective trust (emotional connection and
empathy), and conative trust (intentional trust and com-
mitment) as essential components [17, 24]. Cognitive trust
is grounded in rational evaluation, while affective trust cen-
ters on emotional bonds and shared values [25]. Conative
trust reflects a commitment and willingness to rely on
others.
Studies show that this multidimensional view enriches
our understanding of trust across various contexts. For
example, interpersonal trust is notably different from insti-
tutional trust [26], while organizational trust includes
dimensions like loyalty, commitment, and communication.
Trust in technology relies on factors such as system reliabil-
ity, data security, and user experience [27].
Effects of Social and Demographic Characteristics on Trust
Social and demographic characteristics play a significant
role in shaping trust dynamics. Studies show that age
affects trust, with older adults generally exhibiting lower
levels of trust, likely due to life experiences and cognitive
changes [28]. Gender differences also emerge, with women
often displaying lower trust levels than men, influenced by
societal and cultural factors [29]. Education fosters critical
thinking, making those with higher education levels more
skeptical and selective in their trust.
Income and socioeconomic status further influence
trust perceptions, with individuals from higher socioeco-
nomic backgrounds generally displaying higher trust levels,
likely due to greater access to resources and social networks
[30]. Ethnicity and cultural background also matter: collectivist cultures tend to emphasize interpersonal trust, while individualist cultures prioritize institutional trust [31].
Urban-rural differences exist as well, with rural residents
often exhibiting stronger community trust due to close-
knit ties [32].