
    Nature-based Hyperparameter Tuning of a Multilayer Perceptron Algorithm in Task Classification: A Case Study on Fear of Failure in Entrepreneurship

    Files
    Abstract (179.0Kb)
    Content (942.9Kb)
    Plagiarism (4.513Mb)
    Date
    2025
    Author
    Saputri, Theresia Ratih Dewi
    Kurniawan, Edwin
    Lestari, Caecillia Citra
    Antonio, Tony
    Abstract
    Entrepreneurship plays a key role in generating economic growth, encouraging innovation, and creating job opportunities. Understanding which demographic, psychological, and socio-economic factors contribute to fear of failure in entrepreneurship is essential to developing proper standards in entrepreneurship education and policy. However, it remains challenging to accurately classify these factors, especially when balancing model performance with model complexity in a multilayer perceptron algorithm. An effective model requires the correct parameter settings, obtained through a hyperparameter tuning process. Adjusting each hyperparameter by hand requires significant effort and expertise, as there are frequently many combinations to consider. Furthermore, manual tuning is prone to human error and may overlook optimal configurations, resulting in inferior model performance and prediction accuracy. This study evaluates nature-inspired optimization techniques, including particle swarm optimization (PSO), the genetic algorithm (GA), and grey wolf optimization (GWO). Several parameters of the multilayer perceptron model are tuned, including the number of hidden layers, the number of nodes in each hidden layer, the learning rate, and the activation functions. The dataset consists of 39 features from 333 samples capturing individual fear of failure; evaluation considered the loss score and computational efficiency, measured as the time required to find the best parameter combination. Model accuracy scores are 45.16%, 53.76%, and 58.61% for GA, PSO, and GWO, respectively, while their execution times are 10 minutes, 27 minutes, and 23 minutes. Experiment results further reveal that each optimization algorithm has distinct advantages: GA excels at speedy convergence, PSO provides robust exploration of the hyperparameter space, and GWO offers remarkable adaptability to complicated parameter interdependencies.
    This study provides empirical evidence for the efficacy of nature-inspired hyperparameter tuning in improving multilayer perceptron performance on fear-of-failure classification tasks.
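    The tuning loop the abstract describes can be illustrated with a minimal sketch. This is not the authors' code; it is a toy genetic algorithm, assuming scikit-learn's MLPClassifier and a synthetic dataset in place of the 333-sample survey data, that searches over the same kinds of hyperparameters the study names (hidden-layer sizes, learning rate, activation function), with the layer-size and learning-rate grids chosen arbitrarily for illustration.

```python
import random
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

random.seed(0)
# Stand-in for the study's dataset (39 features, 333 samples).
X, y = make_classification(n_samples=150, n_features=10, random_state=0)

def random_individual():
    # One candidate hyperparameter configuration (a "chromosome").
    return {
        "layers": tuple(random.choice([8, 16, 32])
                        for _ in range(random.randint(1, 2))),
        "lr": random.choice([1e-3, 1e-2, 1e-1]),
        "activation": random.choice(["relu", "tanh", "logistic"]),
    }

def fitness(ind):
    # Fitness = cross-validated accuracy of an MLP built from the genes.
    clf = MLPClassifier(hidden_layer_sizes=ind["layers"],
                        learning_rate_init=ind["lr"],
                        activation=ind["activation"],
                        max_iter=50, random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()

def crossover(a, b):
    # Uniform crossover: each gene copied from one parent at random.
    return {k: random.choice([a[k], b[k]]) for k in a}

def mutate(ind):
    # Occasionally perturb the learning-rate gene.
    if random.random() < 0.3:
        ind["lr"] = random.choice([1e-3, 1e-2, 1e-1])
    return ind

pop = [random_individual() for _ in range(4)]
for _ in range(2):  # tiny population and generation count, for brevity
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:2]  # elitist selection
    pop = parents + [mutate(crossover(*parents)) for _ in range(2)]

best = max(pop, key=fitness)
print("best accuracy: %.3f" % fitness(best))
```

PSO and GWO plug into the same skeleton: only the update rule that produces the next population changes (velocity updates toward personal/global bests for PSO, movement toward the three leading wolves for GWO), while the fitness evaluation stays identical.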
    URI
    https://dspace.uc.ac.id/handle/123456789/8169
    Collections
    • Lecture Papers International Published Articles

    Copyright©  2017 - LPPM & Library Of Universitas Ciputra
    »»» UC Town CitraLand, Surabaya - Indonesia 60219 «««
    Powered by: FreeBSD | DSpace | Atmire
     

     
