Published: 2026/1/5 12:47:43

How's Open Science Doing in SE? The Survey Results Are In ☆ (Super Short Version: Reproducibility Has a Lot of Issues!)

1. Gal-Style Sparkle Points ✨

  • Focusing on papers in the software engineering (SE) field is a little nerdy, but pretty interesting, right?
  • They actually ran the code from the papers to check reproducibility (whether the research results hold up), which is seriously diligent and admirable!
  • They lay out the challenges and fixes so IT companies can actually use them, which is honestly divine 🥺

2. Detailed Explanation

  • Background: The movement to open up research results is gaining momentum 🙌 But in the SE field, there was this open question: are results actually being reproduced? 🤔 If nobody checks whether the programs people share even run, what's the point?
  • Method: They collected 100 replication packages (the programs and data used in the studies) from the past decade of ICSE papers! They actually ran them to check whether they execute, whether the results can be reproduced, and what gets in the way!
  • Results: Well, quite a few didn't run or couldn't be reproduced 😢 Environment setup was difficult, documentation fell short... so there's plenty of room for improvement! 🥺
  • Significance (this is the killer ♡ point): For IT companies, raising the credibility of research is super important! Using these findings to improve reproducibility brings lots of wins, like better AI quality assurance and higher development efficiency 🤩
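One of the environment-setup problems the study points at is dependency specs that aren't pinned to exact versions, so a package that ran for the authors breaks years later. As a minimal sketch (a hypothetical helper, not tooling from the paper), here is a check that flags requirements entries lacking an exact `==` pin:

```python
# Minimal sketch (hypothetical, not from the paper): flag requirements
# entries that are not pinned to an exact version. Loose version specs
# are one environment-related reason an artifact that once ran stops
# being executable later.
import re

# Matches e.g. "numpy==1.26.4": a package name, "==", an exact version.
PINNED = re.compile(r"^[A-Za-z0-9_.\-]+==[\w.]+$")

def unpinned(requirements: str) -> list[str]:
    """Return requirement lines that lack an exact '==' version pin."""
    bad = []
    for line in requirements.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if not PINNED.match(line):
            bad.append(line)
    return bad

print(unpinned("numpy==1.26.4\npandas>=1.0\nscipy\n# a comment\n"))
# → ['pandas>=1.0', 'scipy']
```

Running a check like this before publishing an artifact is one cheap way to act on the paper's guidance about preparing environments more carefully.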

Continue reading in the 「らくらく論文」 app

The State of Open Science in Software Engineering Research: A Case Study of ICSE Artifacts

Al Muttakin / Saikat Mondal / Chanchal Roy

Replication packages are crucial for enabling transparency, validation, and reuse in software engineering (SE) research. While artifact sharing is now a standard practice and even expected at premier SE venues such as ICSE, the practical usability of these replication packages remains underexplored. In particular, there is a marked lack of studies that comprehensively examine the executability and reproducibility of replication packages in SE research. In this paper, we aim to fill this gap by evaluating 100 replication packages published as part of ICSE proceedings over the past decade (2015–2024). We assess the (1) executability of the replication packages, (2) efforts and modifications required to execute them, (3) challenges that prevent executability, and (4) reproducibility of the original findings. We spent approximately 650 person-hours in total executing the artifacts and reproducing the study findings. Our findings reveal that only 40% of the 100 evaluated artifacts were executable, of which 32.5% (13 out of 40) ran without any modification. Regarding effort levels, 17.5% (7 out of 40) required low effort, while 82.5% (33 out of 40) required moderate to high effort to execute successfully. We identified five common types of modifications and 13 challenges leading to execution failure, spanning environmental, documentation, and structural issues. Among the executable artifacts, only 35% (14 out of 40) reproduced the original results. These findings highlight a notable gap between artifact availability, executability, and reproducibility. Our study proposes three actionable guidelines to improve the preparation, documentation, and review of research artifacts, thereby strengthening the rigor and sustainability of open science practices in SE research.

cs / cs.SE